MemGPT provides a free endpoint you can try, located at https://inference.memgpt.ai/chat/completions.
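For reference, the endpoint path suggests an OpenAI-style chat/completions API, so a direct call might look like the minimal sketch below. The payload fields and the "model" value are my assumptions, not documented behavior.

```python
import requests

# Minimal sketch of a direct call to the free endpoint.
# Assumes an OpenAI-compatible /chat/completions payload;
# the "model" value here is a placeholder, not an official identifier.
resp = requests.post(
    "https://inference.memgpt.ai/chat/completions",
    json={
        "model": "memgpt-openai",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=30,
)
print(resp.status_code, resp.json())
```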
The official docs claim that the free endpoint runs on variants of Mixtral 8x7b:
MemGPT Free Endpoint: select this if you'd like to try MemGPT on a top open LLM for free (currently variants of Mixtral 8x7b!)
However, after manually running Mixtral 8x7b on my own machine, I found that it could not compare to the free endpoint in terms of accuracy (both function calling and responses). This made me want to find out the real model behind the endpoint.
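A rough way to reproduce that comparison is sketched below: send the same function-calling style prompt to a locally hosted Mixtral 8x7b (assumed here to sit behind an OpenAI-compatible server such as llama.cpp or vLLM at a hypothetical localhost URL) and to the free endpoint, then compare the two answers side by side. The local URL, model names, prompt wording, and response parsing are all assumptions for illustration.

```python
import requests

# Hypothetical prompt in the style MemGPT uses for function calling:
# the model is expected to reply with a JSON function call.
PROMPT = (
    "You can call the function send_message(message: str). "
    'Reply ONLY with a JSON object of the form '
    '{"function": "send_message", "params": {"message": "..."}}. '
    "Greet the user."
)

def ask(url, model):
    """Send the same chat/completions request to an OpenAI-style endpoint."""
    resp = requests.post(
        url,
        json={
            "model": model,  # placeholder model identifier
            "messages": [{"role": "user", "content": PROMPT}],
            "temperature": 0,
        },
        timeout=60,
    )
    resp.raise_for_status()
    # Assumes an OpenAI-compatible response shape.
    return resp.json()["choices"][0]["message"]["content"]

# Local Mixtral 8x7b behind an assumed OpenAI-compatible server (hypothetical URL).
local_answer = ask("http://localhost:8000/v1/chat/completions", "mixtral-8x7b-instruct")

# MemGPT's free endpoint.
free_answer = ask("https://inference.memgpt.ai/chat/completions", "memgpt-openai")

print("LOCAL:", local_answer)
print("FREE :", free_answer)
```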
As it turns out, the free endpoint simply forwards your call to OpenAI's ChatGPT 3.5. However, I cannot be sure whether they log our requests or not.