Use a templated prompt, and get ChatGPT to rewrite the prompt over time to include summaries of its aggregated knowledge. So the prompt includes the instruction to, at the conclusion of the session, rewrite itself using the latest knowledge.
For example, the templated prompt includes a growing list of key entities relevant to the topic at hand: for instance, key people you encountered in your day-to-day. As more entities are added, their summaries will have to become more condensed. It's possible that a model would even invent a representation to get around the limits of human language.
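A minimal sketch of what this could look like, assuming a generic chat API: the template carries the entity summaries plus the self-rewrite instruction, and the end of each session produces the next version of the prompt. The `send_to_model` helper, the template wording, and the seed entities are all illustrative assumptions, not a fixed design.

```python
# Sketch of a self-rewriting "smart context" prompt (assumptions, not a spec).

SMART_CONTEXT_TEMPLATE = """\
You are my assistant. Here is your accumulated context.

Known entities (condensed summaries):
{entities}

Instructions:
- Use the context above when answering.
- At the conclusion of the session, rewrite this entire prompt, updating the
  entity summaries with anything new you learned. Condense older entries as
  needed to stay within the length budget. Output only the new prompt.
"""


def send_to_model(prompt: str, user_message: str) -> str:
    """Hypothetical chat call; wire this up to whatever model API you use."""
    raise NotImplementedError


def run_session(smart_context: str, user_messages: list[str]) -> str:
    """Run one session, then ask the model to rewrite its own prompt."""
    for message in user_messages:
        reply = send_to_model(smart_context, message)
        print(reply)
    # End of session: the prompt instructs the model to emit its own successor.
    updated_context = send_to_model(
        smart_context,
        "The session is over. Rewrite your prompt now, as instructed.",
    )
    return updated_context


# Initial prompt with a seed list of entities (purely illustrative).
smart_context = SMART_CONTEXT_TEMPLATE.format(
    entities="- Alice: colleague, working on the quarterly report\n"
             "- Project Falcon: internal tooling effort, deadline in March"
)
```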
So it's an adaptive, dynamic prompt that aims to capture the best of ChatGPT's intelligence.
We call it: smart context.
Perhaps it can even be used on dumber models to boost their performance. The idea is like recursive improvement, and should converge in the limit to the optimum intelligence output of the system.
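Continuing the sketch above, the recursive loop is just feeding each session's output prompt into the next session; in principle the same evolved prompt could be handed to a smaller model. The message batches here are made up for illustration, and whether this actually converges or boosts a weaker model is the open question, not a claim.

```python
# Illustrative driver: each session's rewritten prompt seeds the next session.
daily_message_batches = [
    ["What's on my plate today?", "Summarize my meeting with Alice."],
    ["Any updates on Project Falcon?"],
]

for messages in daily_message_batches:
    smart_context = run_session(smart_context, messages)
    # smart_context now carries condensed summaries from every prior session,
    # and could just as well be passed to a different (smaller) model.
```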