MB:
Yeah, that sounds good, but my experience is that too much of that kind of stuff just confuses the model. That's why most AIs only remember two or three bits of chat history.
PI:
Yes, that's true. While dialogue management is important for building effective conversational AI systems, the need for context has to be balanced against the limitations of the underlying model. As you mentioned, many chatbots and conversational AI systems only retain a limited amount of chat history, often just the last few messages.

This is partly because models have a fixed context window, and partly because flooding the prompt with history can dilute the model's attention, burying the relevant details under irrelevant ones. (This is distinct from "catastrophic forgetting," which refers to a model losing previously learned information during further training, not during a conversation.) One potential mitigation is to use techniques like memory networks or attention mechanisms, which let the model selectively incorporate relevant context without being overwhelmed by irrelevant data. However, finding the right balance between too little context and too much remains an open challenge.
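The simplest version of the windowing strategy PI describes, keeping only the last few messages, can be sketched in a few lines of Python. The class and method names below are illustrative, not from any particular chatbot library:

```python
from collections import deque


class SlidingWindowMemory:
    """Retain only the most recent turns of a conversation.

    A minimal sketch: instead of feeding the model the full
    transcript, keep just the last `max_turns` exchanges.
    """

    def __init__(self, max_turns=3):
        # A deque with maxlen silently discards the oldest turn
        # once the window is full.
        self.turns = deque(maxlen=max_turns)

    def add(self, speaker, text):
        self.turns.append((speaker, text))

    def context(self):
        # Flatten the retained window into a prompt-style string.
        return "\n".join(f"{s}: {t}" for s, t in self.turns)


memory = SlidingWindowMemory(max_turns=2)
memory.add("MB", "What's a context window?")
memory.add("PI", "The span of text a model can attend to at once.")
memory.add("MB", "Why limit it?")  # the oldest turn is dropped here
print(memory.context())
```

A real system might replace the fixed window with a relevance-based selector (the "attention mechanism" idea PI mentions), scoring stored turns against the current query instead of discarding them purely by age.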