ConversationContainer
This is the "harness" which orchestrates all the interaction between LLMs and... anything else.
It's powered by regular old computer code, and it's responsible for the following (sketched in code after the list):
- Setting up the connection to the LLM, including the SystemPrompt (the instructions plus a list of available tools)
- Handling input (often, but not always, from a human)
- Executing the actual requests to the LLM, wherever it's hosted
- Handling the response, which usually comes as a stream of chunks of data (of various types)
- Responding to that data in various ways:
  - Streaming it back to the user-facing application
  - Executing a tool
  - Returning control to the user
  - Sending a new request to an LLM (possibly a different one), often via PromptChaining.
- Handling conversation state
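Here's a rough sketch of that loop in Python. The `client.stream_chat` method, the chunk shapes, the `TOOLS` registry, and `run_turn` are all hypothetical stand-ins, not any particular vendor's API; the point is the shape of the loop, not the specifics.

```python
import json

SYSTEM_PROMPT = {"role": "system", "content": "You are a helpful assistant."}

# Hypothetical tool registry: name -> plain Python callable.
TOOLS = {
    "get_time": lambda: "2024-01-01T00:00:00Z",
}

def run_turn(client, messages, user_input):
    """One pass through the harness loop: send the request, consume the
    streamed response, dispatch any tool call, then return control."""
    messages.append({"role": "user", "content": user_input})

    while True:  # loop, because a tool call usually triggers a follow-up request
        reply_text, tool_call = "", None

        # `stream_chat` is assumed to yield chunks shaped like
        # {"type": "text", "text": ...} or {"type": "tool_call", "name": ...}.
        for chunk in client.stream_chat(messages=[SYSTEM_PROMPT] + messages,
                                        tools=list(TOOLS)):
            if chunk["type"] == "text":
                reply_text += chunk["text"]   # stream this back to the UI as it arrives
            elif chunk["type"] == "tool_call":
                tool_call = chunk             # the model is asking for a tool run

        messages.append({"role": "assistant", "content": reply_text})

        if tool_call is None:
            return messages                   # nothing left to do: return control to the user

        result = TOOLS[tool_call["name"]]()   # execute the actual tool
        messages.append({"role": "tool",
                         "name": tool_call["name"],
                         "content": json.dumps(result)})
        # Fall through: send a new request to the LLM carrying the tool result.
```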
It's important to understand that all your interactions with the LLM are mediated by some application, or set of applications, that actually carries out the exchange.
The LLM conversation itself is just data: the request (which carries the whole conversation up to the current token) and the response (the tokens coming back from the LLM).
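To make the "just data" point concrete, here's roughly what that looks like; the field names and contents are purely illustrative, not any specific provider's wire format.

```python
# The request is nothing more than the accumulated conversation, serialized
# and re-sent in full on every turn.
request = {
    "system": "You are a helpful assistant.",
    "messages": [
        {"role": "user", "content": "What's the capital of France?"},
        {"role": "assistant", "content": "Paris."},
        {"role": "user", "content": "And its population?"},  # the newest input
    ],
}

# The response is just more data: tokens that, once assembled, get appended
# to the same list before the next request is built.
response = {"role": "assistant", "content": "Roughly 2.1 million in the city proper."}
request["messages"].append(response)
```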