Multi-Turn Conversations with Ollama in Python
Author: Venkata Sudhakar
Ollama supports multi-turn conversations in which the model remembers previous messages in the same session. This is essential for building chatbots that can ask clarifying questions, recall context from earlier in the conversation, and give coherent follow-up responses. The messages list passed to ollama.chat() serves as the conversation history: you append each user and assistant message to maintain context. At ShopMax India, multi-turn chat powers the in-store kiosk assistant that helps customers compare products across multiple questions.

Managing conversation history means keeping the messages list alive across turns. Each exchange appends one user message and one assistant message, and the model receives the full history on every call so it can reference earlier statements. For long conversations, you may want to trim old messages to stay within the model's context window.

The example below shows how to build a multi-turn product advisor chatbot using Ollama.
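Here is a minimal sketch of the advisor chatbot. It assumes the official ollama Python client is installed and that a model named "llama3" has been pulled locally; the model name, the system prompt, and the helper names ask() and demo() are illustrative assumptions, not part of the Ollama API.

```python
"""Multi-turn product advisor chatbot for the ShopMax kiosk (sketch)."""

MODEL = "llama3"  # assumed model name; substitute any model you have pulled

def ask(history, user_text, chat_fn=None):
    """Append the user message, get the model's reply, and record it.

    chat_fn defaults to ollama.chat; it is injectable so the
    history-management logic can run without a live Ollama server.
    """
    if chat_fn is None:
        import ollama  # imported lazily: only needed for real model calls
        chat_fn = ollama.chat
    history.append({"role": "user", "content": user_text})
    # The full history is sent on every call; that is what gives the
    # model its memory of earlier turns.
    response = chat_fn(model=MODEL, messages=history)
    reply = response["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

def demo():
    # The system prompt and product domain are illustrative assumptions.
    history = [{
        "role": "system",
        "content": ("You are a product advisor for ShopMax India. "
                    "Recommend laptops and answer follow-up questions."),
    }]
    for question in (
        "I need a laptop for video editing.",
        "What is my budget range for that?",
        "Do you have anything with a dedicated GPU?",
    ):
        print(f"User: {question}")
        print(f"Assistant: {ask(history, question)}\n")
```

With an Ollama server running locally and the model pulled, calling demo() yields a three-turn conversation in which the single history list carries context from one question to the next.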
Running it produces output similar to the following (responses vary by model):
User: I need a laptop for video editing.
Assistant: For video editing, the ShopMax CreatorBook Pro with
an i9 processor, 32GB RAM, and NVIDIA RTX 4060 GPU is ideal.
Priced from Rs 1,10,000 at ShopMax stores in Mumbai and Bangalore.
User: What is my budget range for that?
Assistant: For professional video editing, plan a budget between
Rs 80,000 and Rs 1,50,000 depending on the resolution and
complexity of your projects.
User: Do you have anything with a dedicated GPU?
Assistant: Yes, the ShopMax CreatorBook Pro I mentioned includes
a dedicated NVIDIA RTX 4060 GPU with 8GB VRAM, which handles
4K editing smoothly in Premiere Pro and DaVinci Resolve.
Notice that in the third response the model correctly refers back to the product it mentioned in the first response - this is the multi-turn context at work. For production deployments at ShopMax India, store the conversation list in a session object tied to each customer interaction so that different customers have independent conversation histories. Clear the history when the session ends to free memory and protect privacy.
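The per-customer session store and the history trimming mentioned above can be sketched as follows. The SessionStore class, its method names, and the MAX_TURNS limit are assumptions for illustration, not part of the Ollama API.

```python
MAX_TURNS = 10  # assumed limit: keep only the last 10 exchanges

class SessionStore:
    """Keeps an independent conversation history per customer session."""

    def __init__(self):
        self._histories = {}

    def history(self, session_id, system_prompt):
        # Create the history on first use, seeded with the system prompt,
        # so each customer gets an independent conversation.
        if session_id not in self._histories:
            self._histories[session_id] = [
                {"role": "system", "content": system_prompt}
            ]
        return self._histories[session_id]

    def trim(self, session_id):
        # Keep the system message plus the most recent exchanges so the
        # prompt stays within the model's context window.
        h = self._histories.get(session_id)
        if h and len(h) > 1 + 2 * MAX_TURNS:
            self._histories[session_id] = [h[0]] + h[-2 * MAX_TURNS:]

    def end(self, session_id):
        # Clear the history when the session ends, freeing memory and
        # ensuring one customer's conversation is not retained.
        self._histories.pop(session_id, None)
```

A kiosk front end would fetch the list with history(session_id, prompt), pass it to ollama.chat(), call trim() after each turn, and call end() when the customer walks away.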