
Ollama Function Calling and Tool Use in Python

Author: Venkata Sudhakar

ShopMax India needs agents that can look up product inventory, check order status, and calculate delivery costs. Ollama supports function calling in models like llama3.2 and mistral, allowing the model to decide which function to call and with what arguments based on the user query.

Function calling works by passing a tools array to the chat API describing available functions with their name, description, and parameter schema. When the model decides to call a function, the response contains a tool_calls list. Your code executes the function and returns the result back to the model for a final text response.
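Concretely, each tool_calls entry in the model's reply carries the chosen function name and the arguments it extracted from the query. A minimal sketch of dispatching those calls is below; the helper name `dispatch_tool_calls` and the dict-style access are illustrative assumptions, not part of the Ollama API itself:

```python
# Sketch of handling tool calls in an Ollama chat response.
# `response` is what ollama.chat(...) returns when a tools array is
# supplied; `functions` maps tool names to plain Python callables.
def dispatch_tool_calls(response, functions):
    """Run each function the model requested and collect tool-role messages."""
    tool_messages = []
    for call in response["message"].get("tool_calls") or []:
        name = call["function"]["name"]
        args = call["function"]["arguments"]  # already parsed into a dict
        result = functions[name](**args)
        # Results go back to the model as "tool" role messages.
        tool_messages.append({"role": "tool", "content": str(result)})
    return tool_messages
```

Appending these tool messages to the conversation and calling the chat API again lets the model compose its final text response.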

The following example shows how ShopMax India can implement a product price lookup agent using Ollama function calling with llama3.2.


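A minimal sketch of such an agent is shown below. It assumes the `ollama` Python package is installed and a local Ollama server is running with llama3.2 pulled; the product catalog, prices, and the `get_product_price` helper are illustrative assumptions, not ShopMax India's actual systems:

```python
# Illustrative price lookup agent using Ollama function calling.
# The catalog, function name, and prices are hypothetical examples.

CATALOG = {
    ("TV001", "Mumbai"): 45000,
}

def get_product_price(product_id: str, city: str) -> str:
    """Return the price of a product in a city from the sample catalog."""
    price = CATALOG.get((product_id, city))
    if price is None:
        return f"No price found for {product_id} in {city}."
    return f"The price of {product_id} in {city} at ShopMax India is Rs {price:,}."

# Tool schema describing the function to the model: name, description,
# and a JSON-schema style parameter specification.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_product_price",
        "description": "Get the price of a ShopMax India product in a given city",
        "parameters": {
            "type": "object",
            "properties": {
                "product_id": {"type": "string", "description": "Product code, e.g. TV001"},
                "city": {"type": "string", "description": "City name, e.g. Mumbai"},
            },
            "required": ["product_id", "city"],
        },
    },
}]

if __name__ == "__main__":
    import ollama  # requires a running Ollama server

    messages = [{"role": "user", "content": "What is the price of TV001 in Mumbai?"}]
    response = ollama.chat(model="llama3.2", messages=messages, tools=TOOLS)

    # If the model decided to call our function, execute it and return
    # the result so the model can compose the final answer.
    for call in response["message"].get("tool_calls") or []:
        args = call["function"]["arguments"]
        result = get_product_price(args["product_id"], args["city"])
        messages.append(response["message"])
        messages.append({"role": "tool", "content": result})

    final = ollama.chat(model="llama3.2", messages=messages)
    print(final["message"]["content"])
```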
Running the program gives the following output:

The price of TV001 in Mumbai at ShopMax India is Rs 45,000.

Not all Ollama models support function calling; check the model's page in the Ollama library to confirm tool use support before deployment. Use llama3.2 or mistral-nemo for reliable results. For ShopMax India production, define comprehensive schemas with clear parameter descriptions to reduce errors when the model extracts arguments from natural language queries.
