
# Ollama Tool Calling in 5 Lines of Python
Ollama added tool calling support. Models like `qwen2.5`, `llama3.1`, and `mistral` can now call functions: inspect a schema, decide which function to invoke, pass structured arguments, and use the result in their response. It's genuinely powerful. And using it is genuinely painful.

## What tool calling actually looks like

Here's the minimum viable code to get Ollama tool calling working with `requests`. Not pseudocode — this is the actual flow you have to implement:

```python
import json
import requests

# Step 1: Define your tool schema manually
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The city name"}
            },
            "required": ["city"],
        },
    },
}]

# Step 2: Send the chat request with tool definitions
response = requests.post("http://localhost:11434/api/chat", json={
    "model": "qwen2.5:3b",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "stream": False,
})

# Step 3: Check whether the model decided to call a tool.
# In Ollama's API, "arguments" arrives as an already-parsed dict.
message = response.json()["message"]
for call in message.get("tool_calls", []):
    print(call["function"]["name"], json.dumps(call["function"]["arguments"]))
```
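The model only *requests* a call; you still have to dispatch it to real code and feed the result back as a `"tool"` message. Here's a minimal sketch of that dispatch step, using a hardcoded sample of an Ollama tool-call response in place of a live reply — `get_weather` and its stub return value are hypothetical stand-ins for your actual implementation:

```python
import json

# Hypothetical local implementation backing the get_weather tool (stub data)
def get_weather(city: str) -> str:
    return json.dumps({"city": city, "temp_c": 21})

# Map tool names from the schema to the Python functions that implement them
TOOL_REGISTRY = {"get_weather": get_weather}

# Sample shape of the assistant message Ollama returns when the model
# decides to call a tool; "arguments" is already a parsed dict.
message = {
    "role": "assistant",
    "content": "",
    "tool_calls": [
        {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}}
    ],
}

# Dispatch each requested call and collect "tool" messages to send back
results = []
for call in message.get("tool_calls", []):
    fn = TOOL_REGISTRY[call["function"]["name"]]
    output = fn(**call["function"]["arguments"])
    results.append({"role": "tool", "content": output})

print(results)
```

Appending these `"tool"` messages to the conversation and making a second `/api/chat` request is what lets the model weave the result into its final answer.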
Continue reading on Dev.to

