Trying out llm-axe function calling with Ollama
I briefly introduced llm-axe before and noted that it supports function calling; below is a quick trial of function calling against Ollama.
Example usage
- me.py
from llm_axe.agents import FunctionCaller
from llm_axe.models import OllamaChat
import time

def get_time():
    return time.strftime("%I:%M %p")

def get_date():
    return time.strftime("%Y-%m-%d")

def get_location():
    return "Beijing, China"

def add(num1: int, num2: int):
    return num1 + num2

def multiply(num1: int, num2: int):
    return num1 * num2
def get_distance(lat1: float, lon1: float, lat2: float, lon2: float):
    """
    Calculates the distance between two points on the Earth's surface using the Haversine formula.
    :param lat1: latitude of point 1
    :param lon1: longitude of point 1
    :param lat2: latitude of point 2
    :param lon2: longitude of point 2
    """
    import math
    r = 6371  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlambda = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlambda / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
# The model will route this prompt to the get_date function
prompt = "今天是几号"  # "What is today's date?"
llm = OllamaChat(model="qwen2:7b", host="http://localhost:11434")
fc = FunctionCaller(llm, [get_time, get_date, get_location, get_distance, add, multiply])
result = fc.get_function(prompt)

# If no matching function was found, exit
if result is None:
    print("No function found")
    exit()

func = result['function']
params = result['parameters']
print(func(**params))
print(result['parameters'])
print(result['prompts'])
print(result['raw_response'])
- Result
How it works internally
For open-source models served by Ollama, function calling is currently implemented as prompt engineering plus JSON structured output: the model is asked to reply with JSON naming a function and its parameters, and that function is then executed locally. An alternative approach produces the structured output with instructor and then executes the function in the same way.
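The prompt-plus-JSON flow described above can be sketched as follows. This is a minimal illustration, not llm-axe's actual implementation; the model reply is hard-coded here, whereas in practice it would come from an Ollama chat request.

```python
import json

# A registry of callable tools, keyed by name (mirrors the example functions).
def add(num1: int, num2: int):
    return num1 + num2

def multiply(num1: int, num2: int):
    return num1 * num2

functions = {"add": add, "multiply": multiply}

# 1. Describe the available functions in the system prompt and instruct the
#    model to answer ONLY with JSON naming the function and its parameters.
system_prompt = (
    "You can call these functions: "
    + ", ".join(
        f"{name}{f.__code__.co_varnames[:f.__code__.co_argcount]}"
        for name, f in functions.items()
    )
    + '. Reply ONLY with JSON: {"function": ..., "parameters": {...}}'
)

# 2. In a real run this string is the LLM's reply; hard-coded for illustration.
raw_response = '{"function": "add", "parameters": {"num1": 2, "num2": 3}}'

# 3. Parse the structured output and dispatch to the real Python function.
call = json.loads(raw_response)
func = functions[call["function"]]
result = func(**call["parameters"])
print(result)  # → 5
```

The key point is that the model never executes anything itself: it only emits a structured description of the call, and the host program validates it and performs the actual execution.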
References
https://github.com/emirsahin1/llm-axe
https://python.useinstructor.com/