First of all, thanks to Cloudflare, genuinely a generous company, so please treasure it and don't abuse it on the tz front!
Workers AI has been in open beta for a while now. At first there were very few models; yesterday I saw people saying a lot of new ones had been added, and after trying it out it really is quite good.
Below is an example of calling one of the chat models from the Python command line. It's a Cloudflare product, so I expect the official pricing won't end up very high once it launches.
For how to get an API token, see the official docs: https://developers.cloudflare.com/workers-ai/
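As a quick sanity check, you can hit Cloudflare's token-verify endpoint first to confirm the token works before running the chat script below (a minimal sketch, assuming the standard /user/tokens/verify API and a placeholder YOUR_API_TOKEN):

import requests

# Sketch: verify the API token against Cloudflare's token-verify endpoint
resp = requests.get(
    "https://api.cloudflare.com/client/v4/user/tokens/verify",
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},  # same token as in the script below
)
print(resp.status_code, resp.json().get("success"))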
proxies is for a local proxy; if you don't use one, you can just delete it.
import requests, json

API_BASE_URL = "https://api.cloudflare.com/client/v4/accounts/YOUR_ACCOUNT_ID/ai/run/"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}

# Local proxy settings; drop the proxies= argument below if you don't need a proxy
proxies = {
    "http": "http://127.0.0.1:10809",
    "https": "http://127.0.0.1:10809",
}
def run(model, prompt):
    payload = {
        "stream": True,
        "messages": [
            {"role": "system", "content": "You are a friendly AI; please answer all my questions in Chinese"},
            {"role": "user", "content": prompt},
        ],
    }
    response = requests.post(f"{API_BASE_URL}{model}", headers=headers, json=payload, proxies=proxies, stream=True)
    if response.status_code == 200:
        print("Bot:", end="")
        chunk_size = 512
        for chunk in response.iter_content(chunk_size=chunk_size):
            if chunk:
                # Each streamed chunk is an SSE event like "data: {...}\n\n"; strip the prefix
                chunk_res = chunk.decode("utf-8").replace("data: ", "")
                if "[DONE]" in chunk_res:
                    break
                try:
                    # A chunk may carry several events separated by blank lines
                    for part in chunk_res.split("\n\n"):
                        if part:
                            res = json.loads(part)
                            print(res.get("response"), end="")
                except Exception as err:
                    # Fixed-size chunks can split a JSON event in half; dump what we got and stop
                    print(err)
                    print(chunk_res)
                    return
    print("\n")
def main():
    print("Welcome to the cloudflare assistant! Type 'exit' to end.")
    while True:
        user_input = input("You: ")
        if user_input.lower() == 'exit':
            print("Goodbye! assistant ended.")
            break
        run("@cf/meta/llama-2-7b-chat-int8", user_input)

if __name__ == "__main__":
    main()
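If you don't need streaming output, a non-streaming variant is even simpler (a sketch assuming the REST response wraps the generated text under result["response"], and reusing the same API_BASE_URL, headers and proxies from above):

def run_once(model, prompt):
    # Non-streaming sketch: the reply text is expected under result["response"]
    payload = {"messages": [{"role": "user", "content": prompt}]}
    r = requests.post(f"{API_BASE_URL}{model}", headers=headers, json=payload, proxies=proxies)
    r.raise_for_status()
    return r.json()["result"]["response"]

e.g. print(run_once("@cf/meta/llama-2-7b-chat-int8", "hello")) prints the whole reply in one go.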