Mistral AI ↗ helps you build quickly with Mistral's advanced AI models.
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral
When making requests to Mistral AI, you will need:
- AI Gateway account ID
- AI Gateway gateway name
- Mistral AI API token
- Mistral AI model name
Your new base URL will use the data above in this structure: https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral/.
You can then append the endpoint you want to hit, for example: v1/chat/completions.
So your final URL will come together as: https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral/v1/chat/completions.
curl -X POST https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral/v1/chat/completions \
  --header 'content-type: application/json' \
  --header 'Authorization: Bearer MISTRAL_TOKEN' \
  --data '{
    "model": "mistral-large-latest",
    "messages": [
      {
        "role": "user",
        "content": "What is Cloudflare?"
      }
    ]
  }'
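The same request can also be made from JavaScript without the SDK. This is a minimal sketch using the standard fetch API (Node 18+ or a Cloudflare Worker); MISTRAL_TOKEN and the {account_id}/{gateway_id} placeholders stand in for your own values:

const MISTRAL_TOKEN = "<your Mistral API token>";

const response = await fetch(
  "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral/v1/chat/completions",
  {
    method: "POST",
    headers: {
      "content-type": "application/json",
      Authorization: `Bearer ${MISTRAL_TOKEN}`,
    },
    body: JSON.stringify({
      model: "mistral-large-latest",
      messages: [{ role: "user", content: "What is Cloudflare?" }],
    }),
  },
);

console.log(await response.json());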
If you are using the @mistralai/mistralai package, you can set up your endpoint like this:
import { Mistral } from "@mistralai/mistralai";

const client = new Mistral({
  apiKey: MISTRAL_TOKEN,
  serverURL: `https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/mistral`,
});

await client.chat.complete({
  model: "mistral-large-latest",
  messages: [
    {
      role: "user",
      content: "What is Cloudflare?",
    },
  ],
});
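The call resolves with the completion object. As a minimal sketch (assuming the usual choices/message response shape, which is not shown above), you can capture it and read the generated text:

const chatResponse = await client.chat.complete({
  model: "mistral-large-latest",
  messages: [{ role: "user", content: "What is Cloudflare?" }],
});

// Assumption: the reply text lives at choices[0].message.content.
console.log(chatResponse.choices?.[0]?.message?.content);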
You can also use the OpenAI-compatible endpoint (/ai-gateway/chat-completion/
) to access Mistral models using the OpenAI API schema. To do so, send your requests to:
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions
Specify the model prefixed with the provider, for example:
{"model": "mistral/{model}"}