This guide walks you through setting up and deploying your first Workers AI project. You will use Workers, an AI Gateway binding, and a large language model (LLM) to deploy your first AI-powered application on Cloudflare's global network.
- Sign up for a Cloudflare account ↗.
- Install Node.js ↗.

Node.js version manager

Use a Node version manager like Volta ↗ or nvm ↗ to avoid permission issues and change Node.js versions. Wrangler, discussed later in this guide, requires a Node version of 16.17.0 or later.
You will use the create-cloudflare CLI (C3) to create a new Worker project. C3 is a command-line tool designed to help you set up and deploy new applications to Cloudflare.

Create a new project named hello-ai by running:
```sh
npm create cloudflare@latest -- hello-ai
# or
yarn create cloudflare hello-ai
# or
pnpm create cloudflare@latest hello-ai
```
Running npm create cloudflare@latest will prompt you to install the create-cloudflare package and lead you through setup. C3 will also install Wrangler, the Cloudflare Developer Platform CLI.
For setup, select the following options:

- For What would you like to start with?, choose Hello World example.
- For Which template would you like to use?, choose Worker only.
- For Which language do you want to use?, choose TypeScript.
- For Do you want to use git for version control?, choose Yes.
- For Do you want to deploy your application?, choose No (we will be making some changes before deploying).
This will create a new hello-ai directory. Your new hello-ai directory will include:

- A "Hello World" Worker at src/index.ts.
- A Wrangler configuration file.
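For reference, the generated Wrangler configuration will look roughly like the following. The values here are illustrative: C3 fills in your project name and a compatibility_date matching the creation date, and depending on your C3 version the file may be wrangler.jsonc or wrangler.toml.

```jsonc
{
  "name": "hello-ai",
  "main": "src/index.ts",
  "compatibility_date": "2024-01-01"
}
```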
Go to your application directory:

```sh
cd hello-ai
```
You must create an AI binding for your Worker to connect to Workers AI. Bindings allow your Worker to interact with resources, like Workers AI, on the Cloudflare Developer Platform.

To bind Workers AI to your Worker, add the following to the end of your Wrangler configuration file:
```jsonc
{
  "ai": {
    "binding": "AI"
  }
}
```

```toml
[ai]
binding = "AI"
```
Your binding is available in your Worker code via env.AI.

Next, you will need your gateway id. You can learn how to create an AI Gateway in this tutorial.
You are now ready to run an inference task in your Worker. In this case, you will use an LLM, llama-3.1-8b-instruct-fast, to answer a question. Your gateway ID can be found on the dashboard.

Update the index.ts file in your hello-ai application directory with the following code:
```typescript
export interface Env {
  // If you set another name in the [Wrangler configuration file](/workers/wrangler/configuration/)
  // as the value for 'binding', replace "AI" with the variable name you defined.
  AI: Ai;
}

export default {
  async fetch(request, env): Promise<Response> {
    // Specify the gateway label and other options here
    const response = await env.AI.run(
      "@cf/meta/llama-3.1-8b-instruct-fast",
      {
        prompt: "What is the origin of the phrase Hello, World",
      },
      {
        gateway: {
          id: "GATEWAYID", // Use your gateway label here
          skipCache: true, // Optional: skip the cache if needed
        },
      },
    );

    // Return the AI response as a JSON object
    return new Response(JSON.stringify(response), {
      headers: { "Content-Type": "application/json" },
    });
  },
} satisfies ExportedHandler<Env>;
```
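As an aside, the third argument to env.AI.run is a plain options object, so if you call several models through the same gateway you can build it with a small helper. The following is only a sketch: GatewayOptions and makeGatewayOptions are names chosen here for illustration, not part of the Workers AI API.

```typescript
// Hypothetical helper for building the gateway options object passed
// as the third argument to env.AI.run; field names mirror the code above.
interface GatewayOptions {
  gateway: {
    id: string;
    skipCache?: boolean;
  };
}

function makeGatewayOptions(id: string, skipCache = false): GatewayOptions {
  return { gateway: { id, skipCache } };
}

console.log(JSON.stringify(makeGatewayOptions("GATEWAYID", true)));
// prints {"gateway":{"id":"GATEWAYID","skipCache":true}}
```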
At this point, you have created an AI binding for your Worker and configured your Worker to run the Llama 3.1 model. You can now test your project locally before deploying it globally.
In your project directory, test Workers AI locally by running wrangler dev:

```sh
npx wrangler dev
```
You will be prompted to log in after you run wrangler dev. When you run npx wrangler dev, Wrangler will give you a URL (most likely localhost:8787) to review your Worker. After you go to the URL Wrangler provides, you will see a message similar to the following example:
```json
{
  "response": "A fascinating question!\n\nThe phrase \"Hello, World!\" originates from a simple computer program written in the early days of programming. It is often attributed to Brian Kernighan, a Canadian computer scientist and a pioneer in the field of computer programming.\n\nIn the early 1970s, Kernighan, along with his colleague Dennis Ritchie, were working on the C programming language. They wanted to create a simple program that would output a message to the screen to demonstrate the basic structure of a program. They chose the phrase \"Hello, World!\" because it was a simple and recognizable message that would illustrate how a program could print text to the screen.\n\nThe exact code was written in the 5th edition of Kernighan and Ritchie's book \"The C Programming Language,\" published in 1988. The code, literally known as \"Hello, World!\" is as follows:\n\n```main(){ printf(\"Hello, World!\");}```\n\nThis code is still often used as a starting point for learning programming languages, as it demonstrates how to output a simple message to the console.\n\nThe phrase \"Hello, World!\" has since become a catch-all phrase to indicate the start of a new program or a small test program, and is widely used in computer science and programming education.\n\nSincerely, I'm glad I could help clarify the origin of this iconic phrase for you!"
}
```
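If you call the Worker from another script rather than a browser, the JSON body can be parsed in a couple of lines. This sketch assumes the single-field response shape shown above; AiTextResponse and extractAnswer are illustrative names, not part of the Workers AI API.

```typescript
// Minimal client-side sketch for parsing the Worker's JSON response.
// The single "response" field is an assumption based on the example output above.
interface AiTextResponse {
  response: string;
}

function extractAnswer(body: string): string {
  const parsed = JSON.parse(body) as AiTextResponse;
  return parsed.response;
}

// Example usage with a truncated sample payload:
const sample = JSON.stringify({ response: "A fascinating question!" });
console.log(extractAnswer(sample)); // prints: A fascinating question!
```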
Before deploying your AI Worker globally, log in with your Cloudflare account by running:

```sh
npx wrangler login
```

You will be directed to a web page asking you to log in to the Cloudflare dashboard. After you have logged in, you will be asked if Wrangler can make changes to your Cloudflare account. Scroll down and select Allow to continue.
Finally, deploy your Worker to make your project accessible on the Internet. To deploy your Worker, run:

```sh
npx wrangler deploy
```
Once deployed, your Worker will be available at a URL like:

```
https://hello-ai.<YOUR_SUBDOMAIN>.workers.dev
```

Your Worker will be deployed to your custom workers.dev subdomain. You can now visit the URL to run your AI Worker.
By finishing this tutorial, you have created a Worker, connected it to Workers AI through an AI Gateway binding, and successfully run an inference task using the Llama 3.1 model.