Agents SDK v0.1.0 and workers-ai-provider v2.0.0 with AI SDK v5 support
We've shipped a new release of the Agents SDK ↗, bringing full compatibility with AI SDK v5 ↗ and introducing automatic message migration that handles all legacy formats transparently.
This release includes improved streaming and tool support, tool confirmation detection (for "human in the loop" systems), enhanced React hooks with automatic tool resolution, improved error handling for streaming responses, and seamless migration utilities that work behind the scenes.
This makes it ideal for building production AI chat interfaces with Cloudflare Workers AI models, agent workflows, human-in-the-loop systems, or any application requiring reliable message handling across SDK versions — all while maintaining backward compatibility.
Additionally, we've updated workers-ai-provider v2.0.0, the official provider for Cloudflare Workers AI models, to be compatible with AI SDK v5.
The `useAgentChat` hook creates a new chat interface with enhanced v5 capabilities:
```ts
// Basic chat setup
const { messages, sendMessage, addToolResult } = useAgentChat({
  agent,
  experimental_automaticToolResolution: true,
  tools,
});

// With custom tool confirmation
const chat = useAgentChat({
  agent,
  experimental_automaticToolResolution: true,
  toolsRequiringConfirmation: ["dangerousOperation"],
});
```
Tools are automatically categorized based on their configuration:
```ts
const tools = {
  // Auto-executes (has execute function)
  getLocalTime: {
    description: "Get current local time",
    inputSchema: z.object({}),
    execute: async () => new Date().toLocaleString(),
  },

  // Requires confirmation (no execute function)
  deleteFile: {
    description: "Delete a file from the system",
    inputSchema: z.object({
      filename: z.string(),
    }),
  },

  // Server-executed (no client confirmation)
  analyzeData: {
    description: "Analyze dataset on server",
    inputSchema: z.object({ data: z.array(z.number()) }),
    serverExecuted: true,
  },
} satisfies Record<string, AITool>;
```
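As a rough mental model, the categorization rule can be sketched as a plain function. This is a hypothetical helper for illustration, not part of the SDK; the SDK performs an equivalent classification internally when `experimental_automaticToolResolution` is enabled.

```typescript
// Hypothetical helper illustrating the categorization rule above.
type ToolKind = "auto-execute" | "needs-confirmation" | "server-executed";

interface ToolLike {
  description: string;
  execute?: (...args: unknown[]) => unknown;
  serverExecuted?: boolean;
}

function categorizeTool(tool: ToolLike): ToolKind {
  // Server-executed tools never prompt the client.
  if (tool.serverExecuted) return "server-executed";
  // Tools with an execute function run immediately on the client.
  if (tool.execute) return "auto-execute";
  // Everything else waits for explicit user approval.
  return "needs-confirmation";
}

console.log(categorizeTool({ description: "Get time", execute: () => new Date() }));
// auto-execute
console.log(categorizeTool({ description: "Delete a file" }));
// needs-confirmation
console.log(categorizeTool({ description: "Analyze data", serverExecuted: true }));
// server-executed
```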
Send messages using the new v5 format with a `parts` array:
```ts
// Text message
sendMessage({
  role: "user",
  parts: [{ type: "text", text: "Hello, assistant!" }],
});

// Multi-part message with file
sendMessage({
  role: "user",
  parts: [
    { type: "text", text: "Analyze this image:" },
    { type: "image", image: imageData },
  ],
});
```
Simplified logic for detecting pending tool confirmations:
```ts
const pendingToolCallConfirmation = messages.some((m) =>
  m.parts?.some(
    (part) => isToolUIPart(part) && part.state === "input-available",
  ),
);

// Handle tool confirmation for the pending part
if (pendingToolCallConfirmation) {
  const part = messages
    .flatMap((m) => m.parts ?? [])
    .find((p) => isToolUIPart(p) && p.state === "input-available");

  if (part) {
    await addToolResult({
      toolCallId: part.toolCallId,
      tool: getToolName(part),
      output: "User approved the action",
    });
  }
}
```
Seamlessly handle legacy message formats without code changes.
```ts
// All these formats are automatically converted:

// Legacy v4 string content
const legacyMessage = {
  role: "user",
  content: "Hello world",
};

// Legacy v4 with tool calls
const legacyWithTools = {
  role: "assistant",
  content: "",
  toolInvocations: [
    {
      toolCallId: "123",
      toolName: "weather",
      args: { city: "SF" },
      state: "result",
      result: "Sunny, 72°F",
    },
  ],
};

// Automatically becomes v5 format:
// {
//   role: "assistant",
//   parts: [{
//     type: "tool-call",
//     toolCallId: "123",
//     toolName: "weather",
//     args: { city: "SF" },
//     state: "result",
//     result: "Sunny, 72°F"
//   }]
// }
```
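As a simplified sketch (not the SDK's actual implementation, which handles more part types and edge cases), the conversion amounts to wrapping string `content` and `toolInvocations` into a `parts` array:

```typescript
// Simplified sketch of the legacy v4 -> v5 message migration.
interface LegacyMessage {
  role: string;
  content?: string;
  toolInvocations?: Array<Record<string, unknown>>;
}

interface V5Message {
  role: string;
  parts: Array<Record<string, unknown>>;
}

function toV5(message: LegacyMessage): V5Message {
  const parts: Array<Record<string, unknown>> = [];

  // Non-empty string content becomes a text part.
  if (message.content) {
    parts.push({ type: "text", text: message.content });
  }

  // Each legacy tool invocation becomes a tool-call part.
  for (const invocation of message.toolInvocations ?? []) {
    parts.push({ type: "tool-call", ...invocation });
  }

  return { role: message.role, parts };
}

const migrated = toV5({ role: "user", content: "Hello world" });
// { role: "user", parts: [{ type: "text", text: "Hello world" }] }
```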
Migrate tool definitions to use the new `inputSchema` property:
```ts
// Before (AI SDK v4)
const tools = {
  weather: {
    description: "Get weather information",
    parameters: z.object({
      city: z.string(),
    }),
    execute: async (args) => {
      return await getWeather(args.city);
    },
  },
};

// After (AI SDK v5)
const tools = {
  weather: {
    description: "Get weather information",
    inputSchema: z.object({
      city: z.string(),
    }),
    execute: async (args) => {
      return await getWeather(args.city);
    },
  },
};
```
Seamless integration with Cloudflare Workers AI models through the updated workers-ai-provider v2.0.0.
Use Cloudflare Workers AI models directly in your agent workflows:
```ts
import { createWorkersAI } from "workers-ai-provider";
import { useAgentChat } from "agents/ai-react";

// Create Workers AI model (v2.0.0 - same API, enhanced v5 internals)
const model = createWorkersAI({
  binding: env.AI,
})("@cf/meta/llama-3.2-3b-instruct");
```
Workers AI models now support v5 file handling with automatic conversion:
```ts
// Send images and files to Workers AI models
sendMessage({
  role: "user",
  parts: [
    { type: "text", text: "Analyze this image:" },
    {
      type: "file",
      data: imageBuffer,
      mediaType: "image/jpeg",
    },
  ],
});

// Workers AI provider automatically converts to the proper format
```
Enhanced streaming support with automatic warning detection:
```ts
// Streaming with Workers AI models
const result = await streamText({
  model: createWorkersAI({ binding: env.AI })("@cf/meta/llama-3.2-3b-instruct"),
  messages,
  onChunk: (chunk) => {
    // Enhanced streaming with warning handling
    console.log(chunk);
  },
});
```
Update your imports to use the new v5 types:
```ts
// Before (AI SDK v4)
import type { Message } from "ai";
import { useChat } from "ai/react";

// After (AI SDK v5)
import type { UIMessage } from "ai";
// or alias for compatibility
import type { UIMessage as Message } from "ai";
import { useChat } from "@ai-sdk/react";
```
- Migration Guide ↗ - Comprehensive migration documentation
- AI SDK v5 Documentation ↗ - Official AI SDK migration guide
- An Example PR showing the migration from AI SDK v4 to v5 ↗
- GitHub Issues ↗ - Report bugs or request features
We'd love your feedback! We're particularly interested in:
- Migration experience - How smooth was the upgrade process?
- Tool confirmation workflow - Does the new automatic detection work as expected?
- Message format handling - Any edge cases with legacy message conversion?