Vercel AI SDK
You can use the Vercel AI SDK to interact with FriendliAI. This makes migrating existing applications that already use the Vercel AI SDK particularly easy.
How to use
Before you start, ensure you have already obtained your FRIENDLI_TOKEN from the Friendli Suite.
Instantiation
Instantiate your models using a Friendli provider instance. We provide usage examples for each type of endpoint. Choose the one that best suits your needs:
Chat completion
Generate a response with the generateText function:
Tool assisted chat completion
This feature is in Beta and available only on the Serverless Endpoints.
Using the tool-assisted chat completion API, models can utilize built-in tools prepared for tool calls, enhancing their capability to provide more comprehensive and actionable responses.
Available tools are listed here.
import { friendli } from "@friendliai/ai-provider";
import { streamText } from "ai";

const result = await streamText({
  model: friendli("meta-llama-3.1-70b-instruct", {
    tools: [
      { type: "web:search" },
      { type: "math:calculator" },
    ],
  }),
  prompt: "Find the current USD to CAD exchange rate and calculate how much $5,000 USD would be in Canadian dollars.",
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
OpenAI Compatibility
🎯 You can also use @ai-sdk/openai, as the APIs are OpenAI-compatible.
import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://inference.friendli.ai/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});
Further resources