Integrations
LlamaIndex
You can use LlamaIndex to interact with FriendliAI. This makes migration of existing applications already using LlamaIndex particularly easy.
How to use
Before you start, make sure you have obtained your FRIENDLI_TOKEN from the Friendli Suite. Then install the required packages:
pip install llama-index llama-index-llms-friendli
Instantiation
Now we can instantiate our model object and generate chat completions.
The default model (i.e., mixtral-8x7b-instruct-v0-1) will be used if no model is specified.
import os

from llama_index.llms.friendli import Friendli

# Set your token before instantiating the client.
os.environ["FRIENDLI_TOKEN"] = "YOUR_FRIENDLI_TOKEN"

llm = Friendli(model="meta-llama-3.1-70b-instruct")
Chat completion
Generate a response from a given conversation.
Completion
Generate a response from a given prompt.