POST /v1/detokenize

To successfully run an inference request, you must provide a Friendli Token (e.g. flp_XXX) in the Bearer Token field. Refer to the authentication section on our introduction page to learn how to acquire this token, and visit here to generate one.

Authorizations

Authorization
string
header
required

When using the Friendli Endpoints API for inference requests, you need to provide a Friendli Token for authentication and authorization purposes.

For more detailed information, please refer here.

Headers

X-Friendli-Team
string

ID of team to run requests as (optional parameter).
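As a sketch, the two headers above might be assembled like this in Python. The token and team ID are placeholder values, and X-Friendli-Team can be omitted entirely:

```python
# Placeholders: substitute your own Friendli Token and team ID.
headers = {
    "Authorization": "Bearer flp_XXX",   # Friendli Token (required)
    "X-Friendli-Team": "my-team-id",     # team to run the request as (optional)
}
```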

Body

application/json
model
string
required

Code of the model to use. See available model list.

tokens
integer[]
required

A token sequence to detokenize.
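A minimal sketch of the JSON body with both required fields; the model code and token IDs below are illustrative placeholders:

```python
# Placeholders: use a model code from the available model list and your own token IDs.
payload = {
    "model": "meta-llama-3.1-8b-instruct",  # code of the model to use
    "tokens": [128000, 9906, 1917],         # token sequence to detokenize
}
```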

Response

200 - application/json
text
string

Detokenized text output.
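Putting these pieces together, here is a hedged end-to-end sketch using Python's requests library. The base URL, token, team ID, model code, and token IDs are assumptions for illustration; substitute the values from your own account and the current API docs.

```python
import requests

# All values below are placeholders for illustration.
BASE_URL = "https://api.friendli.ai/serverless"  # assumed base URL; check the docs for yours
FRIENDLI_TOKEN = "flp_XXX"                       # your Friendli Token

response = requests.post(
    f"{BASE_URL}/v1/detokenize",
    headers={
        "Authorization": f"Bearer {FRIENDLI_TOKEN}",
        "X-Friendli-Team": "my-team-id",  # optional
    },
    json={
        "model": "meta-llama-3.1-8b-instruct",
        "tokens": [128000, 9906, 1917],
    },
)
response.raise_for_status()
print(response.json()["text"])  # the detokenized text output
```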