
Embeddings LLM

POST 

/v1/embeddings

Generate embeddings for the given input text using the specified model.

The model field in the request body specifies which model to use; check the /models route for the complete list of embedding models. You can also specify the encoding format (float or np) and the number of dimensions (up to 1536) of the output embedding vector.
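A minimal request sketch follows. The base URL, API key, and model name are placeholders, not values from this page; the helper simply assembles the JSON body that POST /v1/embeddings expects.

```python
import json

# Hypothetical values -- substitute your deployment's base URL and key.
BASE_URL = "https://api.example.com/v1/embeddings"
API_KEY = "sk-..."

def build_embeddings_request(model, text, encoding_format=None, dimensions=None):
    """Assemble the JSON body for POST /v1/embeddings.

    model and input are required; encoding_format and dimensions are
    optional and, per the docs, only supported by OpenAI models.
    """
    body = {"model": model, "input": text}
    if encoding_format is not None:
        body["encoding_format"] = encoding_format  # "float" or "np"
    if dimensions is not None:
        body["dimensions"] = dimensions            # must be <= 1536
    return body

# "text-embedding-3-small" is an illustrative model name; list real
# options via the /models route.
payload = build_embeddings_request(
    "text-embedding-3-small", "hello world",
    encoding_format="float", dimensions=256,
)
print(json.dumps(payload))
```

Send the payload with any HTTP client (e.g. `requests.post(BASE_URL, json=payload, headers={"Authorization": f"Bearer {API_KEY}"})`).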

Request

Body

Embedding generation query

    model string required

    Model to use for generating embeddings.

    input string required

    Input text to generate embeddings for.

    encoding_format string

    Possible values: [float, np]

    Format of the output encoding. Only supported by OpenAI models.

    dimensions integer

    Possible values: <= 1536

    Number of dimensions for the embedding vector. The maximum varies by model (384–1536); check the /models route for each model's limit. Only supported by OpenAI models.
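Since the dimension cap varies per model, a client can validate the parameter before sending the request. The per-model caps below are illustrative stand-ins; real values should be fetched from the /models route.

```python
# Illustrative per-model dimension caps -- the real values come from
# the /models route, not from this table.
MODEL_MAX_DIMS = {
    "text-embedding-3-small": 1536,
    "all-MiniLM-L6-v2": 384,
}

def validate_dimensions(model, dimensions):
    """Reject a dimensions value the given model cannot produce."""
    max_dims = MODEL_MAX_DIMS.get(model)
    if max_dims is None:
        raise ValueError(f"unknown model: {model}")
    if not 1 <= dimensions <= max_dims:
        raise ValueError(f"{model} supports 1..{max_dims} dimensions")
    return dimensions
```

For example, `validate_dimensions("all-MiniLM-L6-v2", 1536)` raises, while `validate_dimensions("text-embedding-3-small", 1536)` passes.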

Responses

Embedding vector generated successfully

Schema
    object string
    data object[]
  • Array [
        object string
        index integer
        embedding number[]
  • ]
    provider string
    model string
    usage object
        prompt_tokens integer
        total_tokens integer
        prompt_characters integer
    cost string
    latency_ms integer
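A response shaped like the schema above can be consumed as follows. The sample values are made up for illustration; the helper orders vectors by each item's index field, which matters when a batch of inputs is embedded.

```python
# Fabricated sample response matching the documented schema; the
# numeric values are illustrative only.
sample_response = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, -0.2, 0.3]},
    ],
    "provider": "openai",
    "model": "text-embedding-3-small",
    "usage": {"prompt_tokens": 2, "total_tokens": 2, "prompt_characters": 11},
    "cost": "0.0000004",
    "latency_ms": 42,
}

def extract_vectors(response):
    """Return the embedding vectors ordered by their index field."""
    items = sorted(response["data"], key=lambda item: item["index"])
    return [item["embedding"] for item in items]

vectors = extract_vectors(sample_response)
```

Note that cost is returned as a string, so convert it with `float(response["cost"])` before doing arithmetic on it.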