
Custom Host

note

If your TaskingAI server is deployed locally with Docker, and the target model is also running in your local environment, CUSTOM_HOST_ENDPOINT_URL should start with http://host.docker.internal:port instead of http://localhost:port. Replace port with your actual port number.

Background

As the AI industry grows rapidly, more and more models become available on the market. Although TaskingAI is committed to continually integrating models from different providers, your desired model may not yet be officially supported.

The Custom Host feature allows you to integrate your self-hosted model or a model from a provider that is not officially supported by TaskingAI.

Models

At the moment, the industry has no agreed standard for the input and output schemas of embedding models. To connect your model to TaskingAI, you therefore need to make sure it accepts and returns data in the format that TaskingAI expects.

For custom models, TaskingAI currently supports OpenAI's standard embedding schema, as it is widely used across the industry:

OpenAI Text Embedding

  • Model schema id: custom_host/openai-text-embedding

This schema is the latest version from OpenAI as of this writing. A detailed explanation of the schema can be found in OpenAI's official documentation. This schema supports multiple text inputs in a single request.
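To make the expected format concrete, below is a minimal sketch of the request and response shapes this schema defines. Field names follow OpenAI's embeddings API; the model ID and vector values are illustrative placeholders.

```python
# Illustrative request/response shapes for the OpenAI text-embedding schema.
# Field names follow OpenAI's embeddings API; the numeric values are made up.

request = {
    "model": "your-model-id",                      # ID your custom host recognizes
    "input": ["Hello world", "Second sentence"],   # multiple inputs are allowed
}

response = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.01, -0.02, 0.03]},
        {"object": "embedding", "index": 1, "embedding": [0.04, 0.00, -0.01]},
    ],
    "model": "your-model-id",
    "usage": {"prompt_tokens": 6, "total_tokens": 6},
}

# One embedding is returned per input, in the same order.
assert len(response["data"]) == len(request["input"])
```

A conforming custom host should accept the request body above via POST and return a response with this structure.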


How to use

To integrate an embedding model using the Custom Host feature, you need to prepare the following information:

  • Custom model host URL: the endpoint of your self-hosted model or a model from a provider. The URL must be accessible from the internet and must accept POST requests.
    • Note: the URL should be a callable endpoint, not the IP address of the server.
  • Model ID: the model ID that the custom host recognizes. This should be provided by the model provider.
  • API key: if your target model requires an API key, provide it so that TaskingAI can authenticate with the model.
  • Embedding size: the dimension of the embedding vector that the model returns. This should be provided by the model provider.
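Before filling these values into TaskingAI, it can help to sanity-check them yourself. The sketch below, using hypothetical configuration values (the endpoint URL, model ID, and key are placeholders, not real TaskingAI names), builds an OpenAI-style request and verifies that a response's vectors match the embedding size you plan to configure; a mock response stands in for an actual network call.

```python
import json

# Hypothetical configuration values -- substitute your own custom host details.
CUSTOM_HOST_ENDPOINT_URL = "https://models.example.com/v1/embeddings"
MODEL_ID = "my-embedding-model"
API_KEY = "sk-..."
EMBEDDING_SIZE = 4  # use the size your provider documents (e.g. 768, 1536)

def build_request(model_id, api_key, texts):
    """Build the headers and JSON body for an OpenAI-style embedding call."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({"model": model_id, "input": texts})
    return headers, body

def embedding_sizes_ok(response_body, expected_size):
    """Check that every returned vector has the embedding size you configured."""
    return all(len(item["embedding"]) == expected_size
               for item in response_body["data"])

headers, body = build_request(MODEL_ID, API_KEY, ["hello", "world"])

# A mock response shaped like the OpenAI embeddings schema, for illustration:
mock_response = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2, 0.3, 0.4]},
        {"object": "embedding", "index": 1, "embedding": [0.5, 0.6, 0.7, 0.8]},
    ],
    "model": MODEL_ID,
}

print(embedding_sizes_ok(mock_response, EMBEDDING_SIZE))  # True
```

To test against a live endpoint, POST the same headers and body to your custom host URL and run the size check on the real response; a mismatch usually means the wrong model ID or a misreported embedding size.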
