Import a language model into Foundry

Foundry allows you to import open-source language models into your workflows with no code. These models can be submitted to a modeling objective for use in batch and live deployments.

To use a language model that is not listed, you can also add support for a new one. For details on the model adapters referenced in this section, see the documentation on language model adapters.

Import a language model

A supported open-source model can be integrated without writing any code by following the steps below:

  1. Create a new modeling objective or open an existing one to which you will add the language model. In the modeling objective, select Add model.

Empty modeling objective

  2. Select the option Import an open-source model.

Add an open-source model language model in a modeling objective

  3. Select one of the available models, then select Next.

Choose open-source model language model in a modeling objective

  4. Choose where the created model resource will be saved in Foundry, and decide whether you want to create a sandbox deployment.

    Foundry provides a default model adapter for available open-source language models. However, you can also configure a custom model adapter.

Configure an open-source language model in a modeling objective

  5. If you created a sandbox deployment, you will be redirected to it to begin inference.

Perform live inference with an open-source language model in a modeling objective

The first query for a new live deployment can take longer than subsequent requests.
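If you prefer to query a deployment programmatically rather than through the interactive sandbox view, a request generally follows the shape sketched below. This is a minimal sketch only: the hostname, endpoint path, deployment identifier, token, and payload shape are placeholders, and the real values to use are shown on your deployment's page in Foundry.

```python
# Hedged sketch of querying a deployment over HTTP. All values below are
# placeholders; copy the actual endpoint, identifier, and payload format
# from your deployment's page in Foundry.
import requests

FOUNDRY_HOST = "https://your-stack.palantirfoundry.com"  # placeholder hostname
DEPLOYMENT_ENDPOINT = (
    f"{FOUNDRY_HOST}/api/v1/deployments/example-deployment/transform"  # placeholder path
)
TOKEN = "<foundry-api-token>"  # placeholder token

response = requests.post(
    DEPLOYMENT_ENDPOINT,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"input": [{"prompt": "Summarize the quarterly report in one sentence."}]},
    timeout=120,  # the first query to a new live deployment can be slow
)
response.raise_for_status()
print(response.json())
```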

The usage instructions for each language model will depend on the specific model adapter used.

Learn more about the model adapters that Foundry provides for the above open-source language models.
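To illustrate what a custom model adapter does, the sketch below wraps a Hugging Face text-generation model behind a tabular interface. The class and method names are simplified illustrations of the adapter pattern, not the exact palantir_models API; consult the language model adapter documentation for the real interfaces.

```python
# Minimal sketch of a custom language model adapter, assuming a Hugging Face
# text-generation model. Class and method names are illustrative only.
import pandas as pd
from transformers import pipeline


class TextGenerationAdapter:
    """Wraps an open-source language model behind a tabular predict() interface."""

    def __init__(self, model_name: str = "gpt2"):
        # Load the underlying open-source model once, at deployment start-up.
        self._generator = pipeline("text-generation", model=model_name)

    def api(self):
        # Describe the expected input and output columns so callers know
        # how to construct requests and interpret responses.
        return {
            "input": [("prompt", str)],
            "output": [("prompt", str), ("completion", str)],
        }

    def predict(self, df_in: pd.DataFrame) -> pd.DataFrame:
        # Run generation for each prompt and return a tabular result.
        completions = [
            self._generator(prompt, max_new_tokens=64)[0]["generated_text"]
            for prompt in df_in["prompt"]
        ]
        return pd.DataFrame({"prompt": df_in["prompt"], "completion": completions})
```

The usage instructions shown in a sandbox or live deployment follow directly from the adapter's declared inputs and outputs, which is why they differ from model to model.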