Deploy Python functions

Beta

Python functions are currently in a beta state and may not be available on all enrollments.

Prerequisites

This guide assumes you have already authored and published a Python function. Review the getting started with Python functions documentation for a tutorial. For examples of how to query the Ontology using the Python SDK, see the Python Ontology SDK documentation.

Choose between deployed and serverless execution modes

If serverless Python is enabled for your enrollment, new repositories will use it by default. We generally recommend serverless functions for most use cases. While a deployed function may be useful in some circumstances, the serverless execution mode requires less maintenance and avoids incurring the costs associated with long-lived deployments.

Deployed functions have some capabilities that are not available to serverless functions:

  • Serverless functions do not currently support external sources. If your function uses an external source, you must deploy it.
  • Because deployed functions are long-lived, local caching may be possible if the function is tolerant of restarts.
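Because a deployed function's process persists between invocations, module-level state can act as a local cache. A minimal sketch of this pattern in plain Python; the function name and caching approach are illustrative, not part of the Functions API:

```python
from functools import lru_cache

# Module-level cache: in a long-lived deployment, cached results can
# persist across invocations until the process is restarted.
@lru_cache(maxsize=128)
def fetch_reference_data(key: str) -> str:
    # Placeholder for an expensive lookup (for example, a call to an
    # external source). After a restart the cache is empty, so the
    # function must still produce correct results from a cold start.
    return key.upper()

# The first call computes the value; repeated calls hit the cache.
fetch_reference_data("region-a")
fetch_reference_data("region-a")
assert fetch_reference_data.cache_info().hits == 1
```

In serverless mode, each execution may run in a fresh environment, so a cache like this would rarely be warm; this is why caching only pays off for deployed functions.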

Deployed functions have some limitations that do not apply to serverless execution:

  • Serverless functions enable different versions of a single function to be executed on demand, making upgrades safer. With deployed functions, you can only run a single function version at a time.
  • Serverless functions only incur costs when executed, while deployed functions incur costs as long as the deployment is running.
  • Serverless functions require less upfront setup and long-term maintenance, as the infrastructure is managed automatically.

To enable serverless Python functions for your enrollment, contact your Palantir administrator.

Architecture

Python functions can be run in a serverless mode, leveraging on-demand resources in the same way as TypeScript functions, or they can be deployed to a long-lived container.

We recommend using serverless Python functions if enabled on your enrollment, rather than deployed functions. While there are some cases where deployed functions are useful, the serverless executor is generally more flexible.

When your function is deployed, a long-running environment will be created to handle incoming execution requests. The environment will be scaled according to the request volume and occasionally restarted by automated processes. All functions from a single repository are hosted by a single deployment.

Compute costs

Deployed Python functions will incur compute costs for the running deployment. Serverless functions will only incur costs when executed.
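The trade-off can be made concrete with back-of-the-envelope arithmetic. The rates below are entirely hypothetical and used only to illustrate the cost model; consult your enrollment's resource management documentation for actual pricing:

```python
# Hypothetical rates -- NOT real pricing, for illustration only.
DEPLOYED_COST_PER_HOUR = 0.50          # always-on deployment
SERVERLESS_COST_PER_EXECUTION = 0.01   # per on-demand execution

hours_in_month = 24 * 30
executions_per_month = 1_000

deployed_monthly = DEPLOYED_COST_PER_HOUR * hours_in_month
serverless_monthly = SERVERLESS_COST_PER_EXECUTION * executions_per_month

# At low call volume, serverless is far cheaper; a long-lived
# deployment only pays off at sustained high throughput, or when its
# exclusive capabilities (such as external sources) are required.
print(deployed_monthly, serverless_monthly)
```

With these illustrative numbers, the deployment costs 360.0 per month regardless of usage, while serverless costs 10.0 for the same thousand executions.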

Deploy the Python function

Follow the steps below to prepare and configure a deployed Python function:

  1. Open your Python function repository and navigate to the Branches tab, then select Tags and releases.
  2. Hover over the function you want to deploy, then select Open in Ontology Manager.

Open the Python function in Ontology Manager.

  3. Select the version of the Python function you want to use from the version selector on the left.
  4. Choose Create and start deployment.

Create and start deployment of Python function

  5. If serverless functions are enabled in your environment, you will see an option to switch between serverless and deployed modes. If no mode is selected and no deployment exists, serverless will be used by default.

A Python function in serverless mode

  6. Select Deployed in the dropdown, then Start to launch the deployment.

Change mode of Python function

  7. Wait for the deployment that is hosting the function to start up.