Python functions are currently in a beta state and may not be available on all enrollments.
This guide assumes you have already authored and published a Python function. Review the getting started with Python functions documentation for a tutorial. For examples of how to query the Ontology using the Python SDK, see the Python Ontology SDK documentation.
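A published Python function is simply a decorated definition in your repository. The sketch below is a minimal illustration, assuming the functions.api decorator interface covered in the getting started guide; the function name and logic are hypothetical.

```python
from functions.api import function  # assumed import path from the getting started guide

# Illustrative function only; the name and logic are hypothetical.
@function
def add_numbers(a: int, b: int) -> int:
    """Return the sum of two integers."""
    return a + b
```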
Unlike TypeScript functions, which run in a "serverless" mode, Python functions must be deployed before they can be used in Workshop. All functions from a single repository are hosted by a single deployment. This means that you can run multiple versions of a function simultaneously by defining multiple versions in your repository.
Deployed Python functions can incur compute costs for the running deployment.
Follow the steps below to prepare and configure a Python function in Workshop:
In Workshop, search for the Python function from the Variables tab on the left side of the module. Deployed functions show an icon with one of three states for both the function and the function version.
Only one version of the function’s repository is hosted at a given time. To make changes to functions with limited downtime, we recommend adding a new function (like function_v1) with the changes and tagging it as described here; see the sketch after these steps. From your published functions under tags and releases, select Open in Ontology Manager.
In Ontology Manager, select the version of the function repository you want to use in applications, then select Upgrade.
Update all downstream applications using functions from this repository to the new version you have deployed. Note that the previous deployment version will no longer be running, so your applications will have a short downtime as you make this change. You will have function_v0 and function_v1 available at the same time, so while you need to switch to the new deployment version, you do not have to change the function you are using. When function_v0 is no longer used, you can delete the function.
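As a concrete sketch of the pattern described in the steps above, both versions can be defined in the same repository so they are hosted by the same deployment. This again assumes the functions.api decorator interface; the names function_v0 and function_v1 mirror the examples above, and the logic is hypothetical.

```python
from functions.api import function  # assumed import path

# Existing version: keep it published so downstream applications continue
# to work while they migrate.
@function
def function_v0(value: float) -> float:
    return value * 2

# New version with the changed logic (hypothetical change), published and
# tagged alongside function_v0 so both are served by the same deployment.
@function
def function_v1(value: float) -> float:
    return value * 2 + 1
```

Because both functions are available at the same time, downstream applications can switch from function_v0 to function_v1 on their own schedule; once function_v0 is no longer referenced, it can be deleted from the repository.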
If your function is not working as expected in Workshop, first check if the issue is related to the logic or the responsiveness of the function. If there is an issue with the logic, inspect the source code in the backing code repository. If there is an issue with the function being unresponsive or throwing an error, follow the steps below:
If the function shows as Upgrading, hover over the function’s information icon and select Configure. This will take you to Ontology Manager, where you can select Start Deployment to get your function running again.
If the function shows as Running, or you need more information about the deployment’s behavior, select Deployment from the left panel in Ontology Manager to view detailed logs. SLS logs are also available if you select View live.