The model inference history is a dataset in Foundry that captures all inference requests (inputs) and inference results (outputs) sent to a Modeling Objective live deployment. This enables workflows such as auditing who submitted requests, debugging inference errors, and monitoring model behavior over time.
To create a model inference history, navigate to the Deployments page of your Modeling Objective and select the live deployment to which you would like to add the model inference history. Under the Model Inference History card, click on Create dataset. This will open the Create new model inference history dialog.
In the Create new model inference history dialog, enter a dataset name and location for the model inference history. We strongly recommend adding Security Markings since inputs and outputs may contain sensitive information and should only be accessible by individuals with the appropriate security permissions.
Once the model inference history is created, the following information will be recorded in the dataset:
timestamp
: Timestamp of the input request

user_id
: Foundry user ID of the person who sent the request

request_uuid
: Unique identifier of the request

live_rid
: Modeling live deployment resource identifier

objective_rid
: Modeling objective resource identifier

model_rid
: Model asset resource identifier

model_version_rid
: Model asset version resource identifier

input
: JSON representation of the input sent to the model

output
: JSON representation of the model output

error
: Detailed error message or stacktrace encountered by the model while running inference on the provided input
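The recorded fields above can be analyzed with PySpark from a downstream transform or Code Workbook. The sketch below is a minimal, hypothetical example: the JSON shapes of the input and output columns depend on your model, so the schemas used here are assumptions to be replaced with your model's actual structure.

```python
from pyspark.sql import functions as F, types as T

# Hypothetical shapes for the serialized model I/O; replace these with the
# structure your model actually receives and returns.
input_schema = T.MapType(T.StringType(), T.DoubleType())
output_schema = T.StructType([T.StructField("prediction", T.DoubleType())])

def parse_inference_history(history_df):
    """Expand the JSON `input` and `output` columns into typed columns."""
    return (
        history_df
        .withColumn("parsed_input", F.from_json("input", input_schema))
        .withColumn("parsed_output", F.from_json("output", output_schema))
        .select(
            "timestamp",
            "user_id",
            "request_uuid",
            "model_version_rid",
            F.col("parsed_input").alias("features"),
            F.col("parsed_output.prediction").alias("prediction"),
            "error",
        )
    )
```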
To temporarily enable or disable a model inference history, click the toggle labeled Enable Recording in the top right corner of the Model Inference History card.

To permanently disable a model inference history, select Remove to the right of the dataset in the Model Inference History card.
Removing a model inference history is permanent and cannot be undone. Once a dataset is removed as a model inference history, the dataset will still exist, but cannot be added back to a deployment.
To enable a model inference history, a user requires the gatekeeper permission foundry-ml-live:edit-inference-ledger. Typically, this is granted with the "owner" role on a modeling objective.
As inputs and outputs may contain sensitive information, you are required to save the dataset to the same project as its parent modeling objective.
To ensure the model inference history matches the designated schema, you cannot choose an existing dataset for the model inference history.
If you remove your model inference history, the dataset will not be deleted; however, Foundry will stop writing new records to it.
You can use the recording toggle to temporarily enable or disable your model inference history.
Not all inputs and outputs are guaranteed to appear in the inference history. To maintain platform performance, writes to the model inference history that fail are not retried.
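Because recording is best-effort, any monitoring built on the inference history should treat counts as indicative rather than exact. Below is a minimal PySpark sketch, assuming only the fields described above, that computes an approximate daily error rate per model version.

```python
from pyspark.sql import functions as F

def daily_error_rate(history_df):
    """Approximate daily error rate per model version; counts are best-effort."""
    return (
        history_df
        .groupBy(
            F.date_trunc("day", F.col("timestamp")).alias("day"),
            F.col("model_version_rid"),
        )
        .agg(
            F.count("*").alias("requests_recorded"),
            F.count(F.when(F.col("error").isNotNull(), True)).alias("errors_recorded"),
        )
        .withColumn("error_rate", F.col("errors_recorded") / F.col("requests_recorded"))
    )
```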