Palantir models migration FAQ

Why might a dataset-backed model not appear on the campaign page? Which models are considered in use, and which are ignored?

The upgrade page displays all dataset-backed models that:

  • were built on the default branch in the past 90 days, or,
  • were an input to a build on the default branch within the past 90 days (including batch deployment builds governed by a modeling objective), or,
  • are the latest release on a modeling objective with an active live deployment.

As part of this intervention, dataset-backed models that do not meet these criteria are marked as Ignored, and are filtered out by default from the campaign view.

Models for which a replacement model was selected have their status set to Completed, which also filters them out from the campaign view by default.

When will the intervention on consuming resources start?

As explained in the migration overview, this effort will be split into two migration campaigns in Upgrade Assistant:

  • The first migration will concern dataset-backed models themselves
  • A subsequent migration will surface resources that consume those dataset-backed models

The second intervention will start no later than April 2025, to allow 6 months for users to migrate any consuming resources.

What happens after the deprecation date?

Models developed with foundry_ml will no longer be supported in modeling objectives, Python transforms, or modeling objective deployments. More concretely:

  • It will no longer be possible to publish new foundry_ml models
  • Code using foundry_ml will no longer be editable, and checks will fail for any code that imports foundry_ml
  • Any jobs or live deployments using foundry_ml models may break and will no longer be supported by Palantir

Can a model be migrated without re-training?

It is possible to migrate a model without re-training it. To do so, load the model and write it to an output dataset from a Code Repository that has foundry_ml installed and uses Python 3.9:

```python
from transforms.api import Input, Output, transform

# Pickle may not be the right choice depending on the class of the models.
# Refer to the documentation of the modeling library you are using to select a serialization method.
import pickle

from foundry_ml import Model


@transform(
    output_dataset=Output("<path_to_output_dataset>"),
    source_model=Input("<path_to_foundry_ml_model>"),
)
def compute(output_dataset, source_model):
    MODEL_STAGE_ID = <index_of_stage_to_save>
    model = Model.load(source_model)
    # Select the relevant model stage.
    model_stage = model.stages[MODEL_STAGE_ID].model
    # Write it to the output dataset.
    with output_dataset.filesystem().open("model.pkl", "wb") as f:
        pickle.dump(model_stage, f)
```

From this output dataset, you can then publish a model from a separate repository using palantir_models and a more recent Python version. Refer to this tutorial to learn more.
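The serialization round trip between the two repositories can be sketched with plain pickle. Note this is an illustrative sketch only: the `DummyStage` class stands in for a real trained model stage (for example, a fitted scikit-learn estimator taken from `model.stages`), and the in-memory buffer stands in for the dataset filesystem shown in the transform above.

```python
import io
import pickle


# Illustrative stand-in for a trained model stage; in practice this would be
# whatever object the original foundry_ml model wrapped.
class DummyStage:
    def predict(self, values):
        return [v * 2 for v in values]


# Serialize the stage, as the migration transform does when writing model.pkl.
buffer = io.BytesIO()  # stand-in for output_dataset.filesystem().open(...)
pickle.dump(DummyStage(), buffer)

# In the palantir_models repository, the file is read back and re-wrapped.
buffer.seek(0)
restored = pickle.load(buffer)
print(restored.predict([1, 2, 3]))  # [2, 4, 6]
```

Because pickle resolves classes by their qualified name, the repository that loads the file must have the same modeling library (and a compatible version) installed as the repository that wrote it.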

Why is AIP not able to provide suggestions?

AIP may not be able to provide LLM-generated suggestions in the following cases:

  • The code for the model was not found on the default branch (called master on most environments), most likely because this branch is empty. This can be verified by navigating to the model on the default branch: if Foundry cannot find code for the model, there will be no View Code option.
  • The model is a TypeScript Function that was submitted to an objective. These models are identifiable by the Model execution plan on the model page showing foundryFunctionV2 as its type.
  • AIP might have failed to produce a suggestion due to incorrect LLM output. If the problem persists after reloading the page, contact Palantir support.