The upgrade page displays all dataset-backed models that:
As part of this intervention, dataset-backed models that do not meet these criteria are marked as Ignored, and are filtered out of the campaign view by default.
Models for which a replacement model was selected have their status set to Completed, which also filters them out of the campaign view by default.
As explained in the migration overview, this effort will be split into two migration campaigns in Upgrade Assistant:
The second intervention will start no later than April 2025, allowing six months for users to migrate any consuming resources.
Models developed with foundry_ml will no longer be supported in modeling objectives, Python transforms, or modeling objective deployments. More concretely:

- foundry_ml models will no longer be editable, and checks will fail for any code importing foundry_ml
- foundry_ml models may break and will no longer be supported by Palantir

It is possible to migrate a model without re-training it. To do so, load the model and write it to an output dataset from a Code Repository with foundry_ml installed and using Python 3.9:
```python
from transforms.api import Input, Output, transform

# Pickle may not be the right choice depending on the class of the models.
# Refer to the documentation of the modeling library you are using to select
# a serialization method.
import pickle

from foundry_ml import Model


@transform(
    output_dataset=Output("<path_to_output_dataset>"),
    source_model=Input("<path_to_foundry_ml_model>"),
)
def compute(output_dataset, source_model):
    MODEL_STAGE_ID = <index_of_stage_to_save>
    model = Model.load(source_model)

    # Select the relevant model stage.
    model_stage = model.stages[MODEL_STAGE_ID].model

    # Write it to the output dataset.
    with output_dataset.filesystem().open("model.pkl", "wb") as f:
        pickle.dump(model_stage, f)
```
From this output dataset, you can then publish a model from a separate repository using palantir_models and a more recent Python version. Refer to this tutorial to learn more.
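The serialization step above can be sanity-checked outside Foundry. The sketch below (illustrative only; DummyModel is a hypothetical stand-in for a real model stage) pickles an object and loads it back, mirroring the dump in the transform and the load you would perform in the palantir_models repository:

```python
import io
import pickle


# Hypothetical stand-in for a trained model stage; any picklable object works.
class DummyModel:
    def __init__(self, coef):
        self.coef = coef

    def predict(self, xs):
        return [self.coef * x for x in xs]


model_stage = DummyModel(coef=2.0)

# Serialize exactly as the transform does, but to an in-memory buffer
# instead of the output dataset's filesystem.
buf = io.BytesIO()
pickle.dump(model_stage, buf)

# Later (e.g. in the separate palantir_models repository), load it back.
restored = pickle.loads(buf.getvalue())
print(restored.predict([1, 2, 3]))  # → [2.0, 4.0, 6.0]
```

If your model class does not round-trip cleanly through pickle (for example, models holding native resources), use the serialization method recommended by its library instead.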
AIP may not be able to provide LLM-generated suggestions in the following cases:

- Foundry cannot find code for the model on its default branch (master on most environments), most likely because this branch is empty. This can be verified by navigating to the model on the default branch: if Foundry cannot find code for the model, there will be no View Code option.
- The function does not have foundryFunctionV2 as its type.