Table of Contents¶
- deeploy.client
- Client
- __init__
- validate_personal_key
- create_environment_variable
- get_all_environment_variables
- get_environment_variable_ids_for_deployment_artifact
- create_deployment
- create_sagemaker_deployment
- create_azure_ml_deployment
- create_external_deployment
- create_registration_deployment
- update_deployment
- update_sagemaker_deployment
- update_azure_ml_deployment
- update_external_deployment
- update_registration_deployment
- update_deployment_description
- create_job_schedule
- test_job_schedule
- update_job_schedule
- deactivate_job_schedule
- activate_job_schedule
- predict
- explain
- get_request_logs
- get_prediction_logs
- get_one_prediction_log
- evaluate
- upload_actuals
- generate_metadata_json
- generate_model_reference_json
- generate_explainer_reference_json
- generate_transformer_reference_json
- deeploy.models.create_deployment_base
- CreateDeploymentBase
- deeploy.models.create_deployment
- CreateDeployment
- autoscaling_type
- model_serverless
- model_blob_credentials_id
- model_s3_temporary_access_key_id
- model_s3_temporary_secret_access_key
- model_s3_temporary_session_token
- model_azure_temporary_sas_token
- model_azure_temporary_storage_account
- model_databricks_temporary_access_token
- model_docker_credentials_id
- model_instance_type
- model_mem_request
- model_mem_limit
- model_cpu_request
- model_cpu_limit
- model_gpu_request
- model_environment_variable_ids
- model_args
- explainer_serverless
- explainer_blob_credentials_id
- explainer_s3_temporary_access_key_id
- explainer_s3_temporary_secret_access_key
- explainer_s3_temporary_session_token
- explainer_azure_temporary_sas_token
- explainer_azure_temporary_storage_account
- explainer_databricks_temporary_access_token
- explainer_docker_credentials_id
- explainer_instance_type
- explainer_mem_request
- explainer_mem_limit
- explainer_cpu_request
- explainer_cpu_limit
- explainer_gpu_request
- explainer_environment_variable_ids
- explainer_args
- transformer_serverless
- transformer_docker_credentials_id
- transformer_instance_type
- transformer_mem_request
- transformer_mem_limit
- transformer_cpu_request
- transformer_cpu_limit
- transformer_gpu_request
- transformer_environment_variable_ids
- transformer_args
- model_config
- are_valid_temporary_credentials
- to_request_body
- deeploy.models.create_sagemaker_deployment
- CreateSageMakerDeployment
- deeploy.models.create_azure_ml_deployment
- CreateAzureMLDeployment
- deeploy.models.create_non_managed_deployment_base
- CreateNonManagedDeploymentBase
- deeploy.models.create_external_deployment
- CreateExternalDeployment
- deeploy.models.create_registration_deployment
- CreateRegistrationDeployment
- deeploy.models.create_evaluation
- CreateEvaluation
- deeploy.models.create_explainer_reference
- CreateExplainerReference
- deeploy.models.create_model_reference
- CreateModelReference
- deeploy.models.create_transformer_reference
- CreateTransformerReference
- deeploy.models.update_deployment_base
- UpdateDeploymentBase
- deeploy.models.update_deployment
- UpdateDeployment
- autoscaling_type
- model_serverless
- model_blob_credentials_id
- model_s3_temporary_access_key_id
- model_s3_temporary_secret_access_key
- model_s3_temporary_session_token
- model_azure_temporary_sas_token
- model_azure_temporary_storage_account
- model_databricks_temporary_access_token
- model_docker_credentials_id
- model_instance_type
- model_mem_request
- model_mem_limit
- model_cpu_request
- model_cpu_limit
- model_gpu_request
- model_environment_variable_ids
- model_args
- explainer_serverless
- explainer_blob_credentials_id
- explainer_s3_temporary_access_key_id
- explainer_s3_temporary_secret_access_key
- explainer_s3_temporary_session_token
- explainer_azure_temporary_sas_token
- explainer_azure_temporary_storage_account
- explainer_databricks_temporary_access_token
- explainer_docker_credentials_id
- explainer_instance_type
- explainer_mem_request
- explainer_mem_limit
- explainer_cpu_request
- explainer_cpu_limit
- explainer_gpu_request
- explainer_environment_variable_ids
- explainer_args
- transformer_serverless
- transformer_docker_credentials_id
- transformer_instance_type
- transformer_mem_request
- transformer_mem_limit
- transformer_cpu_request
- transformer_cpu_limit
- transformer_gpu_request
- transformer_environment_variable_ids
- transformer_args
- model_config
- are_valid_temporary_credentials
- to_request_body
- deeploy.models.update_sagemaker_deployment
- UpdateSageMakerDeployment
- deeploy.models.update_azure_ml_deployment
- UpdateAzureMLDeployment
- deeploy.models.create_non_managed_deployment_base
- CreateNonManagedDeploymentBase
- deeploy.models.update_external_deployment
- UpdateExternalDeployment
- deeploy.models.update_registration_deployment
- UpdateRegistrationDeployment
- deeploy.models.get_prediction_logs_options
- GetPredictionLogsOptions
- deeploy.models.get_request_logs_options
- GetRequestLogsOptions
- deeploy.models.create_job_schedule
- CreateJobSchedule
- deeploy.models.update_job_schedule
- UpdateJobSchedule
- deeploy.models.test_job_schedule
- TestJobSchedule
- deeploy.enums.model_type
- ModelType
- ModelFrameworkVersion
- deeploy.enums.explainer_type
- ExplainerType
- ExplainerFrameworkVersion
- deeploy.enums.transformer_type
- TransformerType
- deeploy.enums.inference_endpoint
- InferenceEndpoint
deeploy.client¶
Client Objects¶
class Client(object)
A class for interacting with Deeploy
__init__¶
| __init__(host: str, workspace_id: str, access_key: str = None, secret_key: str = None, deployment_token: str = None) -> None
Initialise the Deeploy client
Arguments:
host
str - The host at which Deeploy is located, i.e. deeploy.example.com
workspace_id
str - The ID of the workspace in which your repository is located
access_key
str, optional - Personal Access Key generated from the Deeploy UI
secret_key
str, optional - Secret Access Key generated from the Deeploy UI
deployment_token
str, optional - Deployment token generated from the Deeploy UI
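A minimal usage sketch, assuming Client is importable from the top-level deeploy package and using placeholder values for the host, IDs and keys:
```python
from deeploy import Client  # assumed import path; the class is defined in deeploy.client

# Authenticate with a Personal Access Key pair generated in the Deeploy UI.
client = Client(
    host="deeploy.example.com",
    workspace_id="<workspace-uuid>",
    access_key="<personal-access-key>",
    secret_key="<secret-access-key>",
)

# Alternatively, use a Deployment token for calls that are not guarded by
# validate_personal_key, such as predict and explain.
token_client = Client(
    host="deeploy.example.com",
    workspace_id="<workspace-uuid>",
    deployment_token="<deployment-token>",
)
```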
validate_personal_key¶
| validate_personal_key(func)
create_environment_variable¶
| @validate_personal_key
| create_environment_variable(options: CreateEnvironmentVariable) -> EnvironmentVariable
Create an environment variable in a Workspace
Arguments:
options
CreateEnvironmentVariable - An instance of the CreateEnvironmentVariable class containing the environment variable configuration options
get_all_environment_variables¶
| @validate_personal_key
| get_all_environment_variables() -> List[EnvironmentVariable]
Get all environment variables from your Workspace
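For example, listing the Workspace's environment variables and collecting their IDs to reuse in a Deployment (a sketch; the id attribute name on EnvironmentVariable is an assumption):
```python
# Requires a client authenticated with a personal key pair.
env_vars = client.get_all_environment_variables()

# Collect IDs, e.g. to pass as model_environment_variable_ids when creating a Deployment.
# The `id` attribute name is assumed; inspect the returned objects if it differs.
env_var_ids = [env_var.id for env_var in env_vars]
```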
get_environment_variable_ids_for_deployment_artifact¶
| @validate_personal_key
| get_environment_variable_ids_for_deployment_artifact(deployment_id: str, artifact: Artifact) -> List[str]
Get the current environment variable IDs for an artifact of your Deployment. This method can be used to help update your Deployment.
Arguments:
deployment_id
str - The uuid of the Deployment of which to retrieve the environment variable IDs
artifact
Artifact - The artifact from which to retrieve the environment variable IDs
create_deployment¶
| @validate_personal_key
| create_deployment(options: CreateDeployment, local_repository_path: Optional[str] = None) -> Deployment
Create a Deployment on Deeploy
Arguments:
options
CreateDeployment - An instance of the CreateDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
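A hedged sketch of a basic call, assuming CreateDeployment and the enums are re-exported from deeploy.models and deeploy.enums (their defining modules are listed further down) and using placeholder IDs:
```python
from deeploy.models import CreateDeployment          # assumed re-export; defined in deeploy.models.create_deployment
from deeploy.enums import ModelType, ExplainerType   # assumed re-exports; defined in deeploy.enums.model_type / explainer_type

options = CreateDeployment(
    name="my-deployment",
    description="scikit-learn model deployed from the main branch",
    repository_id="<repository-uuid>",
    branch_name="main",
    model_type=ModelType.SKLEARN,
    explainer_type=ExplainerType.NO_EXPLAINER,
)

deployment = client.create_deployment(
    options=options,
    local_repository_path="/absolute/path/to/local/repo",  # optional repository check
)
```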
create_sagemaker_deployment¶
| @validate_personal_key
| create_sagemaker_deployment(options: CreateSageMakerDeployment, local_repository_path: Optional[str] = None) -> Deployment
Create a SageMaker Deployment on Deeploy
Arguments:
options
CreateSageMakerDeployment - An instance of the CreateSageMakerDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
create_azure_ml_deployment¶
| @validate_personal_key
| create_azure_ml_deployment(options: CreateAzureMLDeployment, local_repository_path: Optional[str] = None) -> Deployment
Create an Azure Machine Learning Deployment on Deeploy
Arguments:
options
CreateAzureMLDeployment - An instance of the CreateAzureMLDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
create_external_deployment¶
| @validate_personal_key
| create_external_deployment(options: CreateExternalDeployment, local_repository_path: Optional[str] = None) -> Deployment
Create an external Deployment on Deeploy
Arguments:
options
CreateExternalDeployment - An instance of the CreateExternalDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
create_registration_deployment¶
| @validate_personal_key
| create_registration_deployment(options: CreateRegistrationDeployment, local_repository_path: Optional[str] = None) -> Deployment
Create a Registration Deployment on Deeploy
Arguments:
options
CreateRegistrationDeployment - An instance of the CreateRegistrationDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
update_deployment¶
| @validate_personal_key
| update_deployment(deployment_id: str, options: UpdateDeployment, local_repository_path: Optional[str] = None) -> Deployment
Update a Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateDeployment - An instance of the UpdateDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
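A sketch of rolling a Deployment forward to a new commit. Which other fields UpdateDeployment requires for your setup (for example deployment_type or model_type) may vary, so treat this only as the shape of the call:
```python
from deeploy.models import UpdateDeployment  # assumed re-export; defined in deeploy.models.update_deployment

options = UpdateDeployment(
    branch_name="main",
    commit="<commit-sha>",
)

deployment = client.update_deployment(
    deployment_id="<deployment-uuid>",
    options=options,
)
```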
update_sagemaker_deployment¶
| @validate_personal_key
| update_sagemaker_deployment(deployment_id: str, options: UpdateSageMakerDeployment, local_repository_path: Optional[str] = None) -> Deployment
Update a SageMaker Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateSageMakerDeployment - An instance of the UpdateSageMakerDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
update_azure_ml_deployment¶
| @validate_personal_key
| update_azure_ml_deployment(deployment_id: str, options: UpdateAzureMLDeployment, local_repository_path: Optional[str] = None) -> Deployment
Update an Azure Machine Learning Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateAzureMLDeployment - An instance of the UpdateAzureMLDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
update_external_deployment¶
| @validate_personal_key
| update_external_deployment(deployment_id: str, options: UpdateExternalDeployment, local_repository_path: Optional[str] = None) -> Deployment
Update an external Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateExternalDeployment - An instance of the UpdateExternalDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
update_registration_deployment¶
| @validate_personal_key
| update_registration_deployment(deployment_id: str, options: UpdateRegistrationDeployment, local_repository_path: Optional[str] = None) -> Deployment
Update a Registration Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateRegistrationDeployment - An instance of the UpdateRegistrationDeployment class containing the deployment configuration options
local_repository_path
str, optional - Absolute path to the local git repository that is connected to Deeploy; used to check whether your Repository is present in the Workspace
update_deployment_description¶
| @validate_personal_key
| update_deployment_description(deployment_id: str, options: UpdateDeploymentDescription) -> Deployment
Update the description of a Deployment on Deeploy
Arguments:
deployment_id
str - The uuid of the Deployment to update
options
UpdateDeploymentDescription - An instance of the UpdateDeploymentDescription class containing the deployment description options
create_job_schedule¶
| @validate_personal_key
| create_job_schedule(options: CreateJobSchedule) -> List[Dict]
Create a job schedule in a Workspace
Arguments:
options
CreateJobSchedule - An instance of the CreateJobSchedule class containing the job schedule configuration options
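For example, scheduling a daily predict job against a Deployment (a sketch with placeholder IDs; the import path is assumed):
```python
from deeploy.models import CreateJobSchedule  # assumed re-export; defined in deeploy.models.create_job_schedule

schedule = CreateJobSchedule(
    name="daily-batch-predict",
    cron_expression="0 6 * * *",          # every day at 06:00
    deployment_id="<deployment-uuid>",
    # endpoint defaults to predict (see CreateJobSchedule below)
)

client.create_job_schedule(options=schedule)
```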
test_job_schedule¶
| @validate_personal_key
| test_job_schedule(options: TestJobSchedule) -> JobSchedule
Test a job schedule in a Workspace
Arguments:
options
TestJobSchedule - An instance of the TestJobSchedule class containing the test job schedule configuration options
update_job_schedule¶
| @validate_personal_key
| update_job_schedule(job_schedule_id: str, options: UpdateJobSchedule) -> JobSchedule
Update a job schedule in a Workspace
Arguments:
job_schedule_id
str - The uuid of the job schedule to update
options
UpdateJobSchedule - An instance of the UpdateJobSchedule class containing the job schedule configuration options
deactivate_job_schedule¶
| @validate_personal_key
| deactivate_job_schedule(job_schedule_id: str) -> JobSchedule
Deactivate a job schedule in a Workspace
Arguments:
job_schedule_id
str - The uuid of the job schedule to deactivate
activate_job_schedule¶
| @validate_personal_key
| activate_job_schedule(job_schedule_id: str) -> JobSchedule
Activate a job schedule in a Workspace
Arguments:
job_schedule_id
str - The uuid of the job schedule to activate
predict¶
| predict(deployment_id: str, request_body: dict) -> V1Prediction or V2Prediction
Make a predict call
Arguments:
deployment_id
str - ID of the Deeploy deployment
request_body
dict - Request body with input data for the model
explain¶
| explain(deployment_id: str, request_body: dict, image: bool = False) -> object
Make an explain call
Arguments:
deployment_id
str - ID of the Deeploy deployment
request_body
dict - Request body with input data for the model
image
bool, optional - Whether to return the explanation as an image. Defaults to False
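A sketch of both inference calls; the request body must follow the inference protocol of the deployed model, so the V1-style "instances" payload below is only an assumption:
```python
request_body = {"instances": [[5.1, 3.5, 1.4, 0.2]]}  # shape depends on your model

prediction = client.predict(
    deployment_id="<deployment-uuid>",
    request_body=request_body,
)

explanation = client.explain(
    deployment_id="<deployment-uuid>",
    request_body=request_body,
    image=False,  # set to True to return the explanation as an image
)
```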
get_request_logs¶
| get_request_logs(deployment_id: str, params: GetRequestLogsOptions) -> List[RequestLog]
Retrieve request logs
Arguments:
deployment_id
str - ID of the Deeploy deployment
params
GetRequestLogsOptions - An instance of the GetRequestLogsOptions class containing the params used for the retrieval of request logs
get_prediction_logs¶
| get_prediction_logs(deployment_id: str, params: GetPredictionLogsOptions) -> List[PredictionLog]
Retrieve prediction logs
Arguments:
deployment_id
str - ID of the Deeploy deployment
params
GetPredictionLogsOptions - An instance of the GetPredictionLogsOptions class containing the params used for the retrieval of prediction logs
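For example, retrieving the first page of prediction logs (a sketch; filter fields such as sort, status and evaluation accept RHS-syntax strings validated by GetPredictionLogsOptions and are omitted here because their exact format is version specific):
```python
from deeploy.models import GetPredictionLogsOptions  # assumed re-export

params = GetPredictionLogsOptions(limit=10, offset=0)

logs = client.get_prediction_logs(
    deployment_id="<deployment-uuid>",
    params=params,
)
```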
get_one_prediction_log¶
| get_one_prediction_log(deployment_id: str, request_log_id: str, prediction_log_id: str) -> PredictionLog
Deprecated in favor of get_prediction_logs
Retrieve one prediction log
Arguments:
deployment_id
str - ID of the Deeploy deployment
request_log_id
str - ID of the request_log containing the prediction
prediction_log_id
str - ID of the prediction_log to be retrieved
evaluate¶
| evaluate(deployment_id: str, prediction_log_id: str, evaluation_input: CreateEvaluation) -> Evaluation
Evaluate a prediction log
Arguments:
deployment_id
str - ID of the Deeploy deployment
prediction_log_id
str - ID of the prediction log to be evaluated
evaluation_input
CreateEvaluation - An instance of the CreateEvaluation class containing the evaluation input
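A sketch of submitting an evaluation that disagrees with a prediction (the shape of desired_output is model specific and the import path is assumed):
```python
from deeploy.models import CreateEvaluation  # assumed re-export; defined in deeploy.models.create_evaluation

evaluation_input = CreateEvaluation(
    agree=False,
    desired_output={"predictions": [0]},  # hypothetical output shape
    comment="Label should be 0 for this input.",
)

evaluation = client.evaluate(
    deployment_id="<deployment-uuid>",
    prediction_log_id="<prediction-log-uuid>",
    evaluation_input=evaluation_input,
)
```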
upload_actuals¶
| upload_actuals(deployment_id: str, actuals_input: CreateActuals) -> List[ActualResponse]
Upload actuals for prediction logs
Arguments:
deployment_id
str - ID of the Deeploy deployment
actuals_input
CreateActuals - An instance of the CreateActuals class containing the prediction log IDs and corresponding actuals
generate_metadata_json¶
| generate_metadata_json(target_path: str, metadata_input: dict) -> str
Generate a metadata.json file
Arguments:
target_path
str - Absolute path to the directory in which the metadata.json should be saved.
metadata_input
dict, Metadata - The keys and values you would like to include in your metadata.json
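For example, writing a metadata.json into a local repository clone (the keys shown are illustrative only; include whatever keys your metadata.json should carry):
```python
metadata_path = client.generate_metadata_json(
    target_path="/absolute/path/to/local/repo",
    metadata_input={
        "problemType": "classification",   # illustrative key
        "features": [{"name": "age"}],     # illustrative key
    },
)
```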
generate_model_reference_json¶
| generate_model_reference_json(target_path: str, reference_input: CreateModelReference) -> ModelReferenceJson
Generate a reference.json file for your model
Arguments:
target_path
str - Absolute path to the directory in which the model directory with reference.json file should be saved.
reference_input
CreateModelReference - An instance of the CreateModelReference class containing the configuration options of your model
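A hedged sketch of generating the model reference.json from a blob location; the BlobReference import and its url field name are assumptions, so check the reference classes shipped with your SDK version:
```python
from deeploy.models import CreateModelReference, BlobReference  # assumed re-exports

reference = client.generate_model_reference_json(
    target_path="/absolute/path/to/local/repo",
    reference_input=CreateModelReference(
        blob=BlobReference(url="s3://my-bucket/model/"),  # field name assumed
    ),
)
```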
generate_explainer_reference_json¶
| generate_explainer_reference_json(target_path: str, reference_input: CreateExplainerReference) -> ExplainerReferenceJson
Generate a reference.json file for your explainer
Arguments:
target_path
str - Absolute path to the directory in which the explainer directory with reference.json file should be saved.
reference_input
CreateExplainerReference - An instance of the CreateExplainerReference class containing the configuration options of your explainer
generate_transformer_reference_json¶
| generate_transformer_reference_json(target_path: str, reference_input: CreateTransformerReference) -> TransformerReferenceJson
Generate a reference.json file for your transformer
Arguments:
target_path
str - Absolute path to the directory in which the transformer directory with reference.json file should be saved.
reference_input
CreateTransformerReference - An instance of the CreateTransformerReference class containing the configuration options of your transformer
deeploy.models.create_deployment_base¶
CreateDeploymentBase Objects¶
class CreateDeploymentBase(BaseModel)
Class that contains the base options for creating a Deployment
name¶
str: name of the Deployment
description¶
str, optional: the description of the Deployment
repository_id¶
str: uuid of the Repository
branch_name¶
str: the branch name of the Repository to deploy
commit¶
str, optional: the commit sha on the selected branch. If no commit is provided, the latest commit will be used
contract_path¶
str, optional: relative repository subpath that contains the Deeploy contract to deploy from
risk_classification¶
str, optional: enum value from RiskClassification class
model_type¶
int: enum value from ModelType class
model_framework_version¶
string: enum value from ModelFrameworkVersion class
explainer_type¶
int, optional: enum value from ExplainerType class. Defaults to 0 (no explainer)
explainer_framework_version¶
string: enum value from ExplainerFrameworkVersion class
transformer_type¶
int, optional: enum value from TransformerType class. Defaults to 0 (no transformer)
model_config¶
approver_user_ids¶
List, optional: list of user UUIDs that are requested to submit approval for this version
message_to_approvers¶
str, optional: a message to the requested approvers (only relevant if approver_user_ids is defined)
checklist_template_ids¶
List, optional: list of checklist template UUIDs that will be added to the deployment
documentation_template_ids¶
List, optional: list of documentation template UUIDs that will be added to the deployment
to_request_body¶
| to_request_body(deployment_type: DeploymentType) -> Dict
deeploy.models.create_deployment¶
CreateDeployment Objects¶
class CreateDeployment(CreateDeploymentBase)
Class that contains the options for creating a deployment
autoscaling_type¶
int, optional: enum value from AutoScalingType class. Defaults to None (no autoscaling).
model_serverless¶
bool, optional: whether to deploy the model in a serverless fashion. Defaults to False
model_blob_credentials_id¶
str, optional: uuid of credentials generated in Deeploy to access private Blob storage
model_s3_temporary_access_key_id¶
str, optional: the temporary AWS access key ID to access the model in S3
model_s3_temporary_secret_access_key¶
str, optional: the temporary AWS secret access key to access the model in S3
model_s3_temporary_session_token¶
str, optional: the temporary AWS session token to access the model in S3
model_azure_temporary_sas_token¶
str, optional: the temporary Azure SAS token to access the model in the Azure Blob Storage
model_azure_temporary_storage_account¶
str, optional: the temporary Azure storage account name to access the model in the Azure Blob Storage
model_databricks_temporary_access_token¶
str, optional: the temporary Databricks access token to access the model in the Databricks Unity Catalog
model_docker_credentials_id¶
str, optional: uuid of credentials generated in Deeploy to access private Docker repo
model_instance_type¶
str, optional: the preferred instance type for the model
model_mem_request¶
int, optional: RAM request of model pod, in Megabytes.
model_mem_limit¶
int, optional: RAM limit of model pod, in Megabytes.
model_cpu_request¶
float, optional: CPU request of model pod, in number of cores.
model_cpu_limit¶
float, optional: CPU limit of model pod, in number of cores.
model_gpu_request¶
float, optional: GPU request of model pod, in number of GPUs.
model_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the model container as environment variables
model_args¶
dict, optional: arguments to pass to the model container; key is the argument name, value is the argument value
explainer_serverless¶
bool, optional: whether to deploy the explainer in a serverless fashion. Defaults to False
explainer_blob_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Blob storage
explainer_s3_temporary_access_key_id¶
str, optional: the temporary AWS access key ID to access the explainer in S3
explainer_s3_temporary_secret_access_key¶
str, optional: the temporary AWS secret access key to access the explainer in S3
explainer_s3_temporary_session_token¶
str, optional: the temporary AWS session token to access the explainer in S3
explainer_azure_temporary_sas_token¶
str, optional: the temporary Azure SAS token to access the explainer in the Azure Blob Storage
explainer_azure_temporary_storage_account¶
str, optional: the temporary Azure storage account name to access the explainer in the Azure Blob Storage
explainer_databricks_temporary_access_token¶
str, optional: the temporary Databricks access token to access the explainer in the Databricks Unity Catalog
explainer_docker_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Docker repo
explainer_instance_type¶
str, optional: The preferred instance type for the explainer pod.
explainer_mem_request¶
int, optional: RAM request of explainer pod, in Megabytes.
explainer_mem_limit¶
int, optional: RAM limit of explainer pod, in Megabytes.
explainer_cpu_request¶
float, optional: CPU request of explainer pod, in number of cores.
explainer_cpu_limit¶
float, optional: CPU limit of explainer pod, in number of cores.
explainer_gpu_request¶
float, optional: GPU request of explainer pod, in number of GPUs.
explainer_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the explainer container as environment variables
explainer_args¶
dict, optional: arguments to pass to the explainer container; key is the argument name, value is the argument value
transformer_serverless¶
bool, optional: whether to deploy the transformer in a serverless fashion. Defaults to False
transformer_docker_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Docker repo
transformer_instance_type¶
str, optional: The preferred instance type for the transformer pod.
transformer_mem_request¶
int, optional: RAM request of transformer pod, in Megabytes.
transformer_mem_limit¶
int, optional: RAM limit of transformer pod, in Megabytes.
transformer_cpu_request¶
float, optional: CPU request of transformer pod, in number of cores.
transformer_cpu_limit¶
float, optional: CPU limit of transformer pod, in number of cores.
transformer_gpu_request¶
float, optional: GPU request of transformer pod, in number of GPUs.
transformer_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the transformer container as environment variables
transformer_args¶
dict, optional: arguments to pass to the transformer container; key is the argument name, value is the argument value
model_config¶
are_valid_temporary_credentials¶
| @model_validator(mode="before")
| are_valid_temporary_credentials(cls, values)
to_request_body¶
| to_request_body() -> Dict
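For instance, a hedged sketch that pins resources on the model pod using the fields documented above (imports are assumed re-exports and the IDs are placeholders):
```python
from deeploy.models import CreateDeployment          # assumed re-export
from deeploy.enums import ModelType, ExplainerType   # assumed re-exports

options = CreateDeployment(
    name="resource-tuned-deployment",
    repository_id="<repository-uuid>",
    branch_name="main",
    model_type=ModelType.SKLEARN,
    explainer_type=ExplainerType.NO_EXPLAINER,
    model_serverless=False,
    model_mem_request=512,    # Megabytes
    model_mem_limit=1024,     # Megabytes
    model_cpu_request=0.5,    # cores
    model_cpu_limit=1.0,      # cores
)
```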
deeploy.models.create_sagemaker_deployment¶
CreateSageMakerDeployment Objects¶
class CreateSageMakerDeployment(CreateDeploymentBase)
Class that contains the options for creating a SageMaker deployment
region¶
str, optional: the AWS region used for this Deployment
model_instance_type¶
str, optional: the preferred instance type for the model
explainer_instance_type¶
str, optional: The preferred instance type for the explainer
transformer_instance_type¶
str, optional: The preferred instance type for the transformer
model_config¶
to_request_body¶
| to_request_body() -> Dict
deeploy.models.create_azure_ml_deployment¶
CreateAzureMLDeployment Objects¶
class CreateAzureMLDeployment(CreateDeploymentBase)
Class that contains the options for creating an Azure Machine Learning deployment
model_instance_type¶
str, optional: the preferred instance type for the model
model_instance_count¶
int, optional: the amount of compute instances used for your model deployment
explainer_instance_type¶
str, optional: The preferred instance type for the explainer
explainer_instance_count¶
int, optional: the amount of compute instances used for your explainer deployment
model_config¶
to_request_body¶
| to_request_body() -> Dict
deeploy.models.create_non_managed_deployment_base¶
CreateNonManagedDeploymentBase Objects¶
class CreateNonManagedDeploymentBase(BaseModel)
Class that contains the base options for creating a Deployment
name¶
str: name of the Deployment
description¶
str, optional: the description of the Deployment
repository_id¶
str, optional: uuid of the Repository
branch_name¶
str, optional: the branch name of the Repository to deploy
commit¶
str, optional: the commit sha on the selected branch. If no commit is provided, the latest commit will be used
contract_path¶
str, optional: relative repository subpath that contains the Deeploy contract to deploy from
risk_classification¶
str, optional: enum value from RiskClassification class
approver_user_ids¶
List, optional: list of user UUIDs that are requested to submit approval for this version
message_to_approvers¶
str, optional: a message to the requested approvers (only relevant if approver_user_ids is defined)
documentation_template_ids¶
List, optional: list of documentation template UUIDs that will be added to the deployment
to_request_body¶
| to_request_body(deployment_type: DeploymentType) -> Dict
deeploy.models.create_external_deployment¶
CreateExternalDeployment Objects¶
class CreateExternalDeployment(CreateNonManagedDeploymentBase)
Class that contains the options for creating an external deployment
url¶
str, optional: url endpoint of external deployment
authentication¶
str: enum value from ExternalUrlAuthenticationMethod class.
username¶
str, optional: username header for basic authentication
custom_header¶
str, optional: custom header for custom authentication
password¶
str, optional: password/bearer token/key for basic/bearer/custom authentication
authentication_is_set¶
| @model_validator(mode="before")
| authentication_is_set(cls, values)
to_request_body¶
| to_request_body() -> Dict
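A sketch of an external deployment secured with basic authentication; the ExternalUrlAuthenticationMethod import and its BASIC member name are assumptions:
```python
from deeploy.models import CreateExternalDeployment        # assumed re-export
from deeploy.enums import ExternalUrlAuthenticationMethod  # assumed re-export

options = CreateExternalDeployment(
    name="externally-hosted-model",
    url="https://models.example.com/churn/predict",
    authentication=ExternalUrlAuthenticationMethod.BASIC,  # assumed member name
    username="service-account",
    password="<password-or-token>",
)

deployment = client.create_external_deployment(options=options)
```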
deeploy.models.create_registration_deployment¶
CreateRegistrationDeployment Objects¶
class CreateRegistrationDeployment(CreateNonManagedDeploymentBase)
Class that contains the options for creating a registration deployment
to_request_body¶
| to_request_body() -> Dict
deeploy.models.create_evaluation¶
CreateEvaluation Objects¶
class CreateEvaluation(BaseModel)
Class that contains the options for creating an Evaluation
agree¶
bool: whether the evaluator agrees or disagrees with the correctness of the prediction
desired_output¶
dict, optional: the desired output of the model presented by the expert
comment¶
str, optional: an optional comment/explanation on the evaluation
to_request_body¶
| to_request_body() -> Dict
deeploy.models.create_explainer_reference¶
CreateExplainerReference Objects¶
class CreateExplainerReference(BaseModel)
Class that contains the options for creating a reference.json for an explainer
docker¶
DockerReference: docker configuration object of the explainer
blob¶
BlobReference: blob configuration object of the explainer
mlflow¶
MLFlowReference: mlflow configuration object of the explainer
azure_ml¶
AzureMLReference: azure machine learning configuration object of the explainer
databricks¶
DatabricksReference: databricks unity catalog configuration object of the explainer
get_reference¶
| get_reference() -> ExplainerReference
deeploy.models.create_model_reference¶
CreateModelReference Objects¶
class CreateModelReference(BaseModel)
Class that contains the options for creating a reference.json for a model
docker¶
DockerReference: docker configuration object of the model
blob¶
BlobReference: blob configuration object of the model
mlflow¶
MLFlowReference: mlflow configuration object of the model
azure_ml¶
AzureMLReference: azure machine learning configuration object of the model
databricks¶
DatabricksReference: databricks unity catalog configuration object of the model
get_reference¶
| get_reference() -> ModelReference
deeploy.models.create_transformer_reference¶
CreateTransformerReference Objects¶
class CreateTransformerReference(BaseModel)
Class that contains the options for creating a reference.json for a transformer
docker¶
DockerReference: docker configuration object of the transformer
blob¶
BlobReference: blob configuration object of the transformer
get_reference¶
| get_reference() -> TransformerReference
deeploy.models.update_deployment_base¶
UpdateDeploymentBase Objects¶
class UpdateDeploymentBase(BaseModel)
Class that contains the base options for updating a Deployment
deployment_type¶
str: enum value from DeploymentType class
repository_id¶
str, optional: uuid of the Repository
branch_name¶
str, optional: the branch name of the Repository to deploy
commit¶
str, optional: the commit sha on the selected branch
contract_path¶
str, optional: relative repository subpath that contains the Deeploy contract to deploy from
model_type¶
int: enum value from ModelType class
model_framework_version¶
string: enum value from ModelFrameworkVersion class
explainer_type¶
int, optional: enum value from ExplainerType class. Defaults to 0 (no explainer)
explainer_framework_version¶
string: enum value from ExplainerFrameworkVersion class
transformer_type¶
int, optional: enum value from TransformerType class. Defaults to 0 (no transformer)
model_config¶
approver_user_ids¶
List, optional: list of user UUIDs that are requested to submit approval for this version
message_to_approvers¶
str, optional: a message to the requested approvers (only relevant if approver_user_ids is defined)
to_request_body¶
| to_request_body() -> Dict
deeploy.models.update_deployment¶
UpdateDeployment Objects¶
class UpdateDeployment(UpdateDeploymentBase)
Class that contains the options for updating a Deployment
autoscaling_type¶
int, optional: enum value from AutoScalingType class. Defaults to None (no autoscaling).
model_serverless¶
bool, optional: whether to deploy the model in a serverless fashion. Defaults to False
model_blob_credentials_id¶
str, optional: uuid of credentials generated in Deeploy to access private Blob storage
model_s3_temporary_access_key_id¶
str, optional: the temporary AWS access key ID to access the model in S3
model_s3_temporary_secret_access_key¶
str, optional: the temporary AWS secret access key to access the model in S3
model_s3_temporary_session_token¶
str, optional: the temporary AWS session token to access the model in S3
model_azure_temporary_sas_token¶
str, optional: the temporary Azure SAS token to access the model in the Azure Blob Storage
model_azure_temporary_storage_account¶
str, optional: the temporary Azure storage account name to access the model in the Azure Blob Storage
model_databricks_temporary_access_token¶
str, optional: the temporary Databricks access token to access the model in the Databricks Unity Catalog
model_docker_credentials_id¶
str, optional: uuid of credentials generated in Deeploy to access private Docker repo
model_instance_type¶
str, optional: the preferred instance type for the model
model_mem_request¶
int, optional: RAM request of model pod, in Megabytes.
model_mem_limit¶
int, optional: RAM limit of model pod, in Megabytes.
model_cpu_request¶
float, optional: CPU request of model pod, in number of cores.
model_cpu_limit¶
float, optional: CPU limit of model pod, in number of cores.
model_gpu_request¶
float, optional: GPU request of model pod, in number of GPUs.
model_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the model container as environment variables
model_args¶
dict, optional: arguments to pass to the model container; key is the argument name, value is the argument value
explainer_serverless¶
bool, optional: whether to deploy the explainer in a serverless fashion. Defaults to False
explainer_blob_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Blob storage
explainer_s3_temporary_access_key_id¶
str, optional: the temporary AWS access key ID to access the explainer in S3
explainer_s3_temporary_secret_access_key¶
str, optional: the temporary AWS secret access key to access the explainer in S3
explainer_s3_temporary_session_token¶
str, optional: the temporary AWS session token to access the explainer in S3
explainer_azure_temporary_sas_token¶
str, optional: the temporary Azure SAS token to access the explainer in the Azure Blob Storage
explainer_azure_temporary_storage_account¶
str, optional: the temporary Azure storage account name to access the explainer in the Azure Blob Storage
explainer_databricks_temporary_access_token¶
str, optional: the temporary Databricks access token to access the explainer in the Databricks Unity Catalog
explainer_docker_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Docker repo
explainer_instance_type¶
str, optional: The preferred instance type for the explainer pod.
explainer_mem_request¶
int, optional: RAM request of explainer pod, in Megabytes.
explainer_mem_limit¶
int, optional: RAM limit of explainer pod, in Megabytes.
explainer_cpu_request¶
float, optional: CPU request of explainer pod, in number of cores.
explainer_cpu_limit¶
float, optional: CPU limit of explainer pod, in number of cores.
explainer_gpu_request¶
float, optional: GPU request of explainer pod, in number of GPUs.
explainer_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the explainer container as environment variables
explainer_args¶
dict, optional: arguments to pass to the explainer container; key is the argument name, value is the argument value
transformer_serverless¶
bool, optional: whether to deploy the transformer in a serverless fashion. Defaults to False
transformer_docker_credentials_id¶
str, optional: Credential id of credential generated in Deeploy to access private Docker repo
transformer_instance_type¶
str, optional: The preferred instance type for the transformer pod.
transformer_mem_request¶
int, optional: RAM request of transformer pod, in Megabytes.
transformer_mem_limit¶
int, optional: RAM limit of transformer pod, in Megabytes.
transformer_cpu_request¶
float, optional: CPU request of transformer pod, in number of cores.
transformer_cpu_limit¶
float, optional: CPU limit of transformer pod, in number of cores.
transformer_gpu_request¶
float, optional: GPU request of transformer pod, in number of GPUs.
transformer_environment_variable_ids¶
list, optional: environment variable IDs of which the key and value will be passed to the transformer container as environment variables
transformer_args¶
dict, optional: arguments to pass to the transformer container; key is the argument name, value is the argument value
model_config¶
are_valid_temporary_credentials¶
| @model_validator(mode="before")
| are_valid_temporary_credentials(cls, values)
to_request_body¶
| to_request_body() -> Dict
deeploy.models.update_sagemaker_deployment¶
UpdateSageMakerDeployment Objects¶
class UpdateSageMakerDeployment(UpdateDeploymentBase)
Class that contains the options for updating a SageMaker Deployment
region¶
str, optional: the AWS region used for this Deployment
model_instance_type¶
str, optional: the preferred instance type for the model
explainer_instance_type¶
str, optional: The preferred instance type for the explainer
transformer_instance_type¶
str, optional: The preferred instance type for the transformer
model_config¶
to_request_body¶
| to_request_body() -> Dict
deeploy.models.update_azure_ml_deployment¶
UpdateAzureMLDeployment Objects¶
class UpdateAzureMLDeployment(UpdateDeploymentBase)
Class that contains the options for updating an Azure Machine Learning Deployment
model_instance_type¶
str, optional: the preferred instance type for the model
model_instance_count¶
int, optional: the amount of compute instances used for your model deployment
explainer_instance_type¶
str, optional: The preferred instance type for the explainer
explainer_instance_count¶
int, optional: the amount of compute instances used for your explainer deployment
model_config¶
to_request_body¶
| to_request_body() -> Dict
deeploy.models.create_non_managed_deployment_base¶
CreateNonManagedDeploymentBase Objects¶
class CreateNonManagedDeploymentBase(BaseModel)
Class that contains the base options for creating a Deployment
name¶
str: name of the Deployment
description¶
str, optional: the description of the Deployment
repository_id¶
str, optional: uuid of the Repository
branch_name¶
str, optional: the branch name of the Repository to deploy
commit¶
str, optional: the commit sha on the selected branch. If no commit is provided, the latest commit will be used
contract_path¶
str, optional: relative repository subpath that contains the Deeploy contract to deploy from
risk_classification¶
str, optional: enum value from RiskClassification class
approver_user_ids¶
List, optional: list of user UUIDs that are requested to submit approval for this version
message_to_approvers¶
str, optional: a message to the requested approvers (only relevant if approver_user_ids is defined)
documentation_template_ids¶
List, optional: list of documentation template UUIDs that will be added to the deployment
to_request_body¶
| to_request_body(deployment_type: DeploymentType) -> Dict
deeploy.models.update_external_deployment¶
UpdateExternalDeployment Objects¶
class UpdateExternalDeployment(UpdateNonManagedDeploymentBase)
Class that contains the options for updating an External Deployment
url¶
str, optional: url endpoint of external deployment
authentication¶
str, optional: enum value from ExternalUrlAuthenticationMethod class.
username¶
str, optional: username for basic authentication
custom_header¶
str, optional: custom header for custom authentication
password¶
str, optional: password/bearer token/key for basic/bearer/custom authentication
authentication_is_set¶
| @model_validator(mode="before")
| authentication_is_set(cls, values)
to_request_body¶
| to_request_body() -> Dict
deeploy.models.update_registration_deployment¶
UpdateRegistrationDeployment Objects¶
class UpdateRegistrationDeployment(UpdateNonManagedDeploymentBase)
Class that contains the options for updating a Registration Deployment
to_request_body¶
| to_request_body() -> Dict
deeploy.models.get_prediction_logs_options¶
GetPredictionLogsOptions Objects¶
class GetPredictionLogsOptions(BaseModel)
Class that contains the options for retrieving prediction logs from a Deployment
start¶
int, optional: the start timestamp of the creation of the prediction log, presented as a unix timestamp in milliseconds
end¶
int, optional: the end timestamp of the creation of the prediction log, presented as a unix timestamp in milliseconds
offset¶
int, optional: skips the first (offset × limit) prediction logs
limit¶
int, optional: the maximum number of prediction logs to retrieve in one call
sort¶
str, optional: the sorting applied to the retrieved prediction logs
custom_id¶
str, optional: the custom ID associated to the prediction logs
request_log_id¶
str, optional: the uuid of the request log
id¶
str, optional: the uuid of the prediction log
prediction_class¶
str, optional: the prediction class from your metadata.json used to filter the prediction logs based on the value of their response body
actual¶
str, optional: whether the actual is available on the prediction log
evaluation¶
str, optional: the evaluation status of the prediction log
status¶
str, optional: the status of the request log of the prediction log
endpoint_type¶
str, optional: the endpoint type of the prediction log
sort_must_follow_rhs_syntax¶
| @field_validator("sort")
| @classmethod
| sort_must_follow_rhs_syntax(cls, value: str) -> str
must_be_valid_rhs_syntax¶
| @field_validator(
| "custom_id",
| "prediction_class",
| "actual",
| "evaluation",
| "status",
| "request_log_id",
| "id",
| "endpoint_type",
| )
| @classmethod
| must_be_valid_rhs_syntax(cls, value: str) -> str
to_params¶
| to_params() -> Dict
deeploy.models.get_request_logs_options¶
GetRequestLogsOptions Objects¶
class GetRequestLogsOptions(BaseModel)
Class that contains the options for retrieving request logs from a Deployment
start¶
int, optional: the start timestamp of the creation of the request log, presented as a unix timestamp in milliseconds
end¶
int, optional: the end timestamp of the creation of the request log, presented as a unix timestamp in milliseconds
offset¶
int, optional: skips the first (offset × limit) request logs
limit¶
int, optional: the maximum number of request logs to retrieve in one call
sort¶
str, optional: the sorting applied to the retrieved request logs
status¶
str, optional: the status of the request log
commit¶
str, optional: the commit of the request log
status_code¶
str, optional: the status code of the request log
sort_must_follow_rhs_syntax¶
| @field_validator("sort")
| @classmethod
| sort_must_follow_rhs_syntax(cls, value: str) -> str
must_be_valid_rhs_syntax¶
| @field_validator(
| "status",
| "commit",
| "status_code",
| )
| @classmethod
| must_be_valid_rhs_syntax(cls, value: str) -> str
to_params¶
| to_params() -> Dict
deeploy.models.create_job_schedule¶
CreateJobSchedule Objects¶
class CreateJobSchedule(BaseModel)
Class that contains the options for creating a job schedule
name¶
str: a unique name for the job schedule
cron_expression¶
str: the cron expression to decide how often the job should be executed
deployment_id¶
str: the uuid of the Deployment which the job schedule should target
endpoint¶
str: which endpoint the scheduled jobs should call. Defaults to predict
to_request_body¶
| to_request_body() -> Dict
deeploy.models.update_job_schedule¶
UpdateJobSchedule Objects¶
class UpdateJobSchedule(BaseModel)
Class that contains the options for updating a job schedule
name¶
str, optional: a unique name for the job schedule
cron_expression¶
str, optional: the cron expression to decide how often the job should be executed
deployment_id¶
str, optional: the uuid of the Deployment which the job schedule should target
endpoint¶
str: which endpoint the scheduled jobs should call
to_request_body¶
| to_request_body() -> Dict
deeploy.models.test_job_schedule¶
TestJobSchedule Objects¶
class TestJobSchedule(BaseModel)
Class that contains the options for testing a job schedule
deployment_id¶
str: the uuid of the Deployment which the test job schedule should target
endpoint¶
str: which endpoint the test job should call. Defaults to predict
to_request_body¶
| to_request_body() -> Dict
deeploy.enums.model_type¶
ModelType Objects¶
class ModelType(Enum)
Class that contains model types
TENSORFLOW¶
PYTORCH¶
SKLEARN¶
XGBOOST¶
ONNX¶
TRITON¶
CUSTOM¶
LIGHTGBM¶
PMML¶
HUGGINGFACE¶
ModelFrameworkVersion Objects¶
class ModelFrameworkVersion(Enum)
Class that contains model framework versions
XGBOOST_CURRENT¶
SKLEARN_CURRENT¶
LIGHTGBM_CURRENT¶
XGBOOST_1_7_5¶
deeploy.enums.explainer_type¶
ExplainerType Objects¶
class ExplainerType(Enum)
Class that contains explainer types
NO_EXPLAINER¶
ANCHOR_TABULAR¶
ANCHOR_IMAGES¶
ANCHOR_TEXT¶
SHAP_KERNEL¶
PDP_TABULAR¶
MACE_TABULAR¶
CUSTOM¶
INTEGRATED_EXPLAINER¶
SHAP_TREE¶
SALIENCY¶
ATTENTION¶
ExplainerFrameworkVersion Objects¶
class ExplainerFrameworkVersion(Enum)
Class that contains explainer framework versions
ALIBI_CURRENT¶
SHAP_CURRENT¶
OMNIXAI_CURRENT¶
SHAP_0_42¶
deeploy.enums.transformer_type¶
TransformerType Objects¶
class TransformerType(Enum)
Class that contains transformer types
NO_TRANSFORMER¶
CUSTOM¶
deeploy.enums.inference_endpoint¶
InferenceEndpoint Objects¶
class InferenceEndpoint(Enum)
Class that contains the inference endpoint options