sagemaker.core.local.local_session#

Local Mode implementations of the SageMaker session and client classes.

Classes

FileInput(fileUri[, content_type])

Amazon SageMaker channel configuration for FILE data sources, used in local mode.

LocalSagemakerClient([sagemaker_session])

A SageMakerClient that implements the API calls locally.

LocalSagemakerRuntimeClient([config])

A SageMaker Runtime client that calls a local endpoint only.

LocalSession([boto_session, default_bucket, ...])

A SageMaker Session class for Local Mode.

file_input

alias of FileInput

class sagemaker.core.local.local_session.FileInput(fileUri, content_type=None)[source]#

Bases: object

Amazon SageMaker channel configuration for FILE data sources, used in local mode.

class sagemaker.core.local.local_session.LocalSagemakerClient(sagemaker_session=None)[source]#

Bases: object

A SageMakerClient that implements the API calls locally.

Used for local training and for hosting local endpoints. It still needs access to a boto client to interact with S3, but it won’t make any calls to the SageMaker API.

Implements the methods with the same signature as the boto SageMakerClient.


create_endpoint(EndpointName, EndpointConfigName, Tags=None)[source]#

Create the endpoint.

Parameters:
  • EndpointName

  • EndpointConfigName

  • Tags – (Default value = None)

Returns:

create_endpoint_config(EndpointConfigName, ProductionVariants, Tags=None)[source]#

Create the endpoint configuration.

Parameters:
  • EndpointConfigName

  • ProductionVariants

  • Tags – (Default value = None)

Returns:

create_model(ModelName, PrimaryContainer, *args, **kwargs)[source]#

Create a Local Model Object.

Parameters:
  • ModelName (str) – the Model Name

  • PrimaryContainer (dict) – a SageMaker primary container definition

  • *args

  • **kwargs

Returns:
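
Taken together, create_model, create_endpoint_config, and create_endpoint form a local hosting flow. Below is a hedged sketch of the request dicts they expect; the image URI, S3 path, and all names are hypothetical, and the dict shapes follow the boto3 SageMaker API that this client mirrors. The client calls themselves are shown as comments because they require a local Docker daemon.

```python
# Container definition passed as PrimaryContainer to create_model().
primary_container = {
    "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",
    "ModelDataUrl": "s3://my-bucket/model.tar.gz",
    "Environment": {"SAGEMAKER_PROGRAM": "inference.py"},
}

# One production variant; in local mode the instance type is "local".
production_variants = [
    {
        "VariantName": "AllTraffic",
        "ModelName": "my-local-model",
        "InitialInstanceCount": 1,
        "InstanceType": "local",
    }
]

# With a client in hand, the calls would chain like this (not run here):
# client = LocalSagemakerClient()
# client.create_model("my-local-model", primary_container)
# client.create_endpoint_config("my-config", production_variants)
# client.create_endpoint("my-endpoint", "my-config")
```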

create_processing_job(ProcessingJobName, AppSpecification, ProcessingResources, Environment=None, ProcessingInputs=None, ProcessingOutputConfig=None, **kwargs)[source]#

Create a processing job in Local Mode.

Parameters:
  • ProcessingJobName (str) – local processing job name.

  • AppSpecification (dict) – Identifies the container and application to run.

  • ProcessingResources (dict) – Identifies the resources to use for local processing.

  • Environment (dict, optional) – Describes the environment variables to pass to the container. (Default value = None)

  • ProcessingInputs (dict, optional) – Describes the processing input data. (Default value = None)

  • ProcessingOutputConfig (dict, optional) – Describes the processing output configuration. (Default value = None)

  • **kwargs – Keyword arguments

Returns:
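
A hedged sketch of the argument dicts create_processing_job expects. The image URI and paths are hypothetical; the field names follow the SageMaker CreateProcessingJob API that this client mirrors.

```python
# Identifies the container and entrypoint to run locally.
app_specification = {
    "ImageUri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/processor:latest",
    "ContainerEntrypoint": ["python3", "/opt/ml/processing/code/process.py"],
}

# Local processing still takes a cluster config; the type is "local".
processing_resources = {
    "ClusterConfig": {
        "InstanceCount": 1,
        "InstanceType": "local",
        "VolumeSizeInGB": 30,
    }
}

# One input channel mapping an S3 prefix to a container path.
processing_inputs = [
    {
        "InputName": "raw-data",
        "S3Input": {
            "S3Uri": "s3://my-bucket/raw/",
            "LocalPath": "/opt/ml/processing/input",
            "S3DataType": "S3Prefix",
            "S3InputMode": "File",
        },
    }
]

# client.create_processing_job("my-local-processing", app_specification,
#                              processing_resources,
#                              ProcessingInputs=processing_inputs)
```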

create_training_job(TrainingJobName, AlgorithmSpecification, OutputDataConfig, ResourceConfig, InputDataConfig=None, Environment=None, **kwargs)[source]#

Create a training job in Local Mode.

Parameters:
  • TrainingJobName (str) – local training job name.

  • AlgorithmSpecification (dict) – Identifies the training algorithm to use.

  • InputDataConfig (dict, optional) – Describes the training dataset and the location where it is stored. (Default value = None)

  • OutputDataConfig (dict) – Identifies the location where you want to save the results of model training.

  • ResourceConfig (dict) – Identifies the resources to use for local model training.

  • Environment (dict, optional) – Describes the environment variables to pass to the container. (Default value = None)

  • HyperParameters (dict) – Specifies algorithm-specific parameters that influence the quality of the final model.

  • **kwargs

Returns:
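
A hedged sketch of the arguments create_training_job expects. The training image and S3 paths are hypothetical; the field names follow the SageMaker CreateTrainingJob API that this client mirrors.

```python
# Identifies the training algorithm to run.
algorithm_specification = {
    "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/trainer:latest",
    "TrainingInputMode": "File",
}

# Where model artifacts are written when training completes.
output_data_config = {"S3OutputPath": "s3://my-bucket/output/"}

# Local training still takes a resource config; the type is "local".
resource_config = {
    "InstanceCount": 1,
    "InstanceType": "local",
    "VolumeSizeInGB": 30,
}

# One training channel backed by an S3 prefix.
input_data_config = [
    {
        "ChannelName": "training",
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://my-bucket/train/",
            }
        },
    }
]

# client.create_training_job("my-local-training", algorithm_specification,
#                            output_data_config, resource_config,
#                            InputDataConfig=input_data_config)
```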

create_transform_job(TransformJobName, ModelName, TransformInput, TransformOutput, TransformResources, **kwargs)[source]#

Create the transform job.

Parameters:
  • TransformJobName

  • ModelName

  • TransformInput

  • TransformOutput

  • TransformResources

  • **kwargs

Returns:

delete_endpoint(EndpointName)[source]#

Delete the endpoint.

Parameters:

EndpointName

Returns:

delete_endpoint_config(EndpointConfigName)[source]#

Delete the endpoint configuration.

Parameters:

EndpointConfigName

Returns:

delete_model(ModelName)[source]#

Delete the model.

Parameters:

ModelName

Returns:

describe_endpoint(EndpointName)[source]#

Describe the endpoint.

Parameters:

EndpointName

Returns:

describe_endpoint_config(EndpointConfigName)[source]#

Describe the endpoint configuration.

Parameters:

EndpointConfigName

Returns:

describe_model(ModelName)[source]#

Describe the model.

Parameters:

ModelName

Returns:

describe_processing_job(ProcessingJobName)[source]#

Describes a local processing job.

Parameters:

ProcessingJobName (str) – Processing job name to describe.

Returns:

(dict) DescribeProcessingJob response.

describe_training_job(TrainingJobName)[source]#

Describe a local training job.

Parameters:

TrainingJobName (str) – Training job name to describe.

Returns:

(dict) DescribeTrainingJob response.
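
A hedged sketch of reading a describe_training_job response. The field names follow the SageMaker DescribeTrainingJob API; the values shown are illustrative, not produced by a real call.

```python
# Illustrative response shape; a real call would be:
# response = client.describe_training_job("my-local-training")
response = {
    "TrainingJobName": "my-local-training",
    "TrainingJobStatus": "Completed",
    "ModelArtifacts": {"S3ModelArtifacts": "s3://my-bucket/output/model.tar.gz"},
}

# Pull out the artifact location once the job reports completion.
if response["TrainingJobStatus"] == "Completed":
    artifacts = response["ModelArtifacts"]["S3ModelArtifacts"]
```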

describe_transform_job(TransformJobName)[source]#

Describe the transform job.

Parameters:

TransformJobName

Returns:

update_endpoint(EndpointName, EndpointConfigName)[source]#

Update the endpoint.

Parameters:
  • EndpointName

  • EndpointConfigName

Returns:

class sagemaker.core.local.local_session.LocalSagemakerRuntimeClient(config=None)[source]#

Bases: object

A SageMaker Runtime client that calls a local endpoint only.

property config: dict#

Getter for the local configuration.

invoke_endpoint(Body, EndpointName, ContentType=None, Accept=None, CustomAttributes=None, TargetModel=None, TargetVariant=None, InferenceId=None)[source]#

Invoke the endpoint.

Parameters:
  • Body – Input data for which you want the model to provide inference.

  • EndpointName – The name of the endpoint that you specified when you created the endpoint using the CreateEndpoint API.

  • ContentType – The MIME type of the input data in the request body (Default value = None)

  • Accept – The desired MIME type of the inference in the response (Default value = None)

  • CustomAttributes – Provides additional information about a request for an inference submitted to a model hosted at an Amazon SageMaker endpoint (Default value = None)

  • TargetModel – The model to request for inference when invoking a multi-model endpoint (Default value = None)

  • TargetVariant – Specify the production variant to send the inference request to when invoking an endpoint that is running two or more variants (Default value = None)

  • InferenceId – If you provide a value, it is added to the captured data when you enable data capture on the endpoint (Default value = None)

Returns:

Inference for the given input.

Return type:

object
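
A hedged sketch of invoking a local endpoint. The endpoint name is hypothetical; Body is raw bytes and ContentType tells the serving container how to parse it, mirroring the SageMaker Runtime InvokeEndpoint API. The call itself is commented out because it requires a running local endpoint.

```python
# One CSV record as raw bytes, plus the request/response MIME types.
body = b"5.1,3.5,1.4,0.2\n"
content_type = "text/csv"
accept = "application/json"

# runtime = LocalSagemakerRuntimeClient()
# response = runtime.invoke_endpoint(
#     Body=body,
#     EndpointName="my-endpoint",
#     ContentType=content_type,
#     Accept=accept,
# )
# prediction = response["Body"].read()
```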

class sagemaker.core.local.local_session.LocalSession(boto_session=None, default_bucket=None, s3_endpoint_url=None, disable_local_code=False, sagemaker_config: dict | None = None, default_bucket_prefix=None)[source]#

Bases: Session

A SageMaker Session class for Local Mode.

This class provides alternative Local Mode implementations for the functionality of Session.

property config: Dict | None#

The config for Local Mode; unused in a normal session.

logs_for_job(job_name, wait=False, poll=5, log_type='All')[source]#

A no-op method meant to override the sagemaker client.

Parameters:
  • job_name

  • wait – (Default value = False)

  • poll – (Default value = 5)

  • log_type – (Default value = 'All')

Returns:

logs_for_processing_job(job_name, wait=False, poll=10)[source]#

A no-op method meant to override the sagemaker client.

Parameters:
  • job_name

  • wait – (Default value = False)

  • poll – (Default value = 10)

Returns:

sagemaker.core.local.local_session.file_input#

alias of FileInput