AWS Native is in preview. AWS Classic is fully supported.
aws-native.sagemaker.getInferenceExperiment
Resource Type definition for AWS::SageMaker::InferenceExperiment
Using getInferenceExperiment
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getInferenceExperiment(args: GetInferenceExperimentArgs, opts?: InvokeOptions): Promise<GetInferenceExperimentResult>
function getInferenceExperimentOutput(args: GetInferenceExperimentOutputArgs, opts?: InvokeOptions): Output<GetInferenceExperimentResult>
def get_inference_experiment(name: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetInferenceExperimentResult
def get_inference_experiment_output(name: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetInferenceExperimentResult]
func LookupInferenceExperiment(ctx *Context, args *LookupInferenceExperimentArgs, opts ...InvokeOption) (*LookupInferenceExperimentResult, error)
func LookupInferenceExperimentOutput(ctx *Context, args *LookupInferenceExperimentOutputArgs, opts ...InvokeOption) LookupInferenceExperimentResultOutput
> Note: This function is named LookupInferenceExperiment in the Go SDK.
public static class GetInferenceExperiment
{
public static Task<GetInferenceExperimentResult> InvokeAsync(GetInferenceExperimentArgs args, InvokeOptions? opts = null)
public static Output<GetInferenceExperimentResult> Invoke(GetInferenceExperimentInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetInferenceExperimentResult> getInferenceExperiment(GetInferenceExperimentArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
  function: aws-native:sagemaker:getInferenceExperiment
  arguments:
    # arguments dictionary
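For example, a minimal TypeScript sketch that uses both forms; the experiment name my-experiment is a placeholder for an existing inference experiment in your account:

import * as aws_native from "@pulumi/aws-native";

// Direct form: returns a Promise-wrapped result.
// "my-experiment" is a placeholder name.
const experiment = aws_native.sagemaker.getInferenceExperiment({
    name: "my-experiment",
});

// Output form: accepts Input-wrapped arguments and returns an Output-wrapped result,
// which is convenient when the name comes from another resource.
const experimentOutput = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment",
});

// Export a property from the direct form's Promise.
export const experimentArn = experiment.then(e => e.arn);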
The following arguments are supported:
- name string
- The name for the inference experiment. (The argument is spelled Name in the .NET and Go SDKs and name in the other SDKs.)
getInferenceExperiment Result
The following output properties are available. Property names are listed once below in the provider schema's camelCase form; each SDK exposes the same properties with its own casing and collection types (for example, List<T> in .NET and Java, []T in Go, Sequence[T] in Python, T[] in TypeScript, and Property Map values in YAML). In .NET, the complex property types live under the Pulumi.AwsNative.SageMaker.Outputs namespace.
- arn string
- The Amazon Resource Name (ARN) of the inference experiment.
- creationTime string
- The timestamp at which you created the inference experiment.
- dataStorageConfig InferenceExperimentDataStorageConfig
- The Amazon S3 location and configuration for storing inference request and response data.
- description string
- The description of the inference experiment.
- desiredState InferenceExperimentDesiredState
- The desired state of the experiment after the starting or stopping operation.
- endpointMetadata InferenceExperimentEndpointMetadata
- lastModifiedTime string
- The timestamp at which you last modified the inference experiment.
- modelVariants InferenceExperimentModelVariantConfig[]
- An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
- schedule InferenceExperimentSchedule
- The duration for which the inference experiment ran or will run. The maximum duration that you can set for an inference experiment is 30 days.
- shadowModeConfig InferenceExperimentShadowModeConfig
- The configuration of the ShadowMode inference experiment type, which shows the production variant that takes all the inference requests and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
- status InferenceExperimentStatus
- The status of the inference experiment.
- statusReason string
- The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
- tags Tag[]
- An array of key-value pairs to apply to this resource.
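As a sketch of consuming these properties (the experiment name is again a placeholder), the output form lifts each result field to a Pulumi Output that can be exported or combined:

import * as aws_native from "@pulumi/aws-native";

const result = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment", // placeholder name
});

// Each output property is available as an Output on the result.
export const experimentStatus = result.status;
export const lastModified = result.lastModifiedTime;
export const statusReason = result.statusReason;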
Supporting Types
InferenceExperimentCaptureContentTypeHeader
- csvContentTypes string[]
- The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
- jsonContentTypes string[]
- The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
InferenceExperimentDataStorageConfig
- destination string
- The Amazon S3 bucket where the inference request and response data is stored.
- contentType InferenceExperimentCaptureContentTypeHeader
- Configuration specifying how to treat different headers. If no headers are specified, SageMaker will by default base64 encode when capturing the data.
- kmsKey string
- The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
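A brief sketch of reading this configuration from a lookup result, using the property names documented above and a placeholder experiment name:

import * as aws_native from "@pulumi/aws-native";

const result = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment", // placeholder name
});

// Where captured request/response data is stored, and which headers are treated as CSV.
export const captureDestination = result.dataStorageConfig.apply(c => c?.destination);
export const csvHeaders = result.dataStorageConfig.apply(c => c?.contentType?.csvContentTypes);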
InferenceExperimentDesiredState
- Possible values: Running, Completed, Cancelled.
InferenceExperimentEndpointMetadata
- endpointName string
- The name of the endpoint.
- endpointConfigName string
- The name of the endpoint configuration.
- endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
- The status of the endpoint. For possible values, see InferenceExperimentEndpointMetadataEndpointStatus below.
InferenceExperimentEndpointMetadataEndpointStatus
- Possible values: Creating, Updating, SystemUpdating, RollingBack, InService, OutOfService, Deleting, Failed.
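For illustration, a sketch that surfaces the endpoint backing the experiment and checks whether it is in service (placeholder experiment name; the InService value is taken from the status list above):

import * as aws_native from "@pulumi/aws-native";

const result = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment", // placeholder name
});

// The endpoint that hosts the experiment, and whether it is currently serving traffic.
export const endpointName = result.endpointMetadata.apply(m => m?.endpointName);
export const endpointInService = result.endpointMetadata.apply(m => m?.endpointStatus === "InService");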
InferenceExperimentModelInfrastructureConfig
- infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
- The type of the inference experiment that you want to run.
- realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
- The infrastructure configuration for deploying the model to real-time inference.
InferenceExperimentModelInfrastructureConfigInfrastructureType
- Possible values: RealTimeInference.
InferenceExperimentModelVariantConfig
- infrastructureConfig InferenceExperimentModelInfrastructureConfig
- The configuration for the infrastructure that the model will be deployed to.
- modelName string
- The name of the Amazon SageMaker Model entity.
- variantName string
- The name of the variant.
InferenceExperimentRealTimeInferenceConfig
- instanceCount int
- The number of instances of the type specified by InstanceType.
- instanceType string
- The instance type the model is deployed to.
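As a sketch, the variant, infrastructure, and real-time inference types compose as follows when reading a lookup result (placeholder experiment name):

import * as aws_native from "@pulumi/aws-native";

const result = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment", // placeholder name
});

// Walk ModelVariantConfig -> ModelInfrastructureConfig -> RealTimeInferenceConfig
// to summarize each variant's instance type and count.
export const variantInstances = result.modelVariants.apply(variants =>
    (variants ?? []).map(v => ({
        variant: v.variantName,
        model: v.modelName,
        instanceType: v.infrastructureConfig?.realTimeInferenceConfig?.instanceType,
        instanceCount: v.infrastructureConfig?.realTimeInferenceConfig?.instanceCount,
    })));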
InferenceExperimentSchedule
- endTime string
- The timestamp at which the inference experiment ended or will end.
- startTime string
- The timestamp at which the inference experiment started or will start.
InferenceExperimentShadowModeConfig
- shadowModelVariants InferenceExperimentShadowModelVariantConfig[]
- List of shadow variant configurations.
- sourceModelVariantName string
- The name of the production variant, which takes all the inference requests.
InferenceExperimentShadowModelVariantConfig
- samplingPercentage int
- The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
- shadowModelVariantName string
- The name of the shadow variant.
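A final sketch that summarizes the shadow mode configuration of a looked-up experiment (placeholder experiment name):

import * as aws_native from "@pulumi/aws-native";

const result = aws_native.sagemaker.getInferenceExperimentOutput({
    name: "my-experiment", // placeholder name
});

// Which production variant is being shadowed, and at what sampling percentage.
export const shadowSummary = result.shadowModeConfig.apply(cfg =>
    cfg === undefined
        ? "shadow mode not configured"
        : (cfg.shadowModelVariants ?? []).map(v =>
              `${cfg.sourceModelVariantName} -> ${v.shadowModelVariantName} (${v.samplingPercentage}%)`));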
InferenceExperimentStatus
- Possible values: Creating, Created, Updating, Starting, Stopping, Running, Completed, Cancelled.
Tag
- key string
- The key name of the tag.
- value string
- The value for the tag.
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0