Google Cloud Native is in preview. Google Cloud Classic is fully supported.
Google Cloud Native v0.32.0 published on Wednesday, Nov 29, 2023 by Pulumi
google-native.dataplex/v1.getTask
Get task resource.
Using getTask
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getTask(args: GetTaskArgs, opts?: InvokeOptions): Promise<GetTaskResult>
function getTaskOutput(args: GetTaskOutputArgs, opts?: InvokeOptions): Output<GetTaskResult>
def get_task(lake_id: Optional[str] = None,
location: Optional[str] = None,
project: Optional[str] = None,
task_id: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetTaskResult
def get_task_output(lake_id: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
project: Optional[pulumi.Input[str]] = None,
task_id: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetTaskResult]
func LookupTask(ctx *Context, args *LookupTaskArgs, opts ...InvokeOption) (*LookupTaskResult, error)
func LookupTaskOutput(ctx *Context, args *LookupTaskOutputArgs, opts ...InvokeOption) LookupTaskResultOutput
> Note: This function is named LookupTask in the Go SDK.
public static class GetTask
{
public static Task<GetTaskResult> InvokeAsync(GetTaskArgs args, InvokeOptions? opts = null)
public static Output<GetTaskResult> Invoke(GetTaskInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetTaskResult> getTask(GetTaskArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
function: google-native:dataplex/v1:getTask
arguments:
# arguments dictionary
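Filled in, a YAML invocation might look like the following sketch; the lake, location, project, and task names are placeholders for illustration, not values taken from this page:

```yaml
variables:
  myTask:
    fn::invoke:
      function: google-native:dataplex/v1:getTask
      arguments:
        lakeId: my-lake
        location: us-central1
        project: my-project
        taskId: my-task
```

The resulting variable can then be referenced elsewhere in the program, for example `${myTask.state}`.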
The following arguments are supported (names shown in their TypeScript form; see the signatures above for each language's naming convention):
- lakeId string
- location string
- project string
- taskId string
getTask Result
The following output properties are available:
- CreateTime string - The time when the task was created.
- Description string - Optional. Description of the task.
- DisplayName string - Optional. User friendly display name.
- ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- ExecutionStatus Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskExecutionStatusResponse - Status of the latest task executions.
- Labels Dictionary<string, string> - Optional. User-defined labels for the task.
- Name string - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- Notebook Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskNotebookTaskConfigResponse - Config related to running scheduled Notebooks.
- Spark Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskSparkTaskConfigResponse - Config related to running custom Spark tasks.
- State string - Current state of the task.
- TriggerSpec Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskTriggerSpecResponse - Spec related to how often and when a task should be triggered.
- Uid string - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- UpdateTime string - The time when the task was last updated.
- CreateTime string - The time when the task was created.
- Description string - Optional. Description of the task.
- DisplayName string - Optional. User friendly display name.
- ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- ExecutionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse - Status of the latest task executions.
- Labels map[string]string - Optional. User-defined labels for the task.
- Name string - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- Notebook GoogleCloudDataplexV1TaskNotebookTaskConfigResponse - Config related to running scheduled Notebooks.
- Spark GoogleCloudDataplexV1TaskSparkTaskConfigResponse - Config related to running custom Spark tasks.
- State string - Current state of the task.
- TriggerSpec GoogleCloudDataplexV1TaskTriggerSpecResponse - Spec related to how often and when a task should be triggered.
- Uid string - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- UpdateTime string - The time when the task was last updated.
- createTime String - The time when the task was created.
- description String - Optional. Description of the task.
- displayName String - Optional. User friendly display name.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse - Status of the latest task executions.
- labels Map<String,String> - Optional. User-defined labels for the task.
- name String - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfigResponse - Config related to running scheduled Notebooks.
- spark GoogleCloudDataplexV1TaskSparkTaskConfigResponse - Config related to running custom Spark tasks.
- state String - Current state of the task.
- triggerSpec GoogleCloudDataplexV1TaskTriggerSpecResponse - Spec related to how often and when a task should be triggered.
- uid String - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime String - The time when the task was last updated.
- createTime string - The time when the task was created.
- description string - Optional. Description of the task.
- displayName string - Optional. User friendly display name.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse - Status of the latest task executions.
- labels {[key: string]: string} - Optional. User-defined labels for the task.
- name string - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfigResponse - Config related to running scheduled Notebooks.
- spark GoogleCloudDataplexV1TaskSparkTaskConfigResponse - Config related to running custom Spark tasks.
- state string - Current state of the task.
- triggerSpec GoogleCloudDataplexV1TaskTriggerSpecResponse - Spec related to how often and when a task should be triggered.
- uid string - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime string - The time when the task was last updated.
- create_time str - The time when the task was created.
- description str - Optional. Description of the task.
- display_name str - Optional. User friendly display name.
- execution_spec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- execution_status GoogleCloudDataplexV1TaskExecutionStatusResponse - Status of the latest task executions.
- labels Mapping[str, str] - Optional. User-defined labels for the task.
- name str - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- notebook GoogleCloudDataplexV1TaskNotebookTaskConfigResponse - Config related to running scheduled Notebooks.
- spark GoogleCloudDataplexV1TaskSparkTaskConfigResponse - Config related to running custom Spark tasks.
- state str - Current state of the task.
- trigger_spec GoogleCloudDataplexV1TaskTriggerSpecResponse - Spec related to how often and when a task should be triggered.
- uid str - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- update_time str - The time when the task was last updated.
- createTime String - The time when the task was created.
- description String - Optional. Description of the task.
- displayName String - Optional. User friendly display name.
- executionSpec Property Map - Spec related to how a task is executed.
- executionStatus Property Map - Status of the latest task executions.
- labels Map<String> - Optional. User-defined labels for the task.
- name String - The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.
- notebook Property Map - Config related to running scheduled Notebooks.
- spark Property Map - Config related to running custom Spark tasks.
- state String - Current state of the task.
- triggerSpec Property Map - Spec related to how often and when a task should be triggered.
- uid String - System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.
- updateTime String - The time when the task was last updated.
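The task's relative resource name encodes the project, location, lake, and task IDs. As an illustrative sketch (not part of the SDK), the individual IDs can be recovered with ordinary string handling:

```python
def parse_task_name(name: str) -> dict:
    """Split a Dataplex task resource name of the form
    projects/{p}/locations/{l}/lakes/{lk}/tasks/{t} into its IDs."""
    parts = name.split("/")
    # Expect alternating collection/id segments: projects, locations, lakes, tasks.
    if len(parts) != 8 or parts[::2] != ["projects", "locations", "lakes", "tasks"]:
        raise ValueError(f"unexpected task name: {name!r}")
    keys = ["project", "location", "lake_id", "task_id"]
    return dict(zip(keys, parts[1::2]))

print(parse_task_name(
    "projects/1234/locations/us-central1/lakes/my-lake/tasks/my-task"))
# → {'project': '1234', 'location': 'us-central1', 'lake_id': 'my-lake', 'task_id': 'my-task'}
```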
Supporting Types
GoogleCloudDataplexV1JobResponse
- EndTime string - The time when the job ended.
- ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- Labels Dictionary<string, string> - User-defined labels for the task.
- Message string - Additional information about the current state.
- Name string - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- RetryCount int - The number of times the job has been retried (excluding the initial attempt).
- Service string - The underlying service running a job.
- ServiceJob string - The full resource name for the job run under a particular service.
- StartTime string - The time when the job was started.
- State string - Execution state for the job.
- Trigger string - Job execution trigger.
- Uid string - System generated globally unique ID for the job.
- EndTime string - The time when the job ended.
- ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- Labels map[string]string - User-defined labels for the task.
- Message string - Additional information about the current state.
- Name string - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- RetryCount int - The number of times the job has been retried (excluding the initial attempt).
- Service string - The underlying service running a job.
- ServiceJob string - The full resource name for the job run under a particular service.
- StartTime string - The time when the job was started.
- State string - Execution state for the job.
- Trigger string - Job execution trigger.
- Uid string - System generated globally unique ID for the job.
- endTime String - The time when the job ended.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- labels Map<String,String> - User-defined labels for the task.
- message String - Additional information about the current state.
- name String - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount Integer - The number of times the job has been retried (excluding the initial attempt).
- service String - The underlying service running a job.
- serviceJob String - The full resource name for the job run under a particular service.
- startTime String - The time when the job was started.
- state String - Execution state for the job.
- trigger String - Job execution trigger.
- uid String - System generated globally unique ID for the job.
- endTime string - The time when the job ended.
- executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- labels {[key: string]: string} - User-defined labels for the task.
- message string - Additional information about the current state.
- name string - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount number - The number of times the job has been retried (excluding the initial attempt).
- service string - The underlying service running a job.
- serviceJob string - The full resource name for the job run under a particular service.
- startTime string - The time when the job was started.
- state string - Execution state for the job.
- trigger string - Job execution trigger.
- uid string - System generated globally unique ID for the job.
- end_time str - The time when the job ended.
- execution_spec GoogleCloudDataplexV1TaskExecutionSpecResponse - Spec related to how a task is executed.
- labels Mapping[str, str] - User-defined labels for the task.
- message str - Additional information about the current state.
- name str - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retry_count int - The number of times the job has been retried (excluding the initial attempt).
- service str - The underlying service running a job.
- service_job str - The full resource name for the job run under a particular service.
- start_time str - The time when the job was started.
- state str - Execution state for the job.
- trigger str - Job execution trigger.
- uid str - System generated globally unique ID for the job.
- endTime String - The time when the job ended.
- executionSpec Property Map - Spec related to how a task is executed.
- labels Map<String> - User-defined labels for the task.
- message String - Additional information about the current state.
- name String - The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.
- retryCount Number - The number of times the job has been retried (excluding the initial attempt).
- service String - The underlying service running a job.
- serviceJob String - The full resource name for the job run under a particular service.
- startTime String - The time when the job was started.
- state String - Execution state for the job.
- trigger String - Job execution trigger.
- uid String - System generated globally unique ID for the job.
GoogleCloudDataplexV1TaskExecutionSpecResponse
- Args Dictionary<string, string> - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- KmsKey string - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- MaxJobExecutionLifetime string - Optional. The maximum duration after which the job execution is expired.
- Project string - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- ServiceAccount string - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- Args map[string]string - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- KmsKey string - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- MaxJobExecutionLifetime string - Optional. The maximum duration after which the job execution is expired.
- Project string - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- ServiceAccount string - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String,String> - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- kmsKey String - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String - Optional. The maximum duration after which the job execution is expired.
- project String - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args {[key: string]: string} - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- kmsKey string - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime string - Optional. The maximum duration after which the job execution is expired.
- project string - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount string - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Mapping[str, str] - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- kms_key str - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- max_job_execution_lifetime str - Optional. The maximum duration after which the job execution is expired.
- project str - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- service_account str - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
- args Map<String> - Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.
- kmsKey String - Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.
- maxJobExecutionLifetime String - Optional. The maximum duration after which the job execution is expired.
- project String - Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
- serviceAccount String - Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
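The placeholder rules above can be illustrated with a small sketch. This mirrors the documented behavior (interpolate `${task_id}` and `${job_time}`, pass `TASK_ARGS` last) but is not the service's actual implementation; in particular, the `--key=value` flag shape is an assumption made for the example:

```python
def render_args(args: dict, task_id: str, job_time: str) -> list:
    """Illustrative sketch of the documented arg handling: interpolate
    ${task_id}/${job_time} placeholders and pass TASK_ARGS last."""
    def interpolate(value: str) -> str:
        return value.replace("${task_id}", task_id).replace("${job_time}", job_time)

    rendered = []
    for key, value in args.items():
        if key == "TASK_ARGS":
            continue  # deferred so the positional args come last
        # Hypothetical --key=value shape, chosen only for illustration.
        rendered.append(f"--{key}={interpolate(value)}")
    if "TASK_ARGS" in args:
        # Comma-separated positional arguments, appended after all other keys.
        rendered.extend(interpolate(args["TASK_ARGS"]).split(","))
    return rendered

print(render_args({"mode": "daily", "TASK_ARGS": "${task_id},run"},
                  task_id="my-task", job_time="2023-11-29T00:00:00Z"))
# → ['--mode=daily', 'my-task', 'run']
```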
GoogleCloudDataplexV1TaskExecutionStatusResponse
- LatestJob Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1JobResponse - Latest job execution.
- UpdateTime string - Last update time of the status.
- LatestJob GoogleCloudDataplexV1JobResponse - Latest job execution.
- UpdateTime string - Last update time of the status.
- latestJob GoogleCloudDataplexV1JobResponse - Latest job execution.
- updateTime String - Last update time of the status.
- latestJob GoogleCloudDataplexV1JobResponse - Latest job execution.
- updateTime string - Last update time of the status.
- latest_job GoogleCloudDataplexV1JobResponse - Latest job execution.
- update_time str - Last update time of the status.
- latestJob Property Map - Latest job execution.
- updateTime String - Last update time of the status.
GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse
- ExecutorsCount int - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- ExecutorsCount int - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- MaxExecutorsCount int - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Integer - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Integer - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount number - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount number - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executors_count int - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- max_executors_count int - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
- executorsCount Number - Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2
- maxExecutorsCount Number - Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
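The documented bounds (executors between 2 and 100, max executors between 2 and 1000, auto-scaling when `max_executors_count > executors_count`) can be expressed as a small validation sketch; the function name and the raise-on-violation behavior are illustrative, not part of the SDK:

```python
def validate_batch_resources(executors_count: int = 2,
                             max_executors_count: int = 1000) -> bool:
    """Check the documented ranges and report whether auto-scaling applies."""
    if not 2 <= executors_count <= 100:
        raise ValueError("executors_count must be between 2 and 100")
    if not 2 <= max_executors_count <= 1000:
        raise ValueError("max_executors_count must be between 2 and 1000")
    # Auto-scaling is enabled when max_executors_count exceeds executors_count.
    return max_executors_count > executors_count

print(validate_batch_resources(executors_count=4, max_executors_count=40))  # → True
```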
GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse
- Image string - Optional. Container image to use.
- JavaJars List<string> - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties Dictionary<string, string> - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages List<string> - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
- Image string - Optional. Container image to use.
- JavaJars []string - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- Properties map[string]string - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- PythonPackages []string - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String - Optional. Container image to use.
- javaJars List<String> - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String,String> - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String> - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image string - Optional. Container image to use.
- javaJars string[] - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties {[key: string]: string} - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages string[] - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image str - Optional. Container image to use.
- java_jars Sequence[str] - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Mapping[str, str] - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- python_packages Sequence[str] - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
- image String - Optional. Container image to use.
- javaJars List<String> - Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar
- properties Map<String> - Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).
- pythonPackages List<String> - Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
GoogleCloudDataplexV1TaskInfrastructureSpecResponse
- Batch Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse - Compute resources needed for a Task when using Dataproc Serverless.
- ContainerImage Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse - Container Image Runtime Configuration.
- VpcNetwork Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse - Vpc network.
- Batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse - Compute resources needed for a Task when using Dataproc Serverless.
- ContainerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse - Container Image Runtime Configuration.
- VpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse - Vpc network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse - Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse - Container Image Runtime Configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse - Vpc network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse - Compute resources needed for a Task when using Dataproc Serverless.
- containerImage GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse - Container Image Runtime Configuration.
- vpcNetwork GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse - Vpc network.
- batch GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse - Compute resources needed for a Task when using Dataproc Serverless.
- container_image GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse - Container Image Runtime Configuration.
- vpc_network GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse - Vpc network.
- batch Property Map - Compute resources needed for a Task when using Dataproc Serverless.
- containerImage Property Map - Container Image Runtime Configuration.
- vpcNetwork Property Map - Vpc network.
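The nesting above can be navigated once a task has been fetched; a hedged sketch using a plain dict standing in for GoogleCloudDataplexV1TaskInfrastructureSpecResponse (the field values below are illustrative, not real output):

```python
# Illustrative shape of an infrastructure spec as returned by getTask;
# the values here are made up for the example.
infrastructure_spec = {
    "batch": {"executorsCount": 2, "maxExecutorsCount": 10},
    "containerImage": {"image": "", "javaJars": [], "properties": {}},
    "vpcNetwork": {"network": "default", "networkTags": []},
}

# Each sub-object is optional in practice, so read defensively.
batch = infrastructure_spec.get("batch") or {}
print(batch.get("executorsCount", 0))  # 2
```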
GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse
- Network string - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags List<string> - Optional. List of network tags to apply to the job.
- SubNetwork string - Optional. The Cloud VPC sub-network in which the job is run.
- Network string - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- NetworkTags []string - Optional. List of network tags to apply to the job.
- SubNetwork string - Optional. The Cloud VPC sub-network in which the job is run.
- network String - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String> - Optional. List of network tags to apply to the job.
- subNetwork String - Optional. The Cloud VPC sub-network in which the job is run.
- network string - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags string[] - Optional. List of network tags to apply to the job.
- subNetwork string - Optional. The Cloud VPC sub-network in which the job is run.
- network str - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- network_tags Sequence[str] - Optional. List of network tags to apply to the job.
- sub_network str - Optional. The Cloud VPC sub-network in which the job is run.
- network String - Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.
- networkTags List<String> - Optional. List of network tags to apply to the job.
- subNetwork String - Optional. The Cloud VPC sub-network in which the job is run.
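As described above, the job falls back to the VPC named default when no network is configured; a tiny illustrative helper (the function is hypothetical, not part of the SDK):

```python
def effective_network(vpc_network: dict) -> str:
    """Return the network a job would run in, applying the documented
    fallback to the project's 'default' VPC when none is configured."""
    return vpc_network.get("network") or "default"

print(effective_network({}))                     # default
print(effective_network({"network": "my-vpc"}))  # my-vpc
```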
GoogleCloudDataplexV1TaskNotebookTaskConfigResponse
- ArchiveUris List<string> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- Notebook string - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- ArchiveUris []string - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- Notebook string - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- notebook String - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris string[] - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[] - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- notebook string - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archive_uris Sequence[str] - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str] - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- notebook str - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
- archiveUris List<String> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map - Optional. Infrastructure specification for the execution.
- notebook String - Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).
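For notebook tasks, the docs above note that execution args surface as TASK_key=value environment variables; a sketch of that mapping (the helper name is hypothetical):

```python
def task_env(execution_args: dict) -> dict:
    """Mirror how notebook execution args are exposed inside the driver:
    each arg key becomes an environment variable named TASK_<key>."""
    return {f"TASK_{key}": value for key, value in execution_args.items()}

print(task_env({"output_table": "stats.daily"}))
# {'TASK_output_table': 'stats.daily'}
```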
GoogleCloudDataplexV1TaskSparkTaskConfigResponse
- ArchiveUris List<string> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris List<string> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- MainClass string - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string - The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- ArchiveUris []string - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- FileUris []string - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- MainClass string - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- MainJarFileUri string - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- PythonScriptFile string - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- SqlScript string - The query text. The execution args are used to declare a set of script variables (set key="value";).
- SqlScriptFile string - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- mainClass String - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String - The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris string[] - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris string[] - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- mainClass string - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri string - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile string - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript string - The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile string - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archive_uris Sequence[str] - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- file_uris Sequence[str] - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse - Optional. Infrastructure specification for the execution.
- main_class str - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- main_jar_file_uri str - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- python_script_file str - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sql_script str - The query text. The execution args are used to declare a set of script variables (set key="value";).
- sql_script_file str - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
- archiveUris List<String> - Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
- fileUris List<String> - Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.
- infrastructureSpec Property Map - Optional. Infrastructure specification for the execution.
- mainClass String - The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).
- mainJarFileUri String - The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).
- pythonScriptFile String - The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).
- sqlScript String - The query text. The execution args are used to declare a set of script variables (set key="value";).
- sqlScriptFile String - A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
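For Spark tasks, the docs above say execution args are handed to the driver as --key=value named process arguments; a hedged sketch of that rendering (the helper name is hypothetical):

```python
def spark_args(execution_args: dict) -> list:
    """Render execution args the way the Spark task docs describe:
    a sequence of named process arguments of the form --key=value."""
    return [f"--{key}={value}" for key, value in execution_args.items()]

print(spark_args({"input": "gs://bucket/raw", "dry_run": "true"}))
# ['--input=gs://bucket/raw', '--dry_run=true']
```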
GoogleCloudDataplexV1TaskTriggerSpecResponse
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- MaxRetries int - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- StartTime string - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- Type string
- Immutable. Trigger type of the user-specified Task.
- Disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- MaxRetries int - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- Schedule string - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- StartTime string - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- Type string
- Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries Integer - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type String
- Immutable. Trigger type of the user-specified Task.
- disabled boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries number - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule string - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime string - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type string
- Immutable. Trigger type of the user-specified Task.
- disabled bool
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- max_retries int - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule str - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- start_time str - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type str
- Immutable. Trigger type of the user-specified Task.
- disabled Boolean
- Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.
- maxRetries Number - Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.
- schedule String - Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} must be a valid string from the IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.
- startTime String - Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
- type String
- Immutable. Trigger type of the user-specified Task.
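The CRON_TZ=/TZ= prefix on schedule described above can be peeled off before handing the remainder to a cron parser; an illustrative helper (not part of the SDK):

```python
def split_schedule(schedule: str) -> tuple:
    """Split an optional 'CRON_TZ=<zone>' or 'TZ=<zone>' prefix off a
    Dataplex trigger schedule, returning (time_zone_or_None, cron_expr)."""
    for prefix in ("CRON_TZ=", "TZ="):
        if schedule.startswith(prefix):
            zone, _, cron = schedule[len(prefix):].partition(" ")
            return zone, cron
    return None, schedule

print(split_schedule("CRON_TZ=America/New_York 1 * * * *"))
# ('America/New_York', '1 * * * *')
```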
Package Details
- Repository
- Google Cloud Native pulumi/pulumi-google-native
- License
- Apache-2.0