DataSphere Jobs API v2, gRPC: ProjectJobService.Execute
- gRPC request
- ExecuteProjectJobRequest
- operation.Operation
- ExecuteProjectJobMetadata
- Job
- JobParameters
- File
- FileDesc
- Environment
- DockerImageSpec
- PythonEnv
- PipOptions
- CloudInstanceType
- ExtendedWorkingStorage
- Argument
- OutputDatasetDesc
- GracefulShutdownParameters
- SparkParameters
- FileUploadError
- OutputDataset
- JobProgress
- JobMetadata
- ExecuteProjectJobResponse
- StorageFile
- JobResult
Runs job execution.
gRPC request
rpc Execute (ExecuteProjectJobRequest) returns (operation.Operation)
ExecuteProjectJobRequest
{
"job_id": "string"
}
| Field | Description |
| --- | --- |
| job_id | string. ID of the job. |
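A minimal sketch of issuing this request from Python with the yandexcloud SDK is shown below. The generated-stub module paths and the authentication argument are assumptions and may differ between SDK versions; check them against the package you have installed.
# Sketch: execute an existing DataSphere job by ID with the yandexcloud SDK.
# Module paths of the generated stubs and the auth keyword are assumptions.
import yandexcloud
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2 as job_service
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2_grpc as job_service_grpc

sdk = yandexcloud.SDK(iam_token="<IAM_TOKEN>")
client = sdk.client(job_service_grpc.ProjectJobServiceStub)

# Execute takes only the ID of a previously created job and returns a
# long-running operation.Operation.
operation = client.Execute(
    job_service.ExecuteProjectJobRequest(job_id="<JOB_ID>")
)
print(operation.id, operation.done)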
operation.Operation
{
"id": "string",
"description": "string",
"created_at": "google.protobuf.Timestamp",
"created_by": "string",
"modified_at": "google.protobuf.Timestamp",
"done": "bool",
"metadata": {
"job": {
"id": "string",
"name": "string",
"desc": "string",
"created_at": "google.protobuf.Timestamp",
"finished_at": "google.protobuf.Timestamp",
"status": "JobStatus",
"config": "string",
"created_by_id": "string",
"project_id": "string",
"job_parameters": {
"input_files": [
{
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
}
],
"output_files": [
{
"path": "string",
"var": "string"
}
],
"s3_mount_ids": [
"string"
],
"dataset_ids": [
"string"
],
"cmd": "string",
"env": {
"vars": "map<string, string>",
// Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`
"docker_image_resource_id": "string",
"docker_image_spec": {
"image_url": "string",
"username": "string",
// Includes only one of the fields `password_plain_text`, `password_ds_secret_name`
"password_plain_text": "string",
"password_ds_secret_name": "string"
// end of the list of possible fields
},
// end of the list of possible fields
"python_env": {
"conda_yaml": "string",
"local_modules": [
{
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
}
],
"python_version": "string",
"requirements": [
"string"
],
"pip_options": {
"index_url": "string",
"extra_index_urls": [
"string"
],
"trusted_hosts": [
"string"
],
"no_deps": "bool"
}
}
},
"attach_project_disk": "bool",
"cloud_instance_types": [
{
"name": "string"
}
],
"extended_working_storage": {
"type": "StorageType",
"size_gb": "int64"
},
"arguments": [
{
"name": "string",
"value": "string"
}
],
"output_datasets": [
{
"name": "string",
"description": "string",
"labels": "map<string, string>",
"size_gb": "int64",
"var": "string"
}
],
"graceful_shutdown_parameters": {
"timeout": "google.protobuf.Duration",
"signal": "int64"
},
"spark_parameters": {
"connector_id": "string"
}
},
"data_expires_at": "google.protobuf.Timestamp",
"data_cleared": "bool",
"output_files": [
{
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
}
],
"log_files": [
{
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
}
],
"diagnostic_files": [
{
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
}
],
"data_size_bytes": "int64",
"started_at": "google.protobuf.Timestamp",
"status_details": "string",
"actual_cloud_instance_type": {
"name": "string"
},
"parent_job_id": "string",
"file_errors": [
{
// Includes only one of the fields `output_file_desc`, `log_file_name`
"output_file_desc": {
"path": "string",
"var": "string"
},
"log_file_name": "string",
// end of the list of possible fields
"description": "string",
"type": "ErrorType"
}
],
"output_datasets": [
{
"desc": {
"name": "string",
"description": "string",
"labels": "map<string, string>",
"size_gb": "int64",
"var": "string"
},
"id": "string"
}
]
},
"progress": {
"message": "string",
"progress": "int64",
"create_time": "google.protobuf.Timestamp"
},
"metadata": {
"id": "string",
"name": "string",
"description": "string",
"created_at": "google.protobuf.Timestamp",
"started_at": "google.protobuf.Timestamp",
"finished_at": "google.protobuf.Timestamp",
"data_expires_at": "google.protobuf.Timestamp",
"status": "JobStatus",
"status_details": "string",
"created_by_id": "string",
"project_id": "string",
"parent_job_id": "string"
}
},
// Includes only one of the fields `error`, `response`
"error": "google.rpc.Status",
"response": {
"output_files": [
{
"file": {
"desc": {
"path": "string",
"var": "string"
},
"sha256": "string",
"size_bytes": "int64",
"compression_type": "FileCompressionType"
},
"url": "string"
}
],
"output_files_errors": [
{
// Includes only one of the fields `output_file_desc`, `log_file_name`
"output_file_desc": {
"path": "string",
"var": "string"
},
"log_file_name": "string",
// end of the list of possible fields
"description": "string",
"type": "ErrorType"
}
],
"output_datasets": [
{
"desc": {
"name": "string",
"description": "string",
"labels": "map<string, string>",
"size_gb": "int64",
"var": "string"
},
"id": "string"
}
],
"result": {
"return_code": "int64"
}
}
// end of the list of possible fields
}
An Operation resource. For more information, see Operation.
| Field | Description |
| --- | --- |
| id | string. ID of the operation. |
| description | string. Description of the operation. 0-256 characters long. |
| created_at | Creation timestamp. |
| created_by | string. ID of the user or service account who initiated the operation. |
| modified_at | The time when the Operation resource was last modified. |
| done | bool. If the value is false, the operation is still in progress. If true, the operation is completed, and either error or response is set. |
| metadata | Service-specific metadata associated with the operation (ExecuteProjectJobMetadata). |
| error | The error result of the operation in case of failure or cancellation. Includes only one of the fields error, response: the operation result. |
| response | The normal response of the operation in case of success. Includes only one of the fields error, response: the operation result. |
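Continuing the sketch above, the returned operation can be waited on and its typed metadata and response unpacked. The wait_operation_and_get_result helper and the attribute names on its result are assumptions about the yandexcloud SDK.
# Sketch (continues the Execute call above): wait for the long-running
# operation and read its typed metadata and response. Helper and attribute
# names are assumptions about the yandexcloud SDK.
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2 as job_service

result = sdk.wait_operation_and_get_result(
    operation,
    response_type=job_service.ExecuteProjectJobResponse,
    meta_type=job_service.ExecuteProjectJobMetadata,
)

meta = result.meta          # ExecuteProjectJobMetadata: job, progress, metadata
response = result.response  # ExecuteProjectJobResponse

print("status:", meta.metadata.status)
print("return code:", response.result.return_code)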
ExecuteProjectJobMetadata
| Field | Description |
| --- | --- |
| job | Instance of the job. |
| progress | Job progress info. |
| metadata | Job metadata with the main job info. |
Job
Instance of the job.
| Field | Description |
| --- | --- |
| id | string. ID of the job. |
| name | string. Name of the job. |
| desc | string. Description of the job. |
| created_at | Job creation timestamp. |
| finished_at | Job finish timestamp. |
| status | enum JobStatus. Status of the job. |
| config | string. Config of the job, copied from the configuration file. |
| created_by_id | string. ID of the user who created the job. |
| project_id | string. ID of the project. |
| job_parameters | Job parameters (JobParameters). |
| data_expires_at | Job data expiration timestamp. |
| data_cleared | bool. Marks whether the job data has been cleared. |
| output_files[] | Output files of the job. |
| log_files[] | Job log files. |
| diagnostic_files[] | Job diagnostic files. |
| data_size_bytes | int64. Total size of the job data. |
| started_at | Job start timestamp. |
| status_details | string. Status details. |
| actual_cloud_instance_type | Actual VM instance type the job is running on. |
| parent_job_id | string. Reference to the parent job. |
| file_errors[] | Failed uploads. |
| output_datasets[] | Created datasets. |
JobParameters
Job parameters.
| Field | Description |
| --- | --- |
| input_files[] | List of input files. |
| output_files[] | List of output file descriptions. |
| s3_mount_ids[] | string. List of DataSphere S3 mount IDs. |
| dataset_ids[] | string. List of DataSphere dataset IDs. |
| cmd | string. Job run command. |
| env | Job environment description. |
| attach_project_disk | bool. Whether the project disk should be attached to the VM. |
| cloud_instance_types[] | VM specification. |
| extended_working_storage | Extended working storage configuration. |
| arguments[] | List of literal arguments. |
| output_datasets[] | List of dataset descriptions to create. |
| graceful_shutdown_parameters | Graceful shutdown settings. |
| spark_parameters | Spark connector settings. |
File
| Field | Description |
| --- | --- |
| desc | File description (FileDesc). |
| sha256 | string. SHA256 hash of the file. |
| size_bytes | int64. File size in bytes. |
| compression_type | enum FileCompressionType. File compression info. |
FileDesc
| Field | Description |
| --- | --- |
| path | string. Path of the file on the filesystem. |
| var | string. Variable to use in cmd substitution. |
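To illustrate how path and var fit together, here is a hypothetical fragment of job parameters in which cmd refers to an input and an output file through their variables. The file names and the ${VAR} substitution form are illustrative only, not taken from this page.
{
  "cmd": "python3 ${TRAIN_SCRIPT} --output ${RESULT}",
  "input_files": [
    {
      "desc": {
        "path": "train.py",
        "var": "TRAIN_SCRIPT"
      }
    }
  ],
  "output_files": [
    {
      "path": "result.txt",
      "var": "RESULT"
    }
  ]
}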
Environment
| Field | Description |
| --- | --- |
| vars | object (map<string, string>). Environment variables. |
| docker_image_resource_id | string. ID of the DataSphere Docker image. Includes only one of the fields docker_image_resource_id, docker_image_spec. |
| docker_image_spec | Docker image specification (DockerImageSpec). Includes only one of the fields docker_image_resource_id, docker_image_spec. |
| python_env | Python environment (PythonEnv). |
DockerImageSpec
| Field | Description |
| --- | --- |
| image_url | string. Docker image URL. |
| username | string. Username for the container registry. |
| password_plain_text | string. Plaintext password for the container registry. Includes only one of the fields password_plain_text, password_ds_secret_name. |
| password_ds_secret_name | string. ID of the DataSphere secret containing the password for the container registry. Includes only one of the fields password_plain_text, password_ds_secret_name. |
PythonEnv
| Field | Description |
| --- | --- |
| conda_yaml | string. Conda YAML. |
| local_modules[] | List of local module descriptions. |
| python_version | string. Python version, reduced to major.minor. |
| requirements[] | string. List of pip requirements. |
| pip_options | Pip install options. |
PipOptions
| Field | Description |
| --- | --- |
| index_url | string. The pip --index-url option. |
| extra_index_urls[] | string. The pip --extra-index-url options. |
| trusted_hosts[] | string. The pip --trusted-host options. |
| no_deps | bool. The pip --no-deps option. |
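Putting PythonEnv and PipOptions together, a hypothetical env fragment could look like the following; the Python version, package list, and index URL are illustrative only.
{
  "python_env": {
    "python_version": "3.10",
    "requirements": [
      "numpy==1.26.4",
      "pandas>=2.0"
    ],
    "pip_options": {
      "index_url": "https://pypi.org/simple",
      "no_deps": false
    }
  }
}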
CloudInstanceType
| Field | Description |
| --- | --- |
| name | string. Name of the DataSphere VM configuration. |
ExtendedWorkingStorage
Extended working storage configuration.
| Field | Description |
| --- | --- |
| type | enum StorageType. Type of the storage. |
| size_gb | int64. Size of the storage, GB. |
Argument
| Field | Description |
| --- | --- |
| name | string. Name of the argument. |
| value | string. Value of the argument. |
OutputDatasetDesc
| Field | Description |
| --- | --- |
| name | string. Name to create the dataset with. |
| description | string. Description to show in the UI. |
| labels | object (map<string, string>). |
| size_gb | int64. Size of the dataset to create. |
| var | string. Variable name to replace in cmd, as in FileDesc. |
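For example, a hypothetical description of a 10 GB output dataset whose location is substituted into cmd through its variable; the names, size, and substitution form are illustrative only.
{
  "output_datasets": [
    {
      "name": "training-artifacts",
      "description": "Model checkpoints produced by the job",
      "size_gb": 10,
      "var": "ARTIFACTS"
    }
  ],
  "cmd": "python3 train.py --save-dir ${ARTIFACTS}"
}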
GracefulShutdownParameters
| Field | Description |
| --- | --- |
| timeout | Graceful shutdown timeout (google.protobuf.Duration). |
| signal | int64. Default: 15 (SIGTERM). |
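For example, a hypothetical configuration that sends SIGINT (signal 2) and gives the job 30 seconds to exit; in the JSON mapping, google.protobuf.Duration values are written as strings with an s suffix.
{
  "graceful_shutdown_parameters": {
    "timeout": "30s",
    "signal": 2
  }
}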
SparkParameters
| Field | Description |
| --- | --- |
| connector_id | string. ID of the Spark connector. |
FileUploadError
| Field | Description |
| --- | --- |
| output_file_desc | FileDesc. Includes only one of the fields output_file_desc, log_file_name. |
| log_file_name | string. Includes only one of the fields output_file_desc, log_file_name. |
| description | string. |
| type | enum ErrorType. |
OutputDataset
| Field | Description |
| --- | --- |
| desc | Dataset description (OutputDatasetDesc). |
| id | string. ID of the created dataset. |
JobProgress
| Field | Description |
| --- | --- |
| message | string. Progress message. |
| progress | int64. Progress of the job, from 0 to 100. |
| create_time | Progress create time. |
JobMetadata
| Field | Description |
| --- | --- |
| id | string. ID of the job. |
| name | string. Name of the job. |
| description | string. Description of the job. |
| created_at | Job creation timestamp. |
| started_at | Job start timestamp. |
| finished_at | Job finish timestamp. |
| data_expires_at | Job data expiration timestamp. |
| status | enum JobStatus. Status of the job. |
| status_details | string. Status details. |
| created_by_id | string. ID of the user who created the job. |
| project_id | string. ID of the project. |
| parent_job_id | string. Reference to the parent job. |
ExecuteProjectJobResponse
| Field | Description |
| --- | --- |
| output_files[] | Uploaded output files with URLs. |
| output_files_errors[] | Output file errors. |
| output_datasets[] | Created datasets. |
| result | Result of the job. |
StorageFile
| Field | Description |
| --- | --- |
| file | Uploaded file (File). |
| url | string. File URL. |
JobResult
| Field | Description |
| --- | --- |
| return_code | int64. Execution return code. |
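Finally, a short sketch of consuming the response of a finished job: downloading each uploaded output file by its URL, reporting upload errors, and checking the return code. It assumes response is the ExecuteProjectJobResponse obtained in the earlier sketch and that each StorageFile URL is a plain HTTP(S) link that can be fetched directly.
# Sketch: walk an ExecuteProjectJobResponse, fetch output files, report errors.
# Assumes `response` was obtained as in the earlier sketch and that each
# StorageFile.url is a directly downloadable HTTP(S) link.
import os
import urllib.request

for storage_file in response.output_files:
    target = os.path.basename(storage_file.file.desc.path) or "output.bin"
    urllib.request.urlretrieve(storage_file.url, target)
    print(f"downloaded {storage_file.file.desc.path} -> {target}")

for err in response.output_files_errors:
    failed = err.output_file_desc.path or err.log_file_name
    print(f"upload failed for {failed}: {err.description}")

print("return code:", response.result.return_code)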