DataSphere Jobs API v2, gRPC: ProjectJobService.Create
- gRPC request
- CreateProjectJobRequest
- JobParameters
- File
- FileDesc
- Environment
- DockerImageSpec
- PythonEnv
- PipOptions
- CloudInstanceType
- ExtendedWorkingStorage
- Argument
- OutputDatasetDesc
- GracefulShutdownParameters
- SparkParameters
- operation.Operation
- CreateProjectJobMetadata
- CreateProjectJobResponse
- StorageFile
- File
- FileDesc
Creates a job.
gRPC request
rpc Create (CreateProjectJobRequest) returns (operation.Operation)
CreateProjectJobRequest
```
{
  "project_id": "string",
  "job_parameters": {
    "input_files": [
      {
        "desc": {
          "path": "string",
          "var": "string"
        },
        "sha256": "string",
        "size_bytes": "int64",
        "compression_type": "FileCompressionType"
      }
    ],
    "output_files": [
      {
        "path": "string",
        "var": "string"
      }
    ],
    "s3_mount_ids": [
      "string"
    ],
    "dataset_ids": [
      "string"
    ],
    "cmd": "string",
    "env": {
      "vars": "map<string, string>",
      // Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`
      "docker_image_resource_id": "string",
      "docker_image_spec": {
        "image_url": "string",
        "username": "string",
        // Includes only one of the fields `password_plain_text`, `password_ds_secret_name`
        "password_plain_text": "string",
        "password_ds_secret_name": "string"
        // end of the list of possible fields
      },
      // end of the list of possible fields
      "python_env": {
        "conda_yaml": "string",
        "local_modules": [
          {
            "desc": {
              "path": "string",
              "var": "string"
            },
            "sha256": "string",
            "size_bytes": "int64",
            "compression_type": "FileCompressionType"
          }
        ],
        "python_version": "string",
        "requirements": [
          "string"
        ],
        "pip_options": {
          "index_url": "string",
          "extra_index_urls": [
            "string"
          ],
          "trusted_hosts": [
            "string"
          ],
          "no_deps": "bool"
        }
      }
    },
    "attach_project_disk": "bool",
    "cloud_instance_types": [
      {
        "name": "string"
      }
    ],
    "extended_working_storage": {
      "type": "StorageType",
      "size_gb": "int64"
    },
    "arguments": [
      {
        "name": "string",
        "value": "string"
      }
    ],
    "output_datasets": [
      {
        "name": "string",
        "description": "string",
        "labels": "map<string, string>",
        "size_gb": "int64",
        "var": "string"
      }
    ],
    "graceful_shutdown_parameters": {
      "timeout": "google.protobuf.Duration",
      "signal": "int64"
    },
    "spark_parameters": {
      "connector_id": "string"
    }
  },
  "config": "string",
  "name": "string",
  "desc": "string",
  "data_ttl": "google.protobuf.Duration"
}
```
Field | Description
--- | ---
project_id | string. ID of the project.
job_parameters | JobParameters. Parameters of the job.
config | string. Config of the job.
name | string. Name of the job.
desc | string. Description of the job.
data_ttl | google.protobuf.Duration. Job data TTL.
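For reference, here is a minimal sketch of calling this RPC through the official Python SDK. The module paths follow the `yandexcloud` package layout for the `datasphere/v2/jobs` API and should be verified against your installed version; the project ID, job name, and instance configuration name are placeholders.

```python
import yandexcloud
from yandex.cloud.datasphere.v2.jobs import jobs_pb2, project_job_service_pb2
from yandex.cloud.datasphere.v2.jobs.project_job_service_pb2_grpc import (
    ProjectJobServiceStub,
)

# Authenticate with an IAM token (a service account key also works).
sdk = yandexcloud.SDK(iam_token="<IAM token>")
client = sdk.client(ProjectJobServiceStub)

request = project_job_service_pb2.CreateProjectJobRequest(
    project_id="<project ID>",
    name="train-model",
    desc="Nightly training run",
    job_parameters=jobs_pb2.JobParameters(
        cmd="python train.py --epochs 5",
        cloud_instance_types=[jobs_pb2.CloudInstanceType(name="c1.4")],
    ),
)

# Create is asynchronous: it returns an operation.Operation (see below).
operation = client.Create(request)
print(operation.id)
```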
JobParameters
Job parameters.
Field | Description
--- | ---
input_files[] | File. List of input files.
output_files[] | FileDesc. List of output file descriptions.
s3_mount_ids[] | string. List of DataSphere S3 mount IDs.
dataset_ids[] | string. List of DataSphere dataset IDs.
cmd | string. Job run command.
env | Environment. Job environment description.
attach_project_disk | bool. Whether the project disk should be attached to the VM.
cloud_instance_types[] | CloudInstanceType. VM specification.
extended_working_storage | ExtendedWorkingStorage. Extended working storage configuration.
arguments[] | Argument. List of literal arguments.
output_datasets[] | OutputDatasetDesc. List of descriptions of datasets to create.
graceful_shutdown_parameters | GracefulShutdownParameters. Graceful shutdown settings.
spark_parameters | SparkParameters. Spark connector settings.
File
Field | Description
--- | ---
desc | FileDesc. File description.
sha256 | string. SHA256 of the file.
size_bytes | int64. File size in bytes.
compression_type | enum FileCompressionType. File compression info.
FileDesc
Field | Description
--- | ---
path | string. Path of the file on the filesystem.
var | string. Variable to use in `cmd` substitution.
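To illustrate how `path` and `var` work together, here is a sketch of a job whose `cmd` references its files through variables. It assumes the `${VAR}` substitution syntax used by DataSphere Jobs; `DATA` and `RESULT` are names chosen for this example.

```python
from yandex.cloud.datasphere.v2.jobs import jobs_pb2

params = jobs_pb2.JobParameters(
    # At run time, ${DATA} and ${RESULT} are replaced with the actual paths.
    cmd="python process.py --in ${DATA} --out ${RESULT}",
    input_files=[
        jobs_pb2.File(
            desc=jobs_pb2.FileDesc(path="data/input.csv", var="DATA"),
            sha256="<hex digest of the file>",
            size_bytes=1024,
        )
    ],
    output_files=[jobs_pb2.FileDesc(path="result.csv", var="RESULT")],
)
```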
Environment
Field | Description
--- | ---
vars | object (map<string, string>). Environment variables.
docker_image_resource_id | string. ID of a DataSphere Docker image. Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`.
docker_image_spec | DockerImageSpec. Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`.
python_env | PythonEnv.
DockerImageSpec
Field | Description
--- | ---
image_url | string. Docker image URL.
username | string. Username for the container registry.
password_plain_text | string. Plaintext password for the container registry. Includes only one of the fields `password_plain_text`, `password_ds_secret_name`.
password_ds_secret_name | string. ID of the DataSphere secret containing the password for the container registry. Includes only one of the fields `password_plain_text`, `password_ds_secret_name`.
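A sketch of the password oneof: set exactly one of `password_plain_text` and `password_ds_secret_name`. The registry URL, username, and secret name below are placeholders; referencing a DataSphere secret keeps the password out of the job spec.

```python
from yandex.cloud.datasphere.v2.jobs import jobs_pb2

env = jobs_pb2.Environment(
    docker_image_spec=jobs_pb2.DockerImageSpec(
        image_url="cr.yandex/<registry ID>/train:latest",
        username="<registry username>",
        # Set either of the two password fields, never both:
        password_ds_secret_name="my-registry-password",
        # password_plain_text="<password>",
    )
)
```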
PythonEnv
Field | Description
--- | ---
conda_yaml | string. Conda YAML.
local_modules[] | File. List of local module descriptions.
python_version | string. Python version, reduced to major.minor.
requirements[] | string. List of pip requirements.
pip_options | PipOptions. Pip install options.
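A sketch of two ways this message can describe a Python environment: an explicit conda YAML, or a Python version plus pip requirements. The package pins are illustrative.

```python
from yandex.cloud.datasphere.v2.jobs import jobs_pb2

# Option 1: a full conda environment passed inline.
conda_env = jobs_pb2.PythonEnv(
    conda_yaml="""\
name: default
dependencies:
  - python=3.10
  - pip:
      - numpy==1.26.4
      - pandas==2.2.2
"""
)

# Option 2: a version plus a flat requirements list.
pip_env = jobs_pb2.PythonEnv(
    python_version="3.10",
    requirements=["numpy==1.26.4", "pandas==2.2.2"],
)
```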
PipOptions
Field | Description
--- | ---
index_url | string. Value for the pip `--index-url` option.
extra_index_urls[] | string. Values for the repeatable pip `--extra-index-url` option.
trusted_hosts[] | string. Values for the repeatable pip `--trusted-host` option.
no_deps | bool. Whether to pass the pip `--no-deps` option.
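Each field mirrors a pip flag, so the message below corresponds to an install run with `--index-url`, `--extra-index-url`, and `--trusted-host` set. The mirror URLs are placeholders.

```python
from yandex.cloud.datasphere.v2.jobs import jobs_pb2

pip_options = jobs_pb2.PipOptions(
    index_url="https://pypi.internal.example/simple",
    extra_index_urls=["https://pypi.org/simple"],
    trusted_hosts=["pypi.internal.example"],
    no_deps=False,  # set True to skip installing dependencies
)
```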
CloudInstanceType
Field | Description
--- | ---
name | string. Name of the DataSphere VM configuration.
ExtendedWorkingStorage
Extended working storage configuration.
Field | Description
--- | ---
type | enum StorageType. Type of the storage.
size_gb | int64. Size of the storage in GB.
Argument
Field | Description
--- | ---
name | string. Name of the argument.
value | string. Value of the argument.
OutputDatasetDesc
Field | Description
--- | ---
name | string. Name to create the dataset with.
description | string. Description to show in the UI.
labels | object (map<string, string>). Dataset labels.
size_gb | int64. Size of the dataset to create.
var | string. Variable name to substitute in `cmd`, as in FileDesc.
GracefulShutdownParameters
Field | Description
--- | ---
timeout | google.protobuf.Duration. Graceful shutdown timeout.
signal | int64. Signal to send to the job process; default is 15 (SIGTERM).
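A sketch of the message; `signal.SIGTERM` equals the documented default of 15. That the job gets `timeout` to exit after receiving the signal is an assumption based on the field semantics.

```python
import signal
from google.protobuf.duration_pb2 import Duration
from yandex.cloud.datasphere.v2.jobs import jobs_pb2

shutdown = jobs_pb2.GracefulShutdownParameters(
    timeout=Duration(seconds=30),  # grace period before a hard stop
    signal=int(signal.SIGTERM),    # 15, the documented default
)
```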
SparkParameters
Field | Description
--- | ---
connector_id | string. ID of the Spark connector.
operation.Operation
```
{
  "id": "string",
  "description": "string",
  "created_at": "google.protobuf.Timestamp",
  "created_by": "string",
  "modified_at": "google.protobuf.Timestamp",
  "done": "bool",
  "metadata": {
    "project_id": "string",
    "job_id": "string"
  },
  // Includes only one of the fields `error`, `response`
  "error": "google.rpc.Status",
  "response": {
    "job_id": "string",
    "upload_files": [
      {
        "file": {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        },
        "url": "string"
      }
    ]
  }
  // end of the list of possible fields
}
```
An Operation resource. For more information, see Operation.
Field | Description
--- | ---
id | string. ID of the operation.
description | string. Description of the operation. 0-256 characters long.
created_at | google.protobuf.Timestamp. Creation timestamp.
created_by | string. ID of the user or service account who initiated the operation.
modified_at | google.protobuf.Timestamp. The time when the Operation resource was last modified.
done | bool. If the value is false, the operation is still in progress. If true, the operation is completed, and either `error` or `response` is available.
metadata | CreateProjectJobMetadata. Service-specific metadata associated with the operation.
error | google.rpc.Status. The error result of the operation in case of failure or cancellation. Includes only one of the fields `error`, `response`: the operation result.
response | CreateProjectJobResponse. The normal response of the operation in case of success. Includes only one of the fields `error`, `response`: the operation result.
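A sketch of consuming the returned Operation: poll the generic OperationService.Get until `done`, then read the oneof. The `yandexcloud` SDK also ships waiter helpers; the manual loop below assumes only the generic operation service. `operation` and `sdk` come from the Create sketch above.

```python
import time
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2
from yandex.cloud.operation.operation_service_pb2 import GetOperationRequest
from yandex.cloud.operation.operation_service_pb2_grpc import OperationServiceStub

operation_client = sdk.client(OperationServiceStub)

while not operation.done:
    time.sleep(1)
    operation = operation_client.Get(
        GetOperationRequest(operation_id=operation.id)
    )

if operation.HasField("error"):
    raise RuntimeError(f"Job creation failed: {operation.error.message}")

# `response` is packed as google.protobuf.Any; unpack into the typed message.
response = project_job_service_pb2.CreateProjectJobResponse()
operation.response.Unpack(response)
print(response.job_id)
```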
CreateProjectJobMetadata
Field | Description
--- | ---
project_id | string. ID of the project.
job_id | string. ID of the job.
CreateProjectJobResponse
Field | Description
--- | ---
job_id | string. ID of the job.
upload_files[] | StorageFile. Files to upload, with their presigned upload URLs.
StorageFile
Field | Description
--- | ---
file | File. File description.
url | string. File URL.
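A sketch of the final step: push each input file to its presigned URL. A plain HTTP PUT is assumed as the upload method for the presigned URLs; `response` comes from the Operation sketch above.

```python
import requests

for storage_file in response.upload_files:
    # The local path is echoed back in the file description.
    with open(storage_file.file.desc.path, "rb") as f:
        upload = requests.put(storage_file.url, data=f)
    upload.raise_for_status()
```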
File
Field | Description
--- | ---
desc | FileDesc. File description.
sha256 | string. SHA256 of the file.
size_bytes | int64. File size in bytes.
compression_type | enum FileCompressionType. File compression info.
FileDesc
Field | Description
--- | ---
path | string. Path of the file on the filesystem.
var | string. Variable to use in `cmd` substitution.