DataSphere Jobs API v2, gRPC: ProjectJobService.Execute

Written by Yandex Cloud
Updated at December 17, 2024
In this article:

  • gRPC request
  • ExecuteProjectJobRequest
  • operation.Operation
  • ExecuteProjectJobMetadata
  • Job
  • JobParameters
  • File
  • FileDesc
  • Environment
  • DockerImageSpec
  • PythonEnv
  • PipOptions
  • CloudInstanceType
  • ExtendedWorkingStorage
  • Argument
  • OutputDatasetDesc
  • GracefulShutdownParameters
  • SparkParameters
  • FileUploadError
  • OutputDataset
  • JobProgress
  • JobMetadata
  • ExecuteProjectJobResponse
  • StorageFile
  • JobResult

Starts execution of the specified job.

gRPC request

rpc Execute (ExecuteProjectJobRequest) returns (operation.Operation)

ExecuteProjectJobRequest

{
  "job_id": "string"
}

job_id (string)
ID of the job.
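
A minimal sketch of calling Execute from Python with the generated gRPC stubs. The endpoint datasphere.api.cloud.yandex.net:443 and the yandex.cloud.datasphere.v2.jobs module path are assumptions based on the usual layout of the yandexcloud package; verify them against your installed SDK.

# Sketch: call ProjectJobService.Execute (endpoint and module paths are assumptions).
import grpc

from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2 as pjs_pb2
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2_grpc as pjs_grpc

IAM_TOKEN = "<IAM token>"    # obtain separately, e.g. with the yc CLI

channel = grpc.secure_channel("datasphere.api.cloud.yandex.net:443",
                              grpc.ssl_channel_credentials())
stub = pjs_grpc.ProjectJobServiceStub(channel)

operation = stub.Execute(
    pjs_pb2.ExecuteProjectJobRequest(job_id="<job ID>"),
    metadata=[("authorization", f"Bearer {IAM_TOKEN}")],
)
print(operation.id)          # a long-running operation.Operation (see below)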

operation.Operation

{
  "id": "string",
  "description": "string",
  "created_at": "google.protobuf.Timestamp",
  "created_by": "string",
  "modified_at": "google.protobuf.Timestamp",
  "done": "bool",
  "metadata": {
    "job": {
      "id": "string",
      "name": "string",
      "desc": "string",
      "created_at": "google.protobuf.Timestamp",
      "finished_at": "google.protobuf.Timestamp",
      "status": "JobStatus",
      "config": "string",
      "created_by_id": "string",
      "project_id": "string",
      "job_parameters": {
        "input_files": [
          {
            "desc": {
              "path": "string",
              "var": "string"
            },
            "sha256": "string",
            "size_bytes": "int64",
            "compression_type": "FileCompressionType"
          }
        ],
        "output_files": [
          {
            "path": "string",
            "var": "string"
          }
        ],
        "s3_mount_ids": [
          "string"
        ],
        "dataset_ids": [
          "string"
        ],
        "cmd": "string",
        "env": {
          "vars": "map<string, string>",
          // Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`
          "docker_image_resource_id": "string",
          "docker_image_spec": {
            "image_url": "string",
            "username": "string",
            // Includes only one of the fields `password_plain_text`, `password_ds_secret_name`
            "password_plain_text": "string",
            "password_ds_secret_name": "string"
            // end of the list of possible fields
          },
          // end of the list of possible fields
          "python_env": {
            "conda_yaml": "string",
            "local_modules": [
              {
                "desc": {
                  "path": "string",
                  "var": "string"
                },
                "sha256": "string",
                "size_bytes": "int64",
                "compression_type": "FileCompressionType"
              }
            ],
            "python_version": "string",
            "requirements": [
              "string"
            ],
            "pip_options": {
              "index_url": "string",
              "extra_index_urls": [
                "string"
              ],
              "trusted_hosts": [
                "string"
              ],
              "no_deps": "bool"
            }
          }
        },
        "attach_project_disk": "bool",
        "cloud_instance_types": [
          {
            "name": "string"
          }
        ],
        "extended_working_storage": {
          "type": "StorageType",
          "size_gb": "int64"
        },
        "arguments": [
          {
            "name": "string",
            "value": "string"
          }
        ],
        "output_datasets": [
          {
            "name": "string",
            "description": "string",
            "labels": "map<string, string>",
            "size_gb": "int64",
            "var": "string"
          }
        ],
        "graceful_shutdown_parameters": {
          "timeout": "google.protobuf.Duration",
          "signal": "int64"
        },
        "spark_parameters": {
          "connector_id": "string"
        }
      },
      "data_expires_at": "google.protobuf.Timestamp",
      "data_cleared": "bool",
      "output_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "log_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "diagnostic_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "data_size_bytes": "int64",
      "started_at": "google.protobuf.Timestamp",
      "status_details": "string",
      "actual_cloud_instance_type": {
        "name": "string"
      },
      "parent_job_id": "string",
      "file_errors": [
        {
          // Includes only one of the fields `output_file_desc`, `log_file_name`
          "output_file_desc": {
            "path": "string",
            "var": "string"
          },
          "log_file_name": "string",
          // end of the list of possible fields
          "description": "string",
          "type": "ErrorType"
        }
      ],
      "output_datasets": [
        {
          "desc": {
            "name": "string",
            "description": "string",
            "labels": "map<string, string>",
            "size_gb": "int64",
            "var": "string"
          },
          "id": "string"
        }
      ]
    },
    "progress": {
      "message": "string",
      "progress": "int64",
      "create_time": "google.protobuf.Timestamp"
    },
    "metadata": {
      "id": "string",
      "name": "string",
      "description": "string",
      "created_at": "google.protobuf.Timestamp",
      "started_at": "google.protobuf.Timestamp",
      "finished_at": "google.protobuf.Timestamp",
      "data_expires_at": "google.protobuf.Timestamp",
      "status": "JobStatus",
      "status_details": "string",
      "created_by_id": "string",
      "project_id": "string",
      "parent_job_id": "string"
    }
  },
  // Includes only one of the fields `error`, `response`
  "error": "google.rpc.Status",
  "response": {
    "output_files": [
      {
        "file": {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        },
        "url": "string"
      }
    ],
    "output_files_errors": [
      {
        // Includes only one of the fields `output_file_desc`, `log_file_name`
        "output_file_desc": {
          "path": "string",
          "var": "string"
        },
        "log_file_name": "string",
        // end of the list of possible fields
        "description": "string",
        "type": "ErrorType"
      }
    ],
    "output_datasets": [
      {
        "desc": {
          "name": "string",
          "description": "string",
          "labels": "map<string, string>",
          "size_gb": "int64",
          "var": "string"
        },
        "id": "string"
      }
    ],
    "result": {
      "return_code": "int64"
    }
  }
  // end of the list of possible fields
}

An Operation resource. For more information, see Operation.

id (string)
ID of the operation.

description (string)
Description of the operation. 0-256 characters long.

created_at (google.protobuf.Timestamp)
Creation timestamp.

created_by (string)
ID of the user or service account who initiated the operation.

modified_at (google.protobuf.Timestamp)
The time when the Operation resource was last modified.

done (bool)
If the value is false, the operation is still in progress.
If true, the operation is completed, and either error or response is available.

metadata (ExecuteProjectJobMetadata)
Service-specific metadata associated with the operation.
It typically contains the ID of the target resource that the operation is performed on.
Any method that returns a long-running operation should document the metadata type, if any.

error (google.rpc.Status)
The error result of the operation in case of failure or cancellation.
Includes only one of the fields error, response: if done == false and no failure has been detected, neither error nor response is set; if done == false and a failure has been detected, error is set; if done == true, exactly one of error or response is set.

response (ExecuteProjectJobResponse)
The normal response of the operation in case of success.
If the original method returns no data on success, such as Delete, the response is google.protobuf.Empty.
If the original method is the standard Create/Update, the response should be the target resource of the operation.
Any method that returns a long-running operation should document the response type, if any.
Includes only one of the fields error, response (see above).
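
Execute is asynchronous: the immediate return value is only the operation handle, and the job result arrives later in response. A minimal polling sketch, continuing the previous example and assuming the public Operation service at operation.api.cloud.yandex.net:443 with the generated stubs from the yandexcloud package (both are assumptions; verify against your SDK):

# Sketch: poll the operation returned by Execute until it is done.
import time

import grpc
from yandex.cloud.operation import operation_service_pb2 as op_pb2
from yandex.cloud.operation import operation_service_pb2_grpc as op_grpc

op_channel = grpc.secure_channel("operation.api.cloud.yandex.net:443",
                                 grpc.ssl_channel_credentials())
op_stub = op_grpc.OperationServiceStub(op_channel)

op = operation  # the operation.Operation returned by Execute
while not op.done:
    time.sleep(10)
    # op.metadata carries ExecuteProjectJobMetadata (job, progress, metadata)
    op = op_stub.Get(
        op_pb2.GetOperationRequest(operation_id=op.id),
        metadata=[("authorization", f"Bearer {IAM_TOKEN}")],
    )

if op.HasField("error"):
    print("job failed:", op.error.message)

Note that op.response is a google.protobuf.Any and must be unpacked into ExecuteProjectJobResponse before its fields can be read; see the download sketch near the end of this article.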

ExecuteProjectJobMetadata

job (Job)
Instance of the job.

progress (JobProgress)
Job progress info.

metadata (JobMetadata)
Job metadata with main job info.

Job

Instance of the job.

id (string)
ID of the job.

name (string)
Name of the job.

desc (string)
Description of the job.

created_at (google.protobuf.Timestamp)
Job creation timestamp.

finished_at (google.protobuf.Timestamp)
Job finish timestamp.

status (enum JobStatus)
Status of the job.
  • JOB_STATUS_UNSPECIFIED
  • CREATING
  • EXECUTING
  • UPLOADING_OUTPUT
  • SUCCESS
  • ERROR
  • CANCELLED
  • CANCELLING
  • PREPARING

config (string)
Config of the job, copied from the configuration file.

created_by_id (string)
ID of the user who created the job.

project_id (string)
ID of the project.

job_parameters (JobParameters)
Parameters of the job.

data_expires_at (google.protobuf.Timestamp)
Job data expiration timestamp.

data_cleared (bool)
Marks whether the job data has been cleared.

output_files[] (File)
Output files of the job.

log_files[] (File)
Job log files.

diagnostic_files[] (File)
Job diagnostic files.

data_size_bytes (int64)
Total size of the job data.

started_at (google.protobuf.Timestamp)
Job start timestamp.

status_details (string)
Status details.

actual_cloud_instance_type (CloudInstanceType)
Actual VM instance type the job is running on.

parent_job_id (string)
Reference to the parent job.

file_errors[] (FileUploadError)
Failed uploads.

output_datasets[] (OutputDataset)
Created datasets.

JobParameters

Job parameters.

input_files[] (File)
List of input files.

output_files[] (FileDesc)
List of output file descriptions.

s3_mount_ids[] (string)
List of DataSphere S3 mount IDs.

dataset_ids[] (string)
List of DataSphere dataset IDs.

cmd (string)
Job run command.

env (Environment)
Job environment description.

attach_project_disk (bool)
Whether the project disk should be attached to the VM.

cloud_instance_types[] (CloudInstanceType)
VM specification.

extended_working_storage (ExtendedWorkingStorage)
Extended working storage configuration.

arguments[] (Argument)
List of literal arguments.

output_datasets[] (OutputDatasetDesc)
List of descriptions of datasets to create.

graceful_shutdown_parameters (GracefulShutdownParameters)
Graceful shutdown settings.

spark_parameters (SparkParameters)
Spark connector settings.

File

desc (FileDesc)
File description.

sha256 (string)
SHA256 hash of the file.

size_bytes (int64)
File size in bytes.

compression_type (enum FileCompressionType)
File compression info.
  • FILE_COMPRESSION_TYPE_UNSPECIFIED
  • NONE
  • ZIP

FileDesc

path (string)
Path of the file on the filesystem.

var (string)
Variable to use in cmd substitution.
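
For example, this hypothetical JobParameters fragment binds an input and an output file to variables and references them from cmd; the ${VAR} substitution form is assumed from the DataSphere Jobs configuration format.

# Hypothetical fragment: files are exposed to the run command through their vars.
job_parameters = {
    "input_files": [{
        "desc": {"path": "data/train.csv", "var": "DATA"},
        "sha256": "<sha256 of the file>",
        "size_bytes": 1048576,
        "compression_type": "NONE",
    }],
    "output_files": [{"path": "result/model.bin", "var": "MODEL"}],
    # ${DATA} and ${MODEL} are replaced with actual paths when the job runs.
    "cmd": "python3 train.py --data ${DATA} --out ${MODEL}",
}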

Environment

vars (object, map<string, string>)
Environment variables.

docker_image_resource_id (string)
ID of the DataSphere Docker image.
Includes only one of the fields docker_image_resource_id, docker_image_spec.

docker_image_spec (DockerImageSpec)
Docker image specification.
Includes only one of the fields docker_image_resource_id, docker_image_spec.

python_env (PythonEnv)
Python environment description.

DockerImageSpec

image_url (string)
Docker image URL.

username (string)
Username for the container registry.

password_plain_text (string)
Password for the container registry, in plain text.
Includes only one of the fields password_plain_text, password_ds_secret_name.

password_ds_secret_name (string)
ID of the DataSphere secret containing the password for the container registry.
Includes only one of the fields password_plain_text, password_ds_secret_name.

PythonEnv

conda_yaml (string)
Conda YAML.

local_modules[] (File)
List of local module descriptions.

python_version (string)
Python version, reduced to major.minor.

requirements[] (string)
List of pip requirements.

pip_options (PipOptions)
pip install options.

PipOptions

index_url (string)
--index-url option.

extra_index_urls[] (string)
Repeatable --extra-index-url options.

trusted_hosts[] (string)
Repeatable --trusted-host options.

no_deps (bool)
--no-deps option.
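
As an illustration, a hypothetical PythonEnv that pins the interpreter version and installs requirements through an internal index might look like this (all values are made up):

# Hypothetical PythonEnv fragment showing how PipOptions map to pip flags.
python_env = {
    "python_version": "3.10",                        # major.minor only
    "requirements": ["numpy==1.26.4", "pandas>=2.0"],
    "pip_options": {
        "index_url": "https://mirror.example.com/pypi/simple",  # --index-url
        "extra_index_urls": ["https://pypi.org/simple"],        # --extra-index-url
        "trusted_hosts": ["mirror.example.com"],                # --trusted-host
        "no_deps": False,                                       # --no-deps
    },
}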

CloudInstanceType

name (string)
Name of the DataSphere VM configuration.

ExtendedWorkingStorage

Extended working storage configuration.

type (enum StorageType)
Storage type.
  • STORAGE_TYPE_UNSPECIFIED
  • SSD

size_gb (int64)
Storage size in GB.

Argument

name (string)
Name of the argument.

value (string)
Value of the argument.

OutputDatasetDesc

name (string)
Name to create the dataset with.

description (string)
Description to show in the UI.

labels (object, map<string, string>)
Dataset labels.

size_gb (int64)
Size of the dataset to create, in GB.

var (string)
Variable name to substitute in cmd, as in FileDesc.

GracefulShutdownParameters

timeout (google.protobuf.Duration)
Graceful shutdown timeout.

signal (int64)
Signal to send; defaults to 15 (SIGTERM).
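
On cancellation, the job process receives the configured signal and, presumably, has the configured timeout to finish before it is stopped. A generic sketch of handling the default SIGTERM inside job code (save_checkpoint is a hypothetical cleanup hook):

# Generic sketch: react to the graceful-shutdown signal (15/SIGTERM by default).
import signal
import sys

def save_checkpoint():
    """Hypothetical cleanup hook: flush partial results to an output file."""

def on_sigterm(signum, frame):
    save_checkpoint()
    sys.exit(0)

signal.signal(signal.SIGTERM, on_sigterm)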

SparkParameters

connector_id (string)
ID of the Spark connector.

FileUploadError

output_file_desc (FileDesc)
Description of the output file that failed to upload.
Includes only one of the fields output_file_desc, log_file_name.

log_file_name (string)
Name of the log file that failed to upload.
Includes only one of the fields output_file_desc, log_file_name.

description (string)
Error description.

type (enum ErrorType)
Error type.
  • ERROR_TYPE_UNSPECIFIED
  • UPLOAD_FAILED
  • NOT_FOUND

OutputDataset

desc (OutputDatasetDesc)
Dataset description.

id (string)
ID of the created dataset.

JobProgress

message (string)
Progress message.

progress (int64)
Progress of the job, from 0 to 100.

create_time (google.protobuf.Timestamp)
Progress creation time.

JobMetadata

id (string)
ID of the job.

name (string)
Name of the job.

description (string)
Description of the job.

created_at (google.protobuf.Timestamp)
Job creation timestamp.

started_at (google.protobuf.Timestamp)
Job start timestamp.

finished_at (google.protobuf.Timestamp)
Job finish timestamp.

data_expires_at (google.protobuf.Timestamp)
Job data expiration timestamp.

status (enum JobStatus)
Status of the job.
  • JOB_STATUS_UNSPECIFIED
  • CREATING
  • EXECUTING
  • UPLOADING_OUTPUT
  • SUCCESS
  • ERROR
  • CANCELLED
  • CANCELLING
  • PREPARING

status_details (string)
Status details.

created_by_id (string)
ID of the user who created the job.

project_id (string)
ID of the project.

parent_job_id (string)
Reference to the parent job.

ExecuteProjectJobResponse

output_files[] (StorageFile)
Uploaded output files with URLs.

output_files_errors[] (FileUploadError)
Output file errors.

output_datasets[] (OutputDataset)
Created datasets.

result (JobResult)
Result of the job.

StorageFile

file (File)
File description.

url (string)
File URL.
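
Once the operation is done, op.response can be unpacked into ExecuteProjectJobResponse, and each StorageFile pairs file metadata with a download URL. A minimal sketch, reusing the op object from the polling example and assuming the URLs are plain HTTPS links:

# Sketch: unpack the finished operation and download the job's output files.
import os
import urllib.request

from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2 as pjs_pb2

result = pjs_pb2.ExecuteProjectJobResponse()
op.response.Unpack(result)                      # op.response is a protobuf Any

for storage_file in result.output_files:
    local_name = os.path.basename(storage_file.file.desc.path) or "output.bin"
    urllib.request.urlretrieve(storage_file.url, local_name)

for err in result.output_files_errors:          # uploads that failed on the VM side
    print("file error:", err.description)

print("return code:", result.result.return_code)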

JobResult

return_code (int64)
Execution return code.
