DataSphere Jobs API v2, gRPC: ProjectJobService.Get

Written by Yandex Cloud. Updated on December 17, 2024.
In this article:

  • gRPC request
  • GetProjectJobRequest
  • Job
  • JobParameters
  • File
  • FileDesc
  • Environment
  • DockerImageSpec
  • PythonEnv
  • PipOptions
  • CloudInstanceType
  • ExtendedWorkingStorage
  • Argument
  • OutputDatasetDesc
  • GracefulShutdownParameters
  • SparkParameters
  • FileUploadError
  • OutputDataset

Returns a job by its ID.

gRPC request

rpc Get (GetProjectJobRequest) returns (Job)

GetProjectJobRequest

{
  "job_id": "string"
}

Field

Description

job_id

string

ID of the job.
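A minimal sketch of calling this method from Python with the Yandex Cloud SDK (pip install yandexcloud). The generated module paths and the sdk.client() helper below are assumptions based on how the SDK ships gRPC stubs; verify them against your SDK version.

import yandexcloud

# Assumed import paths for the generated DataSphere Jobs v2 stubs; check
# the yandexcloud package for the exact locations in your version.
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2_grpc

sdk = yandexcloud.SDK(iam_token="<IAM_TOKEN>")  # or oauth_token="<OAUTH_TOKEN>"
stub = sdk.client(project_job_service_pb2_grpc.ProjectJobServiceStub)

job = stub.Get(project_job_service_pb2.GetProjectJobRequest(job_id="<job_id>"))
print(job.id, job.name, job.status)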

Job

{
  "id": "string",
  "name": "string",
  "desc": "string",
  "created_at": "google.protobuf.Timestamp",
  "finished_at": "google.protobuf.Timestamp",
  "status": "JobStatus",
  "config": "string",
  "created_by_id": "string",
  "project_id": "string",
  "job_parameters": {
    "input_files": [
      {
        "desc": {
          "path": "string",
          "var": "string"
        },
        "sha256": "string",
        "size_bytes": "int64",
        "compression_type": "FileCompressionType"
      }
    ],
    "output_files": [
      {
        "path": "string",
        "var": "string"
      }
    ],
    "s3_mount_ids": [
      "string"
    ],
    "dataset_ids": [
      "string"
    ],
    "cmd": "string",
    "env": {
      "vars": "map<string, string>",
      // Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`
      "docker_image_resource_id": "string",
      "docker_image_spec": {
        "image_url": "string",
        "username": "string",
        // Includes only one of the fields `password_plain_text`, `password_ds_secret_name`
        "password_plain_text": "string",
        "password_ds_secret_name": "string"
        // end of the list of possible fields
      },
      // end of the list of possible fields
      "python_env": {
        "conda_yaml": "string",
        "local_modules": [
          {
            "desc": {
              "path": "string",
              "var": "string"
            },
            "sha256": "string",
            "size_bytes": "int64",
            "compression_type": "FileCompressionType"
          }
        ],
        "python_version": "string",
        "requirements": [
          "string"
        ],
        "pip_options": {
          "index_url": "string",
          "extra_index_urls": [
            "string"
          ],
          "trusted_hosts": [
            "string"
          ],
          "no_deps": "bool"
        }
      }
    },
    "attach_project_disk": "bool",
    "cloud_instance_types": [
      {
        "name": "string"
      }
    ],
    "extended_working_storage": {
      "type": "StorageType",
      "size_gb": "int64"
    },
    "arguments": [
      {
        "name": "string",
        "value": "string"
      }
    ],
    "output_datasets": [
      {
        "name": "string",
        "description": "string",
        "labels": "map<string, string>",
        "size_gb": "int64",
        "var": "string"
      }
    ],
    "graceful_shutdown_parameters": {
      "timeout": "google.protobuf.Duration",
      "signal": "int64"
    },
    "spark_parameters": {
      "connector_id": "string"
    }
  },
  "data_expires_at": "google.protobuf.Timestamp",
  "data_cleared": "bool",
  "output_files": [
    {
      "desc": {
        "path": "string",
        "var": "string"
      },
      "sha256": "string",
      "size_bytes": "int64",
      "compression_type": "FileCompressionType"
    }
  ],
  "log_files": [
    {
      "desc": {
        "path": "string",
        "var": "string"
      },
      "sha256": "string",
      "size_bytes": "int64",
      "compression_type": "FileCompressionType"
    }
  ],
  "diagnostic_files": [
    {
      "desc": {
        "path": "string",
        "var": "string"
      },
      "sha256": "string",
      "size_bytes": "int64",
      "compression_type": "FileCompressionType"
    }
  ],
  "data_size_bytes": "int64",
  "started_at": "google.protobuf.Timestamp",
  "status_details": "string",
  "actual_cloud_instance_type": {
    "name": "string"
  },
  "parent_job_id": "string",
  "file_errors": [
    {
      // Includes only one of the fields `output_file_desc`, `log_file_name`
      "output_file_desc": {
        "path": "string",
        "var": "string"
      },
      "log_file_name": "string",
      // end of the list of possible fields
      "description": "string",
      "type": "ErrorType"
    }
  ],
  "output_datasets": [
    {
      "desc": {
        "name": "string",
        "description": "string",
        "labels": "map<string, string>",
        "size_gb": "int64",
        "var": "string"
      },
      "id": "string"
    }
  ]
}

Instance of the job.

Field

Description

id

string

ID of the job.

name

string

Name of the job.

desc

string

Description of the job.

created_at

google.protobuf.Timestamp

Job creation timestamp.

finished_at

google.protobuf.Timestamp

Job finish timestamp.

status

enum JobStatus

Status of the job. A polling sketch using these statuses follows this table.

  • JOB_STATUS_UNSPECIFIED
  • CREATING
  • EXECUTING
  • UPLOADING_OUTPUT
  • SUCCESS
  • ERROR
  • CANCELLED
  • CANCELLING
  • PREPARING

config

string

Configuration of the job, copied from the configuration file.

created_by_id

string

ID of the user who created the job.

project_id

string

ID of the project.

job_parameters

JobParameters

data_expires_at

google.protobuf.Timestamp

Job data expiration timestamp.

data_cleared

bool

Indicates whether the job data has been cleared.

output_files[]

File

Output files of the job.

log_files[]

File

Job log files.

diagnostic_files[]

File

Job diagnostic files.

data_size_bytes

int64

Total size of the job data, in bytes.

started_at

google.protobuf.Timestamp

Job start timestamp.

status_details

string

Details of the job status.

actual_cloud_instance_type

CloudInstanceType

Actual VM instance type the job is running on.

parent_job_id

string

Reference to the parent job.

file_errors[]

FileUploadError

Failed uploads.

output_datasets[]

OutputDataset

Created datasets.
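Given the JobStatus values above, a caller typically polls Get until the job reaches a terminal status (SUCCESS, ERROR, or CANCELLED). A minimal sketch reusing the stub and request type from the earlier example; the jobs_pb2 module holding the JobStatus enum constants is an assumed import path:

import time

from yandex.cloud.datasphere.v2.jobs import jobs_pb2  # assumed module path

# Statuses after which the job no longer changes state.
TERMINAL_STATUSES = {jobs_pb2.SUCCESS, jobs_pb2.ERROR, jobs_pb2.CANCELLED}

def wait_for_job(stub, job_id, poll_seconds=30):
    # Poll ProjectJobService.Get until the job finishes.
    while True:
        job = stub.Get(project_job_service_pb2.GetProjectJobRequest(job_id=job_id))
        if job.status in TERMINAL_STATUSES:
            return job
        time.sleep(poll_seconds)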

JobParameters

Job parameters.

Field

Description

input_files[]

File

List of input files.

output_files[]

FileDesc

List of output files descriptions.

s3_mount_ids[]

string

List of DataSphere S3 mount IDs.

dataset_ids[]

string

List of DataSphere dataset IDs.

cmd

string

Job run command.

env

Environment

Job environment description.

attach_project_disk

bool

Whether the project disk should be attached to the VM.

cloud_instance_types[]

CloudInstanceType

VM specification.

extended_working_storage

ExtendedWorkingStorage

Extended working storage configuration.

arguments[]

Argument

List of literal arguments.

output_datasets[]

OutputDatasetDesc

List of descriptions of datasets to create.

graceful_shutdown_parameters

GracefulShutdownParameters

Graceful shutdown settings.

spark_parameters

SparkParameters

Spark connector settings.

File

Field

Description

desc

FileDesc

sha256

string

SHA256 of the file.

size_bytes

int64

File size in bytes.

compression_type

enum FileCompressionType

File compression info.

  • FILE_COMPRESSION_TYPE_UNSPECIFIED
  • NONE
  • ZIP

FileDesc

Field

Description

path

string

Path of the file on the filesystem.

var

string

Variable to use in cmd substitution. For example, a file declared with var DATA can be referenced in the job command as ${DATA}.

Environment

Field

Description

vars

object (map<string, string>)

Environment variables.

docker_image_resource_id

string

ID of the DataSphere Docker image.

Includes only one of the fields docker_image_resource_id, docker_image_spec.

docker_image_spec

DockerImageSpec

Includes only one of the fields docker_image_resource_id, docker_image_spec.

python_env

PythonEnv

Python environment description.

DockerImageSpec

Field

Description

image_url

string

Docker image URL.

username

string

Username for the container registry.

password_plain_text

string

Plaintext password for the container registry.

Includes only one of the fields password_plain_text, password_ds_secret_name.

password_ds_secret_name

string

ID of the DataSphere secret containing the container registry password.

Includes only one of the fields password_plain_text, password_ds_secret_name.

PythonEnv

Field

Description

conda_yaml

string

Conda YAML.

local_modules[]

File

List of local module descriptions.

python_version

string

Python version, reduced to major.minor.

requirements[]

string

List of pip requirements.

pip_options

PipOptions

Pip install options.

PipOptions

Field

Description

index_url

string

--index-url option.

extra_index_urls[]

string

--extra-index-url options (one per URL).

trusted_hosts[]

string

--trusted-host options (one per host).

no_deps

bool

--no-deps option.
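These fields map directly onto pip command-line flags. A hypothetical illustration of the mapping (not DataSphere code, just the pip invocation a PipOptions message implies):

def pip_install_args(index_url=None, extra_index_urls=(), trusted_hosts=(), no_deps=False):
    # Build the pip command line implied by a PipOptions message.
    args = ["pip", "install"]
    if index_url:
        args += ["--index-url", index_url]
    for url in extra_index_urls:   # one --extra-index-url flag per URL
        args += ["--extra-index-url", url]
    for host in trusted_hosts:     # one --trusted-host flag per host
        args += ["--trusted-host", host]
    if no_deps:
        args.append("--no-deps")
    return args

# pip_install_args(index_url="https://pypi.org/simple", no_deps=True)
# -> ['pip', 'install', '--index-url', 'https://pypi.org/simple', '--no-deps']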

CloudInstanceType

Field

Description

name

string

Name of the DataSphere VM configuration.

ExtendedWorkingStorage

Extended working storage configuration.

Field

Description

type

enum StorageType

  • STORAGE_TYPE_UNSPECIFIED
  • SSD

size_gb

int64

Storage size in gigabytes.

Argument

Field

Description

name

string

Name of the argument.

value

string

Value of the argument.

OutputDatasetDesc

Field

Description

name

string

Name to create the dataset with.

description

string

Description to show in the UI.

labels

object (map<string, string>)

size_gb

int64

Size of the dataset to create, in gigabytes.

var

string

Variable name to substitute in cmd, as in FileDesc.

GracefulShutdownParameters

Field

Description

timeout

google.protobuf.Duration

Graceful shutdown timeout.

signal

int64

Signal sent to the job on shutdown. Default: 15 (SIGTERM).
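The natural reading of these fields is that on shutdown the job process receives signal (SIGTERM by default) and has timeout to exit cleanly before it is stopped. A minimal sketch of how a job script might react; save_checkpoint is a hypothetical placeholder:

import signal
import sys

def save_checkpoint():
    # Hypothetical placeholder: persist intermediate results here.
    pass

def on_sigterm(signum, frame):
    # Flush state before the graceful-shutdown timeout expires.
    save_checkpoint()
    sys.exit(0)

signal.signal(signal.SIGTERM, on_sigterm)  # 15 == SIGTERM, the default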

SparkParameters

Field

Description

connector_id

string

ID of the Spark connector.

FileUploadError

Field

Description

output_file_desc

FileDesc

Includes only one of the fields output_file_desc, log_file_name.

log_file_name

string

Includes only one of the fields output_file_desc, log_file_name.

description

string

Error description.

type

enum ErrorType

  • ERROR_TYPE_UNSPECIFIED
  • UPLOAD_FAILED
  • NOT_FOUND

OutputDataset

Field

Description

desc

OutputDatasetDesc

Dataset description.

id

string

ID of the created dataset.
