
DataSphere Jobs API v2, gRPC: ProjectJobService.List

Written by Yandex Cloud
Updated December 17, 2024

In this article:
  • gRPC request
  • ListProjectJobRequest
  • ListProjectJobResponse
  • Job
  • JobParameters
  • File
  • FileDesc
  • Environment
  • DockerImageSpec
  • PythonEnv
  • PipOptions
  • CloudInstanceType
  • ExtendedWorkingStorage
  • Argument
  • OutputDatasetDesc
  • GracefulShutdownParameters
  • SparkParameters
  • FileUploadError
  • OutputDataset

Lists jobs.

gRPC request

rpc List (ListProjectJobRequest) returns (ListProjectJobResponse)
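
As a point of reference, here is a minimal Python sketch of this call using grpcio and stubs generated from the service proto. The endpoint, the generated module paths, and the token handling are assumptions for illustration, not part of this reference:

import grpc

# Assumed module names produced by protoc for the
# yandex.cloud.datasphere.v2.jobs package; adjust to your generated code.
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2 as jobs_pb2
from yandex.cloud.datasphere.v2.jobs import project_job_service_pb2_grpc as jobs_grpc

# datasphere.api.cloud.yandex.net:443 is assumed to be the public API endpoint.
channel = grpc.secure_channel(
    "datasphere.api.cloud.yandex.net:443", grpc.ssl_channel_credentials()
)
stub = jobs_grpc.ProjectJobServiceStub(channel)

request = jobs_pb2.ListProjectJobRequest(project_id="<project-id>", page_size=50)
# Authentication: an IAM token passed as gRPC metadata.
response = stub.List(request, metadata=[("authorization", "Bearer <IAM-token>")])
for job in response.jobs:
    print(job.id, job.name, job.status)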

ListProjectJobRequest

{
  "project_id": "string",
  "page_size": "int64",
  "page_token": "string",
  "filter": "string"
}

  • project_id (string): ID of the project.
  • page_size (int64): The maximum number of results per page to return. If the number of available results is larger than page_size, the service returns a ListProjectJobResponse.page_token that can be used to get the next page of results in subsequent list requests.
  • page_token (string): Page token. To get the next page of results, set page_token to the ListProjectJobResponse.page_token returned by a previous list request.
  • filter (string): Filter expression. Restrictions: only the status field is supported, and only the IN operator. For example, to list only running jobs, use "status IN (EXECUTING, UPLOADING_OUTPUT)" (see the sketch after this list).
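
Putting page_size, page_token, and filter together, a paginated listing of running jobs might look like this sketch (same assumed generated stubs as above):

def list_running_jobs(stub, project_id, token_metadata):
    """Collect all running jobs, following page tokens until exhausted."""
    jobs, page_token = [], ""
    while True:
        request = jobs_pb2.ListProjectJobRequest(
            project_id=project_id,
            page_size=100,
            page_token=page_token,
            # Only the status field and the IN operator are supported.
            filter="status IN (EXECUTING, UPLOADING_OUTPUT)",
        )
        response = stub.List(request, metadata=token_metadata)
        jobs.extend(response.jobs)
        page_token = response.next_page_token
        if not page_token:  # an empty token means this was the last page
            return jobs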

ListProjectJobResponse

{
  "jobs": [
    {
      "id": "string",
      "name": "string",
      "desc": "string",
      "created_at": "google.protobuf.Timestamp",
      "finished_at": "google.protobuf.Timestamp",
      "status": "JobStatus",
      "config": "string",
      "created_by_id": "string",
      "project_id": "string",
      "job_parameters": {
        "input_files": [
          {
            "desc": {
              "path": "string",
              "var": "string"
            },
            "sha256": "string",
            "size_bytes": "int64",
            "compression_type": "FileCompressionType"
          }
        ],
        "output_files": [
          {
            "path": "string",
            "var": "string"
          }
        ],
        "s3_mount_ids": [
          "string"
        ],
        "dataset_ids": [
          "string"
        ],
        "cmd": "string",
        "env": {
          "vars": "map<string, string>",
          // Includes only one of the fields `docker_image_resource_id`, `docker_image_spec`
          "docker_image_resource_id": "string",
          "docker_image_spec": {
            "image_url": "string",
            "username": "string",
            // Includes only one of the fields `password_plain_text`, `password_ds_secret_name`
            "password_plain_text": "string",
            "password_ds_secret_name": "string"
            // end of the list of possible fields
          },
          // end of the list of possible fields
          "python_env": {
            "conda_yaml": "string",
            "local_modules": [
              {
                "desc": {
                  "path": "string",
                  "var": "string"
                },
                "sha256": "string",
                "size_bytes": "int64",
                "compression_type": "FileCompressionType"
              }
            ],
            "python_version": "string",
            "requirements": [
              "string"
            ],
            "pip_options": {
              "index_url": "string",
              "extra_index_urls": [
                "string"
              ],
              "trusted_hosts": [
                "string"
              ],
              "no_deps": "bool"
            }
          }
        },
        "attach_project_disk": "bool",
        "cloud_instance_types": [
          {
            "name": "string"
          }
        ],
        "extended_working_storage": {
          "type": "StorageType",
          "size_gb": "int64"
        },
        "arguments": [
          {
            "name": "string",
            "value": "string"
          }
        ],
        "output_datasets": [
          {
            "name": "string",
            "description": "string",
            "labels": "map<string, string>",
            "size_gb": "int64",
            "var": "string"
          }
        ],
        "graceful_shutdown_parameters": {
          "timeout": "google.protobuf.Duration",
          "signal": "int64"
        },
        "spark_parameters": {
          "connector_id": "string"
        }
      },
      "data_expires_at": "google.protobuf.Timestamp",
      "data_cleared": "bool",
      "output_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "log_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "diagnostic_files": [
        {
          "desc": {
            "path": "string",
            "var": "string"
          },
          "sha256": "string",
          "size_bytes": "int64",
          "compression_type": "FileCompressionType"
        }
      ],
      "data_size_bytes": "int64",
      "started_at": "google.protobuf.Timestamp",
      "status_details": "string",
      "actual_cloud_instance_type": {
        "name": "string"
      },
      "parent_job_id": "string",
      "file_errors": [
        {
          // Includes only one of the fields `output_file_desc`, `log_file_name`
          "output_file_desc": {
            "path": "string",
            "var": "string"
          },
          "log_file_name": "string",
          // end of the list of possible fields
          "description": "string",
          "type": "ErrorType"
        }
      ],
      "output_datasets": [
        {
          "desc": {
            "name": "string",
            "description": "string",
            "labels": "map<string, string>",
            "size_gb": "int64",
            "var": "string"
          },
          "id": "string"
        }
      ]
    }
  ],
  "next_page_token": "string"
}

  • jobs[] (Job): Instances of the jobs.
  • next_page_token (string): This token allows you to get the next page of results for list requests. If the number of results is larger than ListProjectJobRequest.page_size, use the next_page_token as the value for the ListProjectJobRequest.page_token field in the next list request. Each subsequent list response will contain its own next_page_token to continue paging through the results.

Job

Instance of the job.

  • id (string): ID of the job.
  • name (string): Name of the job.
  • desc (string): Description of the job.
  • created_at (google.protobuf.Timestamp): Job creation timestamp.
  • finished_at (google.protobuf.Timestamp): Job finish timestamp.
  • status (enum JobStatus): Status of the job. One of: JOB_STATUS_UNSPECIFIED, CREATING, EXECUTING, UPLOADING_OUTPUT, SUCCESS, ERROR, CANCELLED, CANCELLING, PREPARING (see the helper sketch after this list).
  • config (string): Config of the job, copied from the configuration file.
  • created_by_id (string): ID of the user who created the job.
  • project_id (string): ID of the project.
  • job_parameters (JobParameters): Job parameters.
  • data_expires_at (google.protobuf.Timestamp): Job data expiration timestamp.
  • data_cleared (bool): Marks whether the job data has been cleared.
  • output_files[] (File): Output files of the job.
  • log_files[] (File): Job log files.
  • diagnostic_files[] (File): Job diagnostic files.
  • data_size_bytes (int64): Total size of the job data.
  • started_at (google.protobuf.Timestamp): Job start timestamp.
  • status_details (string): Status details.
  • actual_cloud_instance_type (CloudInstanceType): Actual VM instance type the job is running on.
  • parent_job_id (string): Reference to the parent job.
  • file_errors[] (FileUploadError): Failed uploads.
  • output_datasets[] (OutputDataset): Created datasets.
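
Since status is the only filterable field, a common chore is telling terminal jobs from in-flight ones. A small helper, assuming JobStatus is exposed by the same generated module as above:

TERMINAL_STATUSES = {"SUCCESS", "ERROR", "CANCELLED"}

def is_finished(job) -> bool:
    # JobStatus.Name converts the numeric enum value to its string name.
    return jobs_pb2.JobStatus.Name(job.status) in TERMINAL_STATUSES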

JobParameters

Job parameters.

  • input_files[] (File): List of input files.
  • output_files[] (FileDesc): List of output file descriptions.
  • s3_mount_ids[] (string): List of DataSphere S3 mount IDs.
  • dataset_ids[] (string): List of DataSphere dataset IDs.
  • cmd (string): Job run command.
  • env (Environment): Job environment description.
  • attach_project_disk (bool): Whether the project disk should be attached to the VM.
  • cloud_instance_types[] (CloudInstanceType): VM specifications.
  • extended_working_storage (ExtendedWorkingStorage): Extended working storage configuration.
  • arguments[] (Argument): List of literal arguments.
  • output_datasets[] (OutputDatasetDesc): List of descriptions of the datasets to create.
  • graceful_shutdown_parameters (GracefulShutdownParameters): Graceful shutdown settings.
  • spark_parameters (SparkParameters): Spark connector settings.

File

  • desc (FileDesc): File description.
  • sha256 (string): SHA256 hash of the file.
  • size_bytes (int64): File size in bytes.
  • compression_type (enum FileCompressionType): File compression info. One of: FILE_COMPRESSION_TYPE_UNSPECIFIED, NONE, ZIP.

FileDesc

  • path (string): Path of the file on the filesystem.
  • var (string): Variable to use in cmd substitution (see the sketch after this list).
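
To illustrate how path and var interact with cmd: a file declared with var "TRAIN" can be referenced from the run command by that variable. The ${...} reference style below follows typical DataSphere job configurations and is an assumption, not something this page specifies:

# A file descriptor whose variable can be substituted into cmd.
input_file = jobs_pb2.FileDesc(path="data/train.csv", var="TRAIN")
# In the job's cmd, the variable stands in for the file path:
cmd = "python train.py --data ${TRAIN}"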

Environment

  • vars (map<string, string>): Environment variables.
  • docker_image_resource_id (string): ID of a DataSphere Docker image. Includes only one of the fields docker_image_resource_id, docker_image_spec.
  • docker_image_spec (DockerImageSpec): Docker image specification. Includes only one of the fields docker_image_resource_id, docker_image_spec (see the sketch after this list).
  • python_env (PythonEnv): Python environment description.
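
As an illustration of the docker_image_resource_id / docker_image_spec oneof, an Environment for a job running in a private image could be built as follows (the registry URL and secret name are placeholders, and the generated module is assumed as above):

env = jobs_pb2.Environment(
    vars={"MODE": "train"},
    # Setting docker_image_spec leaves docker_image_resource_id unset,
    # satisfying the oneof constraint.
    docker_image_spec=jobs_pb2.DockerImageSpec(
        image_url="registry.example.com/project/image:latest",
        username="user",
        # Prefer the secret-based field over password_plain_text.
        password_ds_secret_name="registry-password",
    ),
)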

DockerImageSpec

  • image_url (string): Docker image URL.
  • username (string): Username for the container registry.
  • password_plain_text (string): Password for the container registry, in plain text. Includes only one of the fields password_plain_text, password_ds_secret_name.
  • password_ds_secret_name (string): ID of the DataSphere secret containing the password for the container registry. Includes only one of the fields password_plain_text, password_ds_secret_name.

PythonEnv

  • conda_yaml (string): Conda YAML environment description.
  • local_modules[] (File): List of local module descriptions.
  • python_version (string): Python version, reduced to major.minor.
  • requirements[] (string): List of pip requirements.
  • pip_options (PipOptions): pip install options.

PipOptions

  • index_url (string): pip --index-url option.
  • extra_index_urls[] (string): pip --extra-index-url options.
  • trusted_hosts[] (string): pip --trusted-host options.
  • no_deps (bool): pip --no-deps option.
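
These options map one-to-one onto pip install command-line flags; a sketch of the translation:

def pip_flags(opts) -> list[str]:
    """Translate a PipOptions message into pip install flags."""
    flags = []
    if opts.index_url:
        flags += ["--index-url", opts.index_url]
    for url in opts.extra_index_urls:
        flags += ["--extra-index-url", url]
    for host in opts.trusted_hosts:
        flags += ["--trusted-host", host]
    if opts.no_deps:
        flags.append("--no-deps")
    return flags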

CloudInstanceType

  • name (string): Name of DataSphere VM configuration.

ExtendedWorkingStorage

Extended working storage configuration.

  • type (enum StorageType): Storage type. One of: STORAGE_TYPE_UNSPECIFIED, SSD.
  • size_gb (int64): Storage size in gigabytes.

Argument

  • name (string): Name of the argument.
  • value (string): Value of the argument.

OutputDatasetDesc

  • name (string): Name to create the dataset with.
  • description (string): Description to show in the UI.
  • labels (map<string, string>): Dataset labels.
  • size_gb (int64): Size of the dataset to create, in gigabytes.
  • var (string): Variable name to substitute in cmd, as in FileDesc.

GracefulShutdownParameters

  • timeout (google.protobuf.Duration): Graceful shutdown timeout.
  • signal (int64): Number of the signal to send to the job process; defaults to 15 (SIGTERM).
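
For example, to give a job one minute to exit after receiving SIGTERM (google.protobuf.Duration is the standard well-known protobuf type; the message constructor uses the assumed generated module as above):

from google.protobuf.duration_pb2 import Duration

shutdown = jobs_pb2.GracefulShutdownParameters(
    timeout=Duration(seconds=60),  # 60 s timeout for the graceful shutdown
    signal=15,                     # SIGTERM; per this page, 15 is the default
)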

SparkParameters

  • connector_id (string): ID of the Spark connector.

FileUploadError

  • output_file_desc (FileDesc): Description of the output file that failed to upload. Includes only one of the fields output_file_desc, log_file_name.
  • log_file_name (string): Name of the log file that failed to upload. Includes only one of the fields output_file_desc, log_file_name.
  • description (string): Error description.
  • type (enum ErrorType): Error type. One of: ERROR_TYPE_UNSPECIFIED, UPLOAD_FAILED, NOT_FOUND.
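
A sketch of reporting failed uploads from a finished job, using the oneof to distinguish output files from log files (same assumed generated module as above):

for err in job.file_errors:
    if err.HasField("output_file_desc"):
        target = err.output_file_desc.path
    else:
        target = err.log_file_name
    # ErrorType.Name converts the numeric enum value to its string name.
    print(jobs_pb2.ErrorType.Name(err.type), target, err.description)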

OutputDataset

  • desc (OutputDatasetDesc): Dataset description.
  • id (string): ID of the created dataset.
