Yandex project
© 2025 Yandex.Cloud LLC

In this article:

  • Basic examples of working with jobs
  • Advanced examples of working with jobs

Working with jobs

Written by
Yandex Cloud
Updated at December 11, 2023

Basic examples of working with jobs

  • Working with Hive jobs
  • Working with MapReduce jobs
  • Working with PySpark jobs
  • Working with Spark jobs
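The tutorials above submit Hive, MapReduce, PySpark, and Spark jobs to a Yandex Data Processing cluster. As a quick illustration of the map/reduce model those jobs follow, here is a minimal plain-Python sketch of the canonical word-count example; the function names and sample input are illustrative only, not part of the tutorials, and a real job would run these phases distributed across cluster nodes rather than in one process.

```python
from collections import Counter

def mapper(line):
    # Map phase: emit a (word, 1) pair for each word in the line,
    # as a Hadoop streaming mapper or Spark map step would.
    for word in line.split():
        yield word.lower(), 1

def reducer(pairs):
    # Reduce phase: sum the counts for each word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["Spark and Hive", "Hive and MapReduce"]
pairs = [pair for line in lines for pair in mapper(line)]
print(reducer(pairs))  # {'spark': 1, 'and': 2, 'hive': 2, 'mapreduce': 1}
```

In a real cluster job, the framework also shuffles the mapper output so that all pairs for the same key reach the same reducer; here the single-process reducer sees every pair directly.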

Advanced examples of working with jobs

  • Running Apache Hive jobs
  • Launching and managing applications for Spark and PySpark
  • Running jobs from remote hosts that are not part of the Yandex Data Processing cluster
