In this article:

  • Importing DAG files from a bucket
  • Importing DAG files from a Git repository

Uploading DAG files to a Managed Service for Apache Airflow™ cluster

Written by
Yandex Cloud
Updated on January 14, 2026

Automation, data processing, and scheduled task execution in Apache Airflow™ are implemented with DAG files: Python 3 scripts that Apache Airflow™ runs. For a DAG file example, see the Apache Airflow™ tutorial.
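
For illustration, here is a minimal sketch of a DAG file in the classic Apache Airflow™ 2.x style; the DAG ID, schedule, and task name are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def say_hello():
        # Task body: anything a regular Python function can do
        print("Hello from Managed Service for Apache Airflow™")


    with DAG(
        dag_id="hello_dag",               # name shown in the DAGs section of the web UI
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",       # run once a day
        catchup=False,                    # do not backfill runs for past dates
    ) as dag:
        PythonOperator(
            task_id="say_hello",
            python_callable=say_hello,
        )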

You can import DAG files from:

  • Yandex Object Storage bucket
  • External Git repository

You select the DAG file source when creating or updating the cluster. DAGs from the selected source are automatically delivered to your Managed Service for Apache Airflow™ cluster and appear in the Apache Airflow™ web interface.

Importing DAG files from a bucket

  1. Create a folder, e.g., dags, in the bucket and upload your DAG file to it (see the upload sketch after these steps). The system will automatically import the DAG file to the cluster.

    You can upload the DAG file to the bucket root as well, but it loads faster from a folder.

    If you need to upload additional scripts or modules used by the DAG to this folder, specify the full path to them within the bucket. For example, if you uploaded all the files to the dags folder, use the following in the from ... import statement of the DAG file:

    from dags.<file_name> import <object>
    
  2. Open the Apache Airflow™ web interface.

  3. Make sure that the new DAG has appeared in the DAGs section. It may take a few minutes to load a DAG file from the bucket.
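
As a sketch of the upload in step 1, the file can be put into the dags folder over the S3-compatible API, e.g., with boto3. The bucket name, access key, and file names below are placeholders, and the standard Object Storage endpoint is assumed:

    import boto3

    # Static access key of a service account with write access to the bucket (placeholders)
    session = boto3.session.Session(
        aws_access_key_id="<static_key_ID>",
        aws_secret_access_key="<secret_key>",
    )

    # Object Storage is S3-compatible; the standard endpoint is assumed here
    s3 = session.client(
        service_name="s3",
        endpoint_url="https://storage.yandexcloud.net",
    )

    # The cluster picks the file up from the dags/ folder automatically
    s3.upload_file("hello_dag.py", "<bucket_name>", "dags/hello_dag.py")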

Importing DAG files from a Git repository

  1. Upload your DAG file to the Git repository folder specified along with the repository address in the cluster settings. The system will automatically import the DAG file to the cluster.

    If you upload additional scripts or modules used by the DAG to this folder, specify the full path to them. For example, if you uploaded all the files to the dags folder, use the following in the from ... import statement of the DAG file (see the sketch after these steps):

    from dags.<file_name> import <object>
    
  2. Open the Apache Airflow™ web interface.

  3. Make sure that the new DAG has appeared in the DAGs section. It may take a few minutes to import your DAG file from the Git repository.
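
As an illustration of the import pattern from step 1 (it works the same for either source), here is a sketch with hypothetical names: a helper module transform.py and a DAG file transform_dag.py, both in the dags folder:

    # --- dags/transform.py: helper module used by the DAG ---
    def transform():
        print("transforming data")


    # --- dags/transform_dag.py: DAG file importing the helper ---
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Full path starting from the dags folder, as described in step 1
    from dags.transform import transform

    with DAG(
        dag_id="transform_dag",
        start_date=datetime(2025, 1, 1),
        schedule_interval=None,    # trigger manually
    ) as dag:
        PythonOperator(task_id="transform", python_callable=transform)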
