Yandex Managed Service for Apache Airflow™


Reading a file from Yandex Object Storage

Written by
Yandex Cloud
Updated on October 23, 2025
  • Prepare the DAG file and run the graph
  • Check the result

Use a directed acyclic graph (DAG) to read files from Yandex Object Storage on behalf of the service account attached to your Apache Airflow™ cluster.

Prepare the DAG file and run the graph

  1. Create a local file named read_file_from_dags_bucket.py and paste the following script into it:

    from airflow.settings import DAGS_FOLDER
    from airflow.decorators import dag, task
    
    
    @dag(schedule=None)
    def read_file_from_dags_bucket():
        @task
        def read_file():
            # DAGS_FOLDER points to the directory synced from the DAGs bucket,
            # so the file uploaded to the bucket is available under data/.
            with open(f'{DAGS_FOLDER}/data/airflow.txt') as file:
                content = file.read()
                print(f"file content: {content}")
    
        read_file()
    
    
    read_file_from_dags_bucket()
    
    
  2. Upload the read_file_from_dags_bucket.py DAG file to the bucket you created earlier. This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named read_file_from_dags_bucket has appeared in the DAGs section.

    It may take a few minutes to load a DAG file from the bucket.

  5. To run the graph, click the run (Trigger DAG) button in the line with its name.
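Step 2 above (uploading the DAG file to the bucket) can also be scripted. Below is a minimal sketch using boto3 against the Object Storage S3-compatible endpoint; the bucket name and the dags/ key prefix are hypothetical placeholders, and credentials are assumed to be already configured in your environment:

```python
# Sketch of uploading the DAG file to the DAGs bucket with boto3.
# Assumptions: boto3 is installed, S3-compatible static access keys are
# configured, and BUCKET / the "dags/" prefix are hypothetical placeholders.
ENDPOINT = "https://storage.yandexcloud.net"  # Object Storage S3-compatible endpoint
BUCKET = "airflow-dags-bucket"                # hypothetical DAGs bucket name


def upload_dag(client, local_path: str, bucket: str, key: str) -> None:
    """Upload a DAG file; the Apache Airflow™ cluster then picks it up."""
    client.upload_file(local_path, bucket, key)


# Actual upload (requires boto3 and configured credentials):
#   import boto3
#   s3 = boto3.client("s3", endpoint_url=ENDPOINT)
#   upload_dag(s3, "read_file_from_dags_bucket.py", BUCKET,
#              "dags/read_file_from_dags_bucket.py")
```

After the upload, the graph should appear in the web interface within a few minutes, as described above.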

Check the result

To check the result in the Apache Airflow™ web interface:

Apache Airflow™ version below 3.0:

  1. In the DAGs section, click the read_file_from_dags_bucket graph.
  2. Go to the Graph section.
  3. Select read_file.
  4. Go to Logs.
  5. Make sure the logs contain the file content: {content} line, where {content} is the contents of the file. This means the read was successful.

Apache Airflow™ version 3.0 or higher:

  1. In the DAGs section, click the read_file_from_dags_bucket graph.
  2. Go to Tasks.
  3. Select read_file.
  4. Go to Task Instances.
  5. Select the task instance; the Logs section will open.
  6. Make sure the logs contain the file content: {content} line, where {content} is the contents of the file. This means the read was successful.
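If the graph never appears in the DAGs section, or the logs show an import error, the DAG file itself may be broken. The following stdlib-only sketch performs a quick local syntax check before re-uploading; note that it validates Python syntax only, not Airflow imports or DAG structure:

```python
import py_compile


def dag_file_compiles(path: str) -> bool:
    """Return True if the file is syntactically valid Python, else False."""
    try:
        py_compile.compile(path, doraise=True)
        return True
    except py_compile.PyCompileError:
        return False


# Example usage:
#   dag_file_compiles("read_file_from_dags_bucket.py")
```

Running this check locally is faster than waiting for the cluster to sync the bucket after every edit.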

© 2025 Direct Cursus Technology L.L.C.