© 2025 Direct Cursus Technology L.L.C.
Yandex Managed Service for Apache Airflow™


Uploading a file to Yandex Object Storage

Written by
Yandex Cloud
Updated at April 10, 2025
  • Create a bucket for uploading files
  • Prepare the DAG file and run the graph
  • Check the result

Use a directed acyclic graph (DAG) to upload files to Yandex Object Storage.

Create a bucket for uploading files

  1. Create an Object Storage bucket named username-airflow to upload your files to.
  2. Grant the airflow-sa service account the READ and WRITE permissions for the bucket you created.
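
The bucket name must follow the usual S3 naming rules: 3–63 characters; lowercase letters, digits, hyphens, and dots; starting and ending with a letter or digit. A quick sanity check for a chosen name might look like this (a sketch; `is_valid_bucket_name` is a hypothetical helper, not part of any Yandex SDK):

```python
import re

# S3-style bucket naming rules: 3-63 characters; lowercase letters, digits,
# hyphens, and dots; must start and end with a letter or digit.
_BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Return True if name looks like a valid Object Storage bucket name."""
    return bool(_BUCKET_NAME_RE.fullmatch(name))

print(is_valid_bucket_name("username-airflow"))  # True
print(is_valid_bucket_name("Username_Airflow"))  # False: uppercase and underscore
```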

Prepare the DAG file and run the graph

  1. Create a local file named upload_file_to_s3.py and copy the following script to it:

    from airflow.decorators import dag, task
    import boto3
    import botocore
    import botocore.config
    import yandexcloud
    
    
    def _upload_file_to_s3(bucket_name: str, object_path: str, content: str):
        sdk = yandexcloud.SDK()
    
        # Attach the cluster's IAM token to every outgoing S3 request
        # instead of signing requests with static access keys.
        def provide_cloud_auth_header(request, **kwargs):
            request.headers.add_header("X-YaCloud-SubjectToken", sdk._channels._token_requester.get_token())
    
        session = boto3.Session()
        session.events.register('request-created.s3.*', provide_cloud_auth_header)
        client = session.resource(
            "s3",
            endpoint_url="https://storage.yandexcloud.net",
            # Skip AWS request signing: authentication is handled
            # by the IAM token header added above.
            config=botocore.config.Config(
                signature_version=botocore.UNSIGNED,
                retries=dict(
                    max_attempts=5,
                    mode="standard",
                ),
            ),
        )
        client.Bucket(name=bucket_name).put_object(Key=object_path, Body=content)
    
    
    # schedule=None: the DAG runs only when triggered manually.
    @dag(schedule=None)
    def upload_file_to_s3():
        @task
        def upload():
            _upload_file_to_s3(
                bucket_name="username-airflow",
                object_path="data/airflow.txt",
                content="Hello from Managed Airflow!"
            )
    
        upload()
    
    
    upload_file_to_s3()
    
    
  2. Upload the upload_file_to_s3.py DAG file to the bucket your cluster uses for DAG files (not the username-airflow bucket you just created). This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named upload_file_to_s3 has appeared in the DAGs section.

    It may take a few minutes to upload a DAG file from the bucket.

  5. To run the graph, click the run (▶) icon in the line with its name.
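
The provide_cloud_auth_header hook in the script above relies on botocore's event system: boto3 emits a request-created event for every S3 call, and each registered handler can mutate the outgoing request before it is sent. Stripped of botocore, the pattern is roughly this (a simplified stand-in, not the real botocore API):

```python
# Minimal stand-in for botocore's event system: handlers registered for an
# event name are invoked with the outgoing request before it is sent.
class Request:
    def __init__(self):
        self.headers = {}

class Events:
    def __init__(self):
        self._handlers = {}

    def register(self, event_name, handler):
        self._handlers.setdefault(event_name, []).append(handler)

    def emit(self, event_name, request):
        for handler in self._handlers.get(event_name, []):
            handler(request)

def provide_cloud_auth_header(request, **kwargs):
    # In the real DAG the token comes from yandexcloud.SDK(); here it is a stub.
    request.headers["X-YaCloud-SubjectToken"] = "<iam-token>"

events = Events()
events.register("request-created.s3.*", provide_cloud_auth_header)

req = Request()
events.emit("request-created.s3.*", req)
print(req.headers)  # {'X-YaCloud-SubjectToken': '<iam-token>'}
```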

Check the result

Using the Object Storage interface, check that the data/airflow.txt file appears in the username-airflow bucket.
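
Besides the console, you can address the uploaded object directly: Object Storage serves path-style URLs of the form https://storage.yandexcloud.net/&lt;bucket&gt;/&lt;key&gt;. A small helper to build such a URL (the object is only downloadable this way if it is public or the request is authenticated; `object_url` is a hypothetical helper for illustration):

```python
from urllib.parse import quote

ENDPOINT = "https://storage.yandexcloud.net"

def object_url(bucket: str, key: str) -> str:
    """Build the path-style URL of an object in Yandex Object Storage."""
    return f"{ENDPOINT}/{bucket}/{quote(key)}"

print(object_url("username-airflow", "data/airflow.txt"))
# https://storage.yandexcloud.net/username-airflow/data/airflow.txt
```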
