Uploading a file to Yandex Object Storage
Updated on April 10, 2025
Use a directed acyclic graph (DAG) to upload files to Yandex Object Storage.
Create a bucket for uploading files
- Create an Object Storage bucket named `username-airflow` to upload your files to.
- Grant `airflow-sa` the `READ and WRITE` permissions for the bucket you created.
Prepare the DAG file and run the graph
- Create a local file named `upload_file_to_s3.py` and copy the following script to it:

  ```python
  from airflow.decorators import dag, task
  import boto3
  import botocore
  import botocore.config
  import yandexcloud


  def _upload_file_to_s3(bucket_name: str, object_path: str, content: str):
      sdk = yandexcloud.SDK()

      # Attach an IAM token to every outgoing S3 request instead of AWS signing.
      def provide_cloud_auth_header(request, **kwargs):
          request.headers.add_header(
              "X-YaCloud-SubjectToken",
              sdk._channels._token_requester.get_token(),
          )

      session = boto3.Session()
      session.events.register("request-created.s3.*", provide_cloud_auth_header)
      client = session.resource(
          "s3",
          endpoint_url="https://storage.yandexcloud.net",
          config=botocore.config.Config(
              signature_version=botocore.UNSIGNED,
              retries=dict(
                  max_attempts=5,
                  mode="standard",
              ),
          ),
      )
      client.Bucket(name=bucket_name).put_object(Key=object_path, Body=content)


  @dag(schedule=None)
  def upload_file_to_s3():
      @task
      def upload():
          _upload_file_to_s3(
              bucket_name="username-airflow",
              object_path="data/airflow.txt",
              content="Hello from Managed Airflow!",
          )

      upload()


  upload_file_to_s3()
  ```
- Upload the `upload_file_to_s3.py` DAG file to the first bucket you created. This will automatically create a graph with the same name in the Apache Airflow™ web interface.
- Make sure a new graph named `upload_file_to_s3` has appeared in the DAGs section. It may take a few minutes to upload a DAG file from the bucket.
- To run the graph, click the run (trigger) button in the line with its name.
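The DAG script above bypasses AWS request signing (`botocore.UNSIGNED`) and instead registers an event hook that adds an IAM token header to every outgoing S3 request. To make the mechanism concrete without any network access, here is a minimal sketch with stand-in request objects; the token string is a fake placeholder, not a real IAM token.

```python
# Stand-ins for botocore's request object, to illustrate what the
# provide_cloud_auth_header hook in the DAG does.
class FakeHeaders:
    def __init__(self):
        self.items = {}

    def add_header(self, name, value):
        self.items[name] = value


class FakeRequest:
    def __init__(self):
        self.headers = FakeHeaders()


def provide_cloud_auth_header(request, token, **kwargs):
    # The real hook fetches the token from yandexcloud.SDK();
    # here it is passed in explicitly.
    request.headers.add_header("X-YaCloud-SubjectToken", token)


request = FakeRequest()
provide_cloud_auth_header(request, token="t1.fake-iam-token")
print(request.headers.items["X-YaCloud-SubjectToken"])
```

In the real DAG, boto3 fires the `request-created.s3.*` event for each S3 call, so the hook runs automatically and Object Storage authenticates the request by the `X-YaCloud-SubjectToken` header.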
Check the result
Using the Object Storage interface, check that the file is in the `username-airflow` bucket.