Yandex Managed Service for Apache Airflow™

Getting DAG execution logs

Written by Yandex Cloud
Updated on December 23, 2025
  • Prepare the DAG file and run the graph
  • Check the result

Using a directed acyclic graph (DAG), you can get DAG execution logs and, if required, export them to separate storage.

Prepare the DAG file and run the graph

  1. Create a local file named export_dag_logs.py and paste the following script into it:

    import os
    import json
    
    from airflow.decorators import dag, task
    import boto3
    
    
    def system_logs_bucket_name() -> str:
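        # Parse /opt/airflow/airflow.cfg for the remote_base_log_folder option
        # and return the name of the bucket the service writes task logs to.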
        with open('/opt/airflow/airflow.cfg') as f:
            for line in f:
                line = line.strip()
                if not line.startswith('remote_base_log_folder'):
                    continue
    
                s3_path = line.split('=')[1].strip()
                return s3_path.split('//')[1]
    
    
    @dag(schedule=None)
    def export_dag_logs():
        @task
        def list_logs_bucket():
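            # The connection to the system log bucket is provided by the service
            # via an environment variable (see the note below).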
            str_conn = os.getenv('AIRFLOW_CONN_S3_DAG_LOGS')
            if not str_conn:
                raise Exception('env var AIRFLOW_CONN_S3_DAG_LOGS not found or empty')
    
            conn = json.loads(str_conn)
            bucket = system_logs_bucket_name()
    
            session = boto3.session.Session()
            s3 = session.client(
                service_name='s3',
                endpoint_url=conn['extra']['endpoint_url'],
                aws_access_key_id=conn['login'],
                aws_secret_access_key=conn['password'],
            )
    
            # Here you can do anything with the logs, e.g. copy them to a bucket of your own (see the sketch after these steps)
            resp = s3.list_objects_v2(Bucket=bucket)
            object_keys = [c['Key'] for c in resp['Contents']]
    
            print('Log files:\n')
            print('\n'.join(object_keys))
    
        list_logs_bucket()
    
    
    export_dag_logs()
    

    Note

    The AIRFLOW_CONN_S3_DAG_LOGS variable is already set on the worker and does not require any additional configuration.

  2. Upload the export_dag_logs.py DAG file to the bucket you created earlier (one way to do this from code is sketched after these steps). This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named export_dag_logs has appeared in the DAGs section.

    It may take a few minutes for the service to load the DAG file from the bucket.

  5. To run the graph, click the trigger (run) button in the line with its name.
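
The comment inside list_logs_bucket() notes that you can do more with the logs than just print their keys. Below is a minimal sketch of one such extension, assuming the same s3 client and bucket variables as in the task above; DEST_BUCKET is a hypothetical bucket of your own, and the credentials from the AIRFLOW_CONN_S3_DAG_LOGS connection must be allowed to write to it, which may require additional setup.

    # Hypothetical extension: copy every log object into your own bucket instead of
    # only printing the keys. Uses the same `s3` client and `bucket` variable as above.
    DEST_BUCKET = '<your-bucket-for-log-copies>'  # placeholder name

    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get('Contents', []):
            s3.copy_object(
                Bucket=DEST_BUCKET,
                Key=obj['Key'],
                CopySource={'Bucket': bucket, 'Key': obj['Key']},
            )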

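The upload from step 2 can also be done from code. Here is a minimal sketch using boto3 against Yandex Object Storage; the bucket name, the dags/ prefix, and the static access key are placeholders, so substitute the bucket and folder your cluster is actually configured to read DAGs from.

    import boto3

    # Hypothetical helper: upload the DAG file to the cluster's bucket over the S3 API.
    # The bucket name, the dags/ prefix, and the static key values are placeholders.
    session = boto3.session.Session()
    s3 = session.client(
        service_name='s3',
        endpoint_url='https://storage.yandexcloud.net',
        aws_access_key_id='<static key ID>',
        aws_secret_access_key='<static key secret>',
    )
    s3.upload_file('export_dag_logs.py', '<your-dag-bucket>', 'dags/export_dag_logs.py')
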
Check the result

To check the result in the Apache Airflow™ web interface:

Apache Airflow™ version below 3.0

  1. In the DAGs section, click the export_dag_logs graph.
  2. Go to the Graph section.
  3. Select list_logs_bucket.
  4. Go to Logs.
  5. Make sure the logs contain the Log files: {content} line, where content is the list of DAG execution logs. This means the query was successful.

Apache Airflow™ version 3.0 or higher

  1. In the DAGs section, click the export_dag_logs graph.
  2. Go to Tasks.
  3. Select list_logs_bucket.
  4. Go to Task Instances.
  5. Select the task instance; the Logs section will open.
  6. Make sure the logs contain the Log files: {content} line, where content is the list of DAG execution logs. This means the query was successful.
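
If you prefer to check the result without the web interface, here is a minimal sketch that reads the same task log through the Airflow stable REST API. It assumes an Apache Airflow™ version below 3.0 (the v1 API), a reachable web server URL, and placeholder basic-auth credentials; the managed service may require a different authentication scheme.

    import requests

    # Hypothetical check over the Airflow stable REST API (v1, Apache Airflow™ below 3.0).
    # AIRFLOW_URL and the basic-auth credentials are placeholders.
    AIRFLOW_URL = 'https://<webserver-host>'
    AUTH = ('<user>', '<password>')

    # Take the most recent run of the export_dag_logs DAG.
    runs = requests.get(
        f'{AIRFLOW_URL}/api/v1/dags/export_dag_logs/dagRuns',
        params={'order_by': '-start_date', 'limit': 1},
        auth=AUTH,
    ).json()
    run_id = runs['dag_runs'][0]['dag_run_id']

    # Fetch the log of the first attempt of the list_logs_bucket task as plain text.
    log = requests.get(
        f'{AIRFLOW_URL}/api/v1/dags/export_dag_logs/dagRuns/{run_id}'
        '/taskInstances/list_logs_bucket/logs/1',
        headers={'Accept': 'text/plain'},
        auth=AUTH,
    ).text
    print('Log files:' in log)  # True if the task listed the log objects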
