
In this article:

  • Prepare the DAG file and run the graph
  • Check the result

Connecting to a Yandex Managed Service for Apache Airflow™ database

Written by
Yandex Cloud
Updated on April 10, 2025

You can connect to a Managed Service for Apache Airflow™ database programmatically and update connection data using a directed acyclic graph (DAG).

Prepare the DAG file and run the graph

  1. Create a local file named update_connections.py and copy the following script to it:

    import json
    from airflow.decorators import dag, task
    from airflow.settings import Session
    from airflow.models import Connection
    
    
    @dag(schedule=None)
    def update_connections():
        @task
        def update_connections_task():
            # Open a session to the Apache Airflow™ metadata database
            with Session() as session:
                # Go over every connection stored in the cluster
                connections = session.query(Connection)
                for conn in connections:
                    # Extra is stored as JSON; extra_dejson parses it into a dict
                    extra = conn.extra_dejson
                    print(f"extra: {extra}")
                    # Increment the update counter kept in the Extra field
                    update_count = extra.get('update_count', 0)
                    extra['update_count'] = update_count + 1
                    conn.set_extra(json.dumps(extra))
                    session.add(conn)
                # Persist the updated connections
                session.commit()
    
        update_connections_task()
    
    
    update_connections()
    
  2. Upload the update_connections.py DAG file to the bucket you created earlier (for a command-line alternative, see the sketch after this list). This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named update_connections has appeared in the DAGs section.

    It may take a few minutes to upload a DAG file from the bucket.

  5. To run the graph, click the trigger (run) icon in the line with its name.
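
If you prefer to upload the DAG file from the command line rather than the management console, the bucket can also be written to through the S3-compatible API of Yandex Object Storage. The sketch below uses boto3 with the storage.yandexcloud.net endpoint; the bucket name and the dags/ prefix are placeholders, so substitute the bucket and prefix your cluster is actually configured to read DAGs from.

    # A minimal sketch, assuming boto3 is installed and S3 credentials
    # (for example, a service account static access key) are available
    # in the environment. Bucket name and dags/ prefix are placeholders.
    import boto3
    
    s3 = boto3.client(
        "s3",
        endpoint_url="https://storage.yandexcloud.net",  # Object Storage endpoint
    )
    
    s3.upload_file(
        Filename="update_connections.py",   # local DAG file
        Bucket="<bucket_name>",             # your DAG bucket
        Key="dags/update_connections.py",   # prefix is an assumption
    )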

Check the result

To check the result in the Apache Airflow™ web interface:

  1. In the DAGs section, open the update_connections graph.
  2. Go to the Graph section.
  3. Select the update_connections_task job.
  4. Go to Logs.
  5. Make sure the logs contain a list of updated connections. This means the query was successful.
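
You can also check the result outside the web interface. As a sketch, the Apache Airflow™ stable REST API returns each connection together with its Extra field, so after a run you can confirm that the update_count value has grown. The webserver URL and credentials below are placeholders for your cluster's Apache Airflow™ web interface settings.

    # A minimal sketch, assuming the requests library and network access
    # to the Apache Airflow™ webserver. URL and credentials are placeholders.
    import requests
    
    AIRFLOW_URL = "https://<webserver_host>"  # your cluster's webserver URL
    AUTH = ("<user_name>", "<password>")      # web interface credentials
    
    resp = requests.get(f"{AIRFLOW_URL}/api/v1/connections", auth=AUTH)
    resp.raise_for_status()
    
    for item in resp.json()["connections"]:
        conn_id = item["connection_id"]
        # The single-connection endpoint includes the Extra field
        detail = requests.get(f"{AIRFLOW_URL}/api/v1/connections/{conn_id}", auth=AUTH)
        detail.raise_for_status()
        print(conn_id, detail.json().get("extra"))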
