
Storing Apache Airflow™ connections in Yandex Lockbox

Written by
Yandex Cloud
Updated on October 23, 2025
  • Getting started
  • Create a Yandex Lockbox secret
  • Prepare the DAG file and run the graph
  • Check the result

When working with Yandex Managed Service for Apache Airflow™, you can use Yandex Lockbox to store artifacts used in DAG files: connections, variables, and configuration data. Yandex Lockbox integrates with Managed Service for Apache Airflow™ through the Yandex Lockbox Secret Backend provider, so access to the secret storage is configured automatically.

Using a directed acyclic graph (DAG), you can load a connection from Yandex Lockbox and run a SELECT 1; SQL query against a database in a Yandex Managed Service for PostgreSQL cluster. The data for connecting to the database is stored in Yandex Lockbox and automatically inserted into the graph.

Tip

Clusters running Apache Airflow™ versions below 3.0 use apache-airflow-providers-postgres 5.13.1 as the default provider. If you are using a newer provider version, use SQLExecuteQueryOperator instead of PostgresOperator. For more information, see the official documentation.

Getting started

  1. Create a Managed Service for PostgreSQL cluster with the following parameters:

    • DB name: db1
    • Username: user1
    • Password: user1-password
  2. Create a Yandex Object Storage bucket to store the DAG file in.

  3. Configure the Managed Service for Apache Airflow™ cluster:

    1. Enable Use Lockbox Secret Backend to use Yandex Lockbox secrets to store Apache Airflow™ configuration data, variables, and connection parameters.

    2. Under Dependencies, add the apache-airflow-providers-postgres pip package.

      Warning

      You need to install a pip package for clusters with Apache Airflow™ version 3.0 or higher. This package comes installed by default on clusters with Apache Airflow™ versions below 3.0.

    3. Under DAG file storage, select the Object Storage bucket you created earlier. Your DAG file will be fetched from it.

  4. Issue the lockbox.payloadViewer role to your service account.

    There is no need to assign the lockbox.payloadViewer role for the whole folder. It is enough to assign it for a specific Yandex Lockbox secret once you create it.


Create a Yandex Lockbox secret

For the Apache Airflow™ cluster to work correctly, your Yandex Lockbox secret's name must have this format: airflow/<artifact_type>/<artifact_ID>, where:

  • <artifact_type>: Determines what data the secret will store. The allowed values are:
    • connections: Connections.
    • variables: Variables.
    • config: Configuration data.
  • <artifact_ID>: ID to use to access the artifact in Apache Airflow™.
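The naming rule above can be sketched as a tiny validator. This is a hypothetical helper for illustration only, not part of any Yandex SDK:

```python
# Hypothetical validator for the airflow/<artifact_type>/<artifact_ID> scheme.
ALLOWED_TYPES = {"connections", "variables", "config"}

def parse_secret_name(name: str) -> tuple[str, str]:
    """Split a Lockbox secret name into (artifact_type, artifact_id)."""
    prefix, _, rest = name.partition("/")
    artifact_type, _, artifact_id = rest.partition("/")
    if prefix != "airflow" or artifact_type not in ALLOWED_TYPES or not artifact_id:
        raise ValueError(f"not a valid Airflow secret name: {name!r}")
    return artifact_type, artifact_id
```

For example, `parse_secret_name("airflow/connections/pg1")` yields the artifact type `connections` and the ID `pg1`, which is the connection ID the DAG refers to later in this guide.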

Create a Yandex Lockbox secret with the following parameters:

  • Name: airflow/connections/pg1.

  • Secret type: Custom.

  • Key: conn.

  • Value: Select Text and specify the following contents:

    {
      "conn_type": "postgres",
      "host": "<PostgreSQL_cluster_host_FQDN>",
      "port": 6432,
      "schema": "db1",
      "login": "user1",
      "password": "user1-password"
    }
    

The secret will store the data to connect to the database in the Managed Service for PostgreSQL cluster.

For more information on how to get the FQDN of a PostgreSQL cluster host, see PostgreSQL host FQDN.
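The connection payload is plain JSON, so you can sanity-check it before pasting it into the secret by round-tripping it through Python's json module. The host value below is the same placeholder as in the guide, to be replaced with your actual cluster host FQDN:

```python
import json

# The connection exactly as stored under the secret's "conn" key.
# "<PostgreSQL_cluster_host_FQDN>" is a placeholder, as in the guide above.
conn = {
    "conn_type": "postgres",
    "host": "<PostgreSQL_cluster_host_FQDN>",
    "port": 6432,
    "schema": "db1",
    "login": "user1",
    "password": "user1-password",
}

payload = json.dumps(conn, indent=2)
# Round-trip to confirm the text you paste into Lockbox is valid JSON.
assert json.loads(payload) == conn
print(payload)
```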

Prepare the DAG file and run the graph

  1. Create a local file named test_lockbox_connection.py and paste the following script into it:

    Apache Airflow™ version below 3.0:

    from airflow import DAG
    from airflow.providers.postgres.operators.postgres import PostgresOperator
    from datetime import datetime


    with DAG(
      dag_id='test_lockbox_connection',
      start_date=datetime(2024, 4, 19),
      schedule="@once",
      catchup=False,
    ) as dag:
      check_conn = PostgresOperator(
          task_id="check_conn",
          postgres_conn_id='pg1',
          sql="SELECT 1;",
      )

    Apache Airflow™ version 3.0 or higher:

    from airflow import DAG
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
    from datetime import datetime


    with DAG(
      dag_id='test_lockbox_connection',
      start_date=datetime(2024, 4, 19),
      schedule="@once",
      catchup=False,
    ) as dag:
      check_conn = SQLExecuteQueryOperator(
          task_id="check_conn",
          conn_id='pg1',
          sql="SELECT 1;",
      )

  2. Upload the test_lockbox_connection.py DAG file to the bucket you created earlier. This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named test_lockbox_connection has appeared in the DAGs section.

    It may take a few minutes to load a DAG file from the bucket.

  5. To run the graph, click the run (trigger) button in the line with its name.
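Before uploading a DAG file in step 2, you can catch Python syntax errors locally without installing Apache Airflow™ by compiling the source. This is a minimal sketch; check_dag_syntax is a hypothetical helper, not part of Airflow:

```python
# Hypothetical pre-upload check: compiling the source catches syntax errors
# without importing it (importing would require the apache-airflow package).
def check_dag_syntax(source: str, filename: str = "<dag>") -> bool:
    """Return True if the DAG source compiles, False on a SyntaxError."""
    try:
        compile(source, filename, "exec")
        return True
    except SyntaxError:
        return False

# Example usage: read and check the local DAG file before uploading it.
# with open("test_lockbox_connection.py") as f:
#     assert check_dag_syntax(f.read())
```

This does not verify that the connection ID or imports are correct, only that the file will parse once the cluster fetches it from the bucket.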

Check the result

To check the result in the Apache Airflow™ web interface:

Apache Airflow™ version below 3.0:

  1. In the DAGs section, click test_lockbox_connection.
  2. Go to the Graph section.
  3. Select check_conn.
  4. Go to Logs.
  5. Make sure the logs contain the Rows affected: 1 line. This means the query was successful.

Apache Airflow™ version 3.0 or higher:

  1. In the DAGs section, click test_lockbox_connection.
  2. Go to Tasks.
  3. Select check_conn.
  4. Go to Task Instances.
  5. Select the task instance to open its Logs section.
  6. Make sure the logs contain the Rows affected: 1 line. This means the query was successful.

© 2025 Direct Cursus Technology L.L.C.