Yandex project
© 2025 Yandex.Cloud LLC
Yandex Managed Service for Apache Airflow™

In this article:

  • Getting started
  • Create a Yandex Lockbox secret
  • Prepare the DAG file and run the graph
  • Check the result

Storing Apache Airflow™ connections in Yandex Lockbox

Written by
Yandex Cloud
Updated on April 10, 2025

When working with Yandex Managed Service for Apache Airflow™, you can use Yandex Lockbox to store artifacts used in DAG files: connections, variables, and configuration data. Yandex Lockbox integrates with Managed Service for Apache Airflow™ via the Yandex Lockbox Secret Backend provider, so access to the secret storage is configured automatically.

In this tutorial, a directed acyclic graph (DAG) loads a connection from Yandex Lockbox and runs a SELECT 1; SQL query against a database in a Yandex Managed Service for PostgreSQL cluster. The connection details are stored in Yandex Lockbox and substituted into the graph automatically.

Getting started

  1. Create a Managed Service for PostgreSQL cluster with the following parameters:

    • DB name: db1
    • Username: user1
    • Password: user1-password
  2. Assign the lockbox.payloadViewer role to your service account.

    You do not have to assign the lockbox.payloadViewer role for the whole folder: it is enough to assign it for the specific Yandex Lockbox secret after you create it.

Create a Yandex Lockbox secret

For the Apache Airflow™ cluster to work correctly, your Yandex Lockbox secret's name must have this format: airflow/<artifact_type>/<artifact_ID>, where:

  • <artifact_type>: Type of the artifact to store in the secret. The following types are available:
    • connections: Connections
    • variables: Variables
    • config: Configuration data
  • <artifact_ID>: ID to use to access the artifact in Apache Airflow™.
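The naming convention above is easy to get wrong, so it can help to validate secret names programmatically. Below is a minimal sketch; the `parse_artifact_name` helper and its error message are illustrative, not part of any Yandex Cloud or Apache Airflow™ API:

```python
# Illustrative helper: validate a Yandex Lockbox secret name against the
# airflow/<artifact_type>/<artifact_ID> convention described above.
ARTIFACT_TYPES = {"connections", "variables", "config"}

def parse_artifact_name(secret_name: str) -> tuple[str, str]:
    """Return (artifact_type, artifact_id) or raise ValueError."""
    prefix, _, rest = secret_name.partition("/")
    artifact_type, _, artifact_id = rest.partition("/")
    if prefix != "airflow" or artifact_type not in ARTIFACT_TYPES or not artifact_id:
        raise ValueError(f"not a valid Airflow artifact secret name: {secret_name!r}")
    return artifact_type, artifact_id

print(parse_artifact_name("airflow/connections/pg"))  # ('connections', 'pg')
```

For the secret created in this tutorial, the artifact type is connections and the artifact ID is pg, which is the connection ID the DAG will use later.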

Create a Yandex Lockbox secret with the following parameters:

  • Name: airflow/connections/pg

  • Secret type: Custom

  • Key: airflow/connections/pg

  • Value: Select Text and specify the following contents:

    {
      "conn_type": "postgres",
      "host": "<PostgreSQL_cluster_host_FQDN>",
      "port": 6432,
      "schema": "db1",
      "login": "user1",
      "password": "user1-password"
    }
    

The secret will store the data to connect to the database in the Managed Service for PostgreSQL cluster.

For more information on how to learn the FQDN of a PostgreSQL cluster host, see PostgreSQL host FQDN.
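Since the secret value is plain JSON, you can also generate it instead of typing it by hand. The sketch below uses only the Python standard library; the host value is the same placeholder as above and must be replaced with your cluster host's FQDN:

```python
import json

# Build the connection payload for the airflow/connections/pg secret.
# Field values match the tutorial; the host is a placeholder to replace.
payload = {
    "conn_type": "postgres",
    "host": "<PostgreSQL_cluster_host_FQDN>",  # replace with your host FQDN
    "port": 6432,
    "schema": "db1",
    "login": "user1",
    "password": "user1-password",
}
print(json.dumps(payload, indent=2))
```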

Prepare the DAG file and run the graph

  1. Create a local file named test_lockbox_connection.py and copy the following script to it:

    from airflow import DAG
    from airflow.providers.postgres.operators.postgres import PostgresOperator
    from datetime import datetime
    
    with DAG(
      dag_id='test_lockbox_connection',
      start_date=datetime(2024, 4, 19),
      schedule="@once",
      catchup=False,
    ) as dag:
      check_conn = PostgresOperator(
          task_id="check_conn",
          postgres_conn_id='pg',
          sql="SELECT 1;",
      )
    
  2. Upload the test_lockbox_connection.py DAG file to the bucket you created earlier. This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named test_lockbox_connection has appeared in the DAGs section.

    It may take a few minutes to upload a DAG file from the bucket.

  5. To run the graph, click the run (trigger) icon in the line with its name.
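When check_conn runs, the provider resolves the pg connection ID to the airflow/connections/pg secret, and Apache Airflow™ builds a PostgreSQL connection from the stored fields. Roughly, the resulting connection URI looks like the sketch below (simplified: real Airflow also URL-encodes every part and appends any extras as query parameters):

```python
from urllib.parse import quote

# Fields as stored in the airflow/connections/pg secret;
# the host is the same placeholder used earlier in this tutorial.
conn = {
    "conn_type": "postgres",
    "host": "<PostgreSQL_cluster_host_FQDN>",
    "port": 6432,
    "schema": "db1",
    "login": "user1",
    "password": "user1-password",
}

# Sketch of how a connection URI is assembled from the secret's fields.
uri = (
    f"{conn['conn_type']}://{quote(conn['login'])}:{quote(conn['password'])}"
    f"@{conn['host']}:{conn['port']}/{conn['schema']}"
)
print(uri)
```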

Check the result

To check the result in the Apache Airflow™ web interface:

  1. In the DAGs section, open the test_lockbox_connection graph.
  2. Go to the Graph section.
  3. Select the check_conn job.
  4. Go to Logs.
  5. Make sure the logs contain the Rows affected: 1 line, which means the query was successful.
