
Storing Apache Airflow™ connections and variables in Yandex Lockbox

Written by Yandex Cloud
Updated at April 25, 2025
  • Required paid resources
  • Set up your infrastructure
  • Create a Yandex Lockbox secret
  • Prepare the DAG file and run the graph
  • Check the result
  • Delete the resources you created

When working with Yandex Managed Service for Apache Airflow™, you can use Yandex Lockbox to store artifacts used in DAG files: connections, variables, and configuration data. Yandex Lockbox integrates with Managed Service for Apache Airflow™ through the Yandex Lockbox Secret Backend provider, so access to the secret storage is configured automatically.

Below, we consider a directed acyclic graph (DAG) that runs the SELECT 1; SQL query against a database in a Yandex Managed Service for PostgreSQL cluster. The database connection data is stored in Yandex Lockbox and automatically substituted into the graph.

To use configuration data from a Yandex Lockbox secret in the graph:

  1. Set up your infrastructure.
  2. Create a Yandex Lockbox secret.
  3. Prepare the DAG file and run the graph.
  4. Check the result.

If you no longer need the resources you created, delete them.

Required paid resources

The cost of supporting this infrastructure includes:

  • Managed Service for PostgreSQL cluster fee: Computing resources and disk space (see Managed Service for PostgreSQL pricing).
  • Managed Service for Apache Airflow™ cluster fee: Computing resources and disk space (see Apache Airflow™ pricing).
  • Object Storage bucket fee: Storing data and performing operations with it (see Object Storage pricing).
  • Fee for using a Yandex Lockbox secret (see Yandex Lockbox pricing).
  • Fee for using public IP addresses if public access is enabled for cluster hosts (see Virtual Private Cloud pricing).

Set up your infrastructure

  1. Create a service account named airflow-sa with the following roles:

    • managed-airflow.integrationProvider
    • lockbox.payloadViewer

    You do not have to assign the lockbox.payloadViewer role for the whole folder; it is enough to assign it to the specific Yandex Lockbox secret once you create it.

  2. Create an Object Storage bucket in any configuration.

  3. Edit the ACL of the new bucket to give the READ permission to the airflow-sa service account.

  4. Create a Managed Service for Apache Airflow™ cluster with the following parameters:

    • Service account: airflow-sa.
    • Bucket name: Name of the new bucket.
    • Use Lockbox Secret Backend: Make sure to enable this option.
  5. Create a Managed Service for PostgreSQL cluster with the following parameters:

    • DB name: db1
    • Username: user1
    • Password: user1-password

Create a Yandex Lockbox secret

For the Apache Airflow™ cluster to work correctly, your Yandex Lockbox secret's name must have this format: airflow/<artifact_type>/<artifact_ID>, where:

  • <artifact_type>: Type of the artifact to store in the secret. The following types are available:
    • connections: Connections.
    • variables: Variables.
    • config: Configuration data.
  • <artifact_ID>: ID to use to access the artifact in Apache Airflow™.
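
For example, here is a minimal sketch of how DAG code refers to such artifacts: only the <artifact_ID> is used, and the secret backend resolves it by prefixing the artifact type (the my_var variable name below is hypothetical).

    from airflow.hooks.base import BaseHook
    from airflow.models import Variable

    # These lookups normally run inside a task at run time, not at DAG parse time.
    # Secret airflow/connections/pg -> Airflow connection with ID "pg".
    conn = BaseHook.get_connection("pg")

    # Secret airflow/variables/my_var -> Airflow variable "my_var" (hypothetical name).
    value = Variable.get("my_var")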

Create a Yandex Lockbox secret with the following parameters:

  • Name: airflow/connections/pg

  • Secret type: Custom

  • Key: airflow/connections/pg

  • Value: Select Text and specify the following contents:

    {
      "conn_type": "postgres",
      "host": "<PostgreSQL_cluster_host_FQDN>",
      "port": 6432,
      "schema": "db1",
      "login": "user1",
      "password": "user1-password"
    }
    

The secret will store the data to connect to the database in the Managed Service for PostgreSQL cluster.

For more information on how to get the FQDN of a PostgreSQL cluster host, see the documentation.
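
If you want to confirm that the connection resolves from the secret, you can check it from a task. The sketch below is illustrative and not part of this tutorial; it assumes the Use Lockbox Secret Backend option is enabled in the cluster, and the function name is arbitrary.

    from airflow.hooks.base import BaseHook

    def check_pg_connection() -> None:
        # The conn_id "pg" is resolved from the airflow/connections/pg secret;
        # its JSON keys map to the Connection fields used below.
        conn = BaseHook.get_connection("pg")
        # Avoid printing the password.
        print(conn.conn_type, conn.host, conn.port, conn.schema, conn.login)

You can run such a function from a PythonOperator or a @task-decorated task in any test DAG.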

Prepare the DAG file and run the graph

  1. Create a local file named test_lockbox_connection.py and copy the following script to it (an alternative sketch using the newer SQLExecuteQueryOperator is shown after this list):

    from airflow import DAG
    from airflow.providers.postgres.operators.postgres import PostgresOperator
    from datetime import datetime

    with DAG(
        dag_id='test_lockbox_connection',
        start_date=datetime(2024, 4, 19),
        schedule="@once",
        catchup=False,
    ) as dag:
        # The connection ID 'pg' is resolved by the Lockbox Secret Backend
        # from the Yandex Lockbox secret named airflow/connections/pg.
        check_conn = PostgresOperator(
            task_id="check_conn",
            postgres_conn_id='pg',
            sql="SELECT 1;",
        )
    
  2. Upload the test_lockbox_connection.py DAG file to the bucket you created earlier. This will automatically create a graph with the same name in the Apache Airflow™ web interface.

  3. Open the Apache Airflow™ web interface.

  4. Make sure a new graph named test_lockbox_connection has appeared in the DAGs section.

    It may take a few minutes for the DAG file to be loaded from the bucket.

  5. To run the graph, click the run (trigger) button in the line with its name.
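
Note: in newer releases of the apache-airflow-providers-postgres package, PostgresOperator is deprecated in favor of SQLExecuteQueryOperator from the common SQL provider. If the provider versions installed in your cluster include it (check this for your environment), an equivalent graph could look like the following sketch:

    from airflow import DAG
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
    from datetime import datetime

    with DAG(
        dag_id="test_lockbox_connection",
        start_date=datetime(2024, 4, 19),
        schedule="@once",
        catchup=False,
    ) as dag:
        # The connection is still referenced only by its ID, so the same
        # airflow/connections/pg secret works without changes.
        check_conn = SQLExecuteQueryOperator(
            task_id="check_conn",
            conn_id="pg",
            sql="SELECT 1;",
        )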

Check the result

To check the result in the Apache Airflow™ web interface:

  1. In the DAGs section, open the test_lockbox_connection graph.
  2. Go to the Graph section.
  3. Select the check_conn task.
  4. Go to Logs.
  5. Make sure the logs contain the Rows affected: 1 line. This means the query completed successfully.

Delete the resources you created

Some resources are not free of charge. Delete the resources you no longer need to avoid paying for them:

  1. Service account
  2. Object Storage bucket
  3. Yandex Lockbox secret
  4. Managed Service for Apache Airflow™ cluster
  5. Managed Service for PostgreSQL cluster
