

Connecting to Yandex Managed Service for ClickHouse®

Written by Yandex Cloud. Updated on October 23, 2025.
  • Getting started
  • Create a Yandex Lockbox secret
  • Prepare the DAG file and run the graph
  • Check the result

With a directed acyclic graph (DAG), you can configure a connection to a database in a Yandex Managed Service for ClickHouse® cluster. Data for connecting to the DB is stored in Yandex Lockbox and automatically inserted into the graph.

Getting started

  1. Create a Managed Service for ClickHouse® cluster with the following parameters:

    • DB name: default-bd
    • Username: admin
    • Password: admin-password

    Warning

    You cannot create a database named default.

  2. Create a Yandex Object Storage bucket to store the DAG file in.

  3. Configure the Managed Service for Apache Airflow™ cluster:

    1. Enable the Use Lockbox Secret Backend option, which allows you to store Apache Airflow™ configuration data, variables, and connection parameters in Yandex Lockbox secrets.
    2. Under Dependencies, add the airflow-clickhouse-plugin pip package.
    3. Under DAG file storage, select the Object Storage bucket you created earlier. Your DAG file will be fetched from it.
  4. Assign the lockbox.payloadViewer role to your service account.

    You do not need to assign the lockbox.payloadViewer role for the whole folder; assigning it for the specific Yandex Lockbox secret after you create it is enough.

Create a Yandex Lockbox secret

For the Apache Airflow™ cluster to work correctly, your Yandex Lockbox secret's name must have this format: airflow/<artifact_type>/<artifact_ID>, where:

  • <artifact_type>: Determines what data the secret will store. The allowed values are:
    • connections: Connections.
    • variables: Variables.
    • config: Configuration data.
  • <artifact_ID>: ID to use to access the artifact in Apache Airflow™.
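
To make the naming convention concrete, here is a small illustrative parser for such secret names. This is only a demonstration of the format described above; the actual lookup is performed by the Apache Airflow™ secrets backend, not by code like this:

```python
# Illustrative sketch: how a secret name such as "airflow/connections/ch"
# decomposes into an artifact type and an artifact ID, following the
# naming rules described above (not the real backend implementation).
def parse_secret_name(name: str):
    prefix, artifact_type, artifact_id = name.split("/", 2)
    if prefix != "airflow":
        raise ValueError("secret name must start with 'airflow/'")
    if artifact_type not in ("connections", "variables", "config"):
        raise ValueError(f"unknown artifact type: {artifact_type}")
    return artifact_type, artifact_id

print(parse_secret_name("airflow/connections/ch"))  # ('connections', 'ch')
```

For the secret created in this guide, the artifact type is connections and the connection ID used in the DAG is ch.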

Create a Yandex Lockbox secret with the following parameters:

  • Name: airflow/connections/ch.

  • Secret type: Custom.

  • Key: conn.

  • Value: Select Text and specify the following contents:

    {
      "conn_type": "clickhouse",
      "host": "<ClickHouse®_cluster_host_FQDN>",
      "port": 9440,
      "schema": "default-bd",
      "login": "admin",
      "password": "admin-password",
      "extra": {
          "secure": "True"
      }
    }
    

For more information on how to get the FQDN of a ClickHouse® cluster host, see FQDNs of ClickHouse® hosts.

The secret will store the data to connect to the database in the Managed Service for ClickHouse® cluster.
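
Before pasting the value, you can sanity-check the payload locally. This is an optional sketch that round-trips the JSON from the step above through Python's json module to confirm it is well-formed and contains the fields the connection needs (the host value remains a placeholder for your real FQDN):

```python
import json

# The connection payload from the step above; the host is a placeholder
# for your actual ClickHouse® cluster host FQDN.
conn = {
    "conn_type": "clickhouse",
    "host": "<ClickHouse(R)_cluster_host_FQDN>",
    "port": 9440,
    "schema": "default-bd",
    "login": "admin",
    "password": "admin-password",
    "extra": {"secure": "True"},
}

payload = json.dumps(conn)

# Round-trip check: the stored value parses back with all required keys.
required = {"conn_type", "host", "port", "schema", "login", "password"}
assert required <= set(json.loads(payload))
print("payload OK, port =", json.loads(payload)["port"])
```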

Prepare the DAG file and run the graph

  1. Create a local file named clickhouse.py and paste the following script into it:

    from airflow.decorators import dag, task
    from airflow_clickhouse_plugin.hooks.clickhouse import ClickHouseHook
    
    
    @dag(schedule=None)
    def clickhouse():
        @task
        def query_clickhouse():
            # The connection ID "ch" refers to the airflow/connections/ch secret
            ch_hook = ClickHouseHook(clickhouse_conn_id="ch")
            result = ch_hook.execute('select 1;')
            print(f'query result: {result}')
    
        query_clickhouse()
    
    
    clickhouse()
    
    
  2. Upload the clickhouse.py DAG file to the bucket you created earlier.

  3. Open the Apache Airflow™ web interface.

  4. Make sure there is a new graph named clickhouse in the DAGs section.

    It may take a few minutes to load a DAG file from the bucket.

  5. To run the graph, click the launch button in the line with its name.
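
If you want to sanity-check the task logic before uploading, you can run it locally against a stub hook. FakeClickHouseHook below is a hypothetical stand-in introduced for this sketch; the real ClickHouseHook only works inside the Managed Service for Apache Airflow™ environment with the cluster available:

```python
# Local sanity check for the DAG's task logic using a stub hook.
# FakeClickHouseHook is a hypothetical stand-in that mimics the single
# call the task makes; the real hook runs SQL on the cluster.
class FakeClickHouseHook:
    def __init__(self, clickhouse_conn_id):
        self.conn_id = clickhouse_conn_id

    def execute(self, sql):
        # 'select 1;' on a real cluster returns a list of row tuples.
        assert sql.strip().rstrip(";").lower() == "select 1"
        return [(1,)]


def query_clickhouse(hook_cls=FakeClickHouseHook):
    ch_hook = hook_cls(clickhouse_conn_id="ch")
    result = ch_hook.execute("select 1;")
    print(f"query result: {result}")
    return result


query_clickhouse()  # prints: query result: [(1,)]
```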

Check the result

To check the result in the Apache Airflow™ web interface:

Apache Airflow™ version below 3.0:

  1. In the DAGs section, open the clickhouse graph.
  2. Go to the Graph section.
  3. Select query_clickhouse.
  4. Go to Logs.
  5. Make sure the logs contain the query result: [(1,)] line. This means the query was successful.

Apache Airflow™ version 3.0 or higher:

  1. In the DAGs section, click clickhouse.
  2. Go to Tasks.
  3. Select query_clickhouse.
  4. Go to Task Instances.
  5. Select the task instance to open its Logs section.
  6. Make sure the logs contain the query result: [(1,)] line. This means the query was successful.

© 2025 Direct Cursus Technology L.L.C.