Connecting to Yandex Managed Service for PostgreSQL
You can use a directed acyclic graph (DAG) to configure a connection to a database in a Yandex Managed Service for PostgreSQL cluster. Data for connecting to the DB is stored in Yandex Lockbox and automatically substituted into the graph.
Getting started
- Create a Managed Service for PostgreSQL cluster with the following parameters:
  - DB name: db1
  - Username: user1
  - Password: user1-password
- Assign the lockbox.payloadViewer role to your service account. You do not have to assign the lockbox.payloadViewer role for the whole folder; it is enough to assign it for the specific Yandex Lockbox secret once you create it.
Create a Yandex Lockbox secret
Create a Yandex Lockbox secret named airflow/connections/pg1 containing json.dumps(conn), where conn describes a connection to the Managed Service for PostgreSQL cluster:
conn = {
"conn_type": "postgres",
"host": "<PostgreSQL_cluster_host_FQDN>",
"port": 6432,
"schema": "db1",
"login": "user1",
"password": "user1-password"
}
The airflow/connections/pg1 secret will store the data for connecting to the database in the Managed Service for PostgreSQL cluster.
To find out the FQDN of a PostgreSQL cluster host, see PostgreSQL host FQDN.
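The secret value is the conn dictionary serialized to a single JSON string. A minimal sketch of producing that payload with json.dumps (the host FQDN is left as a placeholder, exactly as above):

```python
import json

# Connection parameters for the Managed Service for PostgreSQL cluster;
# replace the host placeholder with your cluster host FQDN.
conn = {
    "conn_type": "postgres",
    "host": "<PostgreSQL_cluster_host_FQDN>",
    "port": 6432,
    "schema": "db1",
    "login": "user1",
    "password": "user1-password",
}

# This JSON string is what you store as the secret's value.
payload = json.dumps(conn)
print(payload)
```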
Prepare the DAG file and run the graph
- Create a local file named postgres_operator.py and copy the following script to it:

  from airflow import DAG
  from airflow.providers.postgres.operators.postgres import PostgresOperator

  with DAG(
      dag_id='postgres_operator',
      schedule=None,
  ):
      PostgresOperator(
          task_id="check_conn",
          postgres_conn_id='pg1',
          sql="SELECT 3;",
      )
- Upload the postgres_operator.py DAG file to the bucket you created earlier. This will automatically create a graph with the same name in the Apache Airflow™ web interface.
- Make sure a new graph named postgres_operator has appeared in the DAGs section. It may take a few minutes to upload a DAG file from the bucket.
- To run the graph, click the launch button in the line with its name.
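In the DAG above, postgres_conn_id='pg1' is resolved through the Lockbox secrets backend: the backend composes the secret name from a connections prefix and the connection ID. A simplified sketch of that naming scheme, assuming the prefix airflow/connections (inferred from the secret name created earlier):

```python
# Sketch of how a secrets backend maps an Airflow connection ID to a
# secret name; the prefix matches the secret created in this guide.
CONNECTIONS_PREFIX = "airflow/connections"

def secret_name(conn_id: str) -> str:
    """Build the Lockbox secret name for a given Airflow connection ID."""
    return f"{CONNECTIONS_PREFIX}/{conn_id}"

print(secret_name("pg1"))  # airflow/connections/pg1
```

This is why the DAG only references the short connection ID pg1, while the secret itself is named airflow/connections/pg1.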
Check the result
To check the result in the Apache Airflow™ web interface:
- In the DAGs section, open the postgres_operator graph.
- Go to the Graph section.
- Select the check_conn job.
- Go to Logs.
- Make sure the logs contain the Rows affected: 1 string (the SELECT 3; query returns a single row). This means the query was successful.