Yandex Managed Service for Apache Airflow™

Transferring Managed Service for Apache Airflow™ cluster logs to Yandex Cloud Logging

Written by
Yandex Cloud
Updated at May 5, 2025
  • Transferring data to the default log group
  • Sending data to a custom log group

You can set up regular collection of Managed Service for Apache Airflow™ cluster performance logs. Logs will be delivered to a log group in Cloud Logging. You can choose between these two types of log groups:

  • Log group used by default in the cluster folder.
  • Custom log group.

Transferring data to the default log group

  1. Assign the managed-airflow.integrationProvider role to the cluster service account.

  2. Specify logging settings in the Managed Service for Apache Airflow™ cluster:

    Management console
    CLI
    Terraform
    API
    1. Navigate to the folder dashboard and select Managed Service for Apache Airflow™.

    2. Select the cluster and click Edit in the top panel.

    3. Under Logging, enable Write logs.

    4. To write logs to the default log group, select Folder in the Destination field.

    5. Specify the folder whose log group you want to use.

    6. Select the minimum logging level.

      Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    Specify the following logging parameters in the cluster update command:

    yc managed-airflow cluster update \
       ...
       --log-enabled \
       --log-folder-id <folder_ID> \
       --log-min-level <logging_level>
    

    Specify the folder whose log group you want to use.

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    Specify the following parameters in the configuration file with the cluster description:

    resource "yandex_airflow_cluster" "<cluster_name>" {
      ...
      logging = {
        enabled   = true
        folder_id = "<folder_ID>"
        min_level = "<logging_level>"
      }
    }
    

    Specify the folder whose log group you want to use.

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    In the body of the cluster update request (Cluster.Update in the REST API or ClusterService.Update in the gRPC API), specify the following parameters:

    {
       ...
       "logging": {
          "enabled": true,
          "minLevel": "<logging_level>",
          "folderId": "<folder_ID>"
       }
    }
    

    Specify the folder whose log group you want to use.

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

  3. Test the transfer of cluster logs to the log group.

    Management console
    CLI
    API
    1. In the management console, go to the relevant folder.
    2. Select Cloud Logging.
    3. Click the row with the default log group.

    The page that opens will show the log group records.

    To see the messages in JSON format, run this command:

    yc logging read --group-name=default --format=json
    

    Result:

    [
      {
        "uid": "3:74********",
        "resource": {
          "type": "managed-airflow.cluster",
          "id": "c9qv4tnjqdpa********"
        },
        "timestamp": "2024-10-31T11:14:53.740223Z",
        "ingested_at": "2024-10-31T11:14:55.633Z",
        "saved_at": "2024-10-31T11:14:57.231685Z",
        "level": "INFO",
        "message": "10.253.244.40 - - \"GET /health HTTP/1.1\" 200 283 \"-\" \"kube-probe/1.25\"",
        "json_payload": {
          "file": "/home/airflow/.local/lib/python3.8/site-packages/gunicorn/glogging.py",
          "instance": "airflow-c9qv4tnjqdpa********-webserver-68********-q5***",
          "line": 363,
          "resource_id": "c9qv4tnjqdpa********",
          "stream_name": "webserver",
          "thread": "MainThread"
        },
        "stream_name": "webserver"
      }
    ]
    

    To view log group messages, use the LogReadingService/Read gRPC API call.

    For more information, see Reading records.
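Once records are flowing, you can also post-process the JSON that `yc logging read --format=json` prints, for example to keep only records at or above a given severity. The sketch below is a minimal illustration, not part of the yc CLI: `filter_records` and the inline sample are assumptions based on the record shape shown in the output above.

```python
import json

# Severity order used by Cloud Logging, from lowest to highest.
LEVELS = ["TRACE", "DEBUG", "INFO", "WARN", "ERROR", "FATAL"]

def filter_records(records, min_level="WARN"):
    """Keep only records whose level is at or above min_level."""
    threshold = LEVELS.index(min_level)
    return [r for r in records if LEVELS.index(r.get("level", "INFO")) >= threshold]

# Stand-in for the output of:  yc logging read --group-name=default --format=json
sample = json.loads("""
[
  {"timestamp": "2024-10-31T11:14:53.740223Z", "level": "INFO",
   "message": "GET /health HTTP/1.1 200", "stream_name": "webserver"},
  {"timestamp": "2024-10-31T11:15:01.000000Z", "level": "ERROR",
   "message": "Task failed", "stream_name": "scheduler"}
]
""")

# With a WARN threshold, only the ERROR record is printed.
for record in filter_records(sample, "WARN"):
    print(record["timestamp"], record["level"], record["message"])
```

In practice you would load the real records with `json.load` from the command's redirected output instead of the inline sample.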

Sending data to a custom log group

  1. Create a log group named airflow-log-group.

  2. Assign the managed-airflow.integrationProvider role to the cluster service account.

  3. Specify logging settings in the Managed Service for Apache Airflow™ cluster:

    Management console
    CLI
    Terraform
    API
    1. Navigate to the folder dashboard and select Managed Service for Apache Airflow™.

    2. Select the cluster and click Edit in the top panel.

    3. Under Logging, enable Write logs.

    4. To write logs to a custom log group, select Log group in the Destination field.

    5. Specify the log group, airflow-log-group.

    6. Select the minimum logging level.

      Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    Specify the following logging parameters in the cluster create command:

    yc managed-airflow cluster create \
       ...
       --log-enabled \
       --log-group-id <log_group_ID> \
       --log-min-level <logging_level>
    

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    Specify the following parameters in the configuration file with the cluster description:

    resource "yandex_airflow_cluster" "<cluster_name>" {
      ...
      logging = {
        enabled      = true
        log_group_id = "<log_group_ID>"
        min_level    = "<logging_level>"
      }
    }
    

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

    In the body of the cluster update request (Cluster.Update in the REST API or ClusterService.Update in the gRPC API), specify the following parameters:

    {
       ...
       "logging": {
          "enabled": true,
          "minLevel": "<logging_level>",
          "logGroupId": "<log_group_ID>"
       }
    }
    

    Logs of the specified level and higher will be written to the execution log. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default level is INFO.

  4. Test the transfer of cluster logs to the log group.

    Management console
    CLI
    API
    1. In the management console, go to the relevant folder.
    2. Select Cloud Logging.
    3. Click the row with the airflow-log-group log group.

    The page that opens will show the log group records.

    To see the messages in JSON format, run this command:

    yc logging read --group-name=airflow-log-group --format=json
    

    Result:

    [
      {
        "uid": "3:74********",
        "resource": {
          "type": "managed-airflow.cluster",
          "id": "c9qv4tnjqdpa********"
        },
        "timestamp": "2024-10-31T11:14:53.740223Z",
        "ingested_at": "2024-10-31T11:14:55.633Z",
        "saved_at": "2024-10-31T11:14:57.231685Z",
        "level": "INFO",
        "message": "10.253.244.40 - - \"GET /health HTTP/1.1\" 200 283 \"-\" \"kube-probe/1.25\"",
        "json_payload": {
          "file": "/home/airflow/.local/lib/python3.8/site-packages/gunicorn/glogging.py",
          "instance": "airflow-c9qv4tnjqdpa********-webserver-68********-q5***",
          "line": 363,
          "resource_id": "c9qv4tnjqdpa********",
          "stream_name": "webserver",
          "thread": "MainThread"
        },
        "stream_name": "webserver"
      }
    ]
    

    To view log group messages, use the LogReadingService/Read gRPC API call.

    For more information, see Reading records.
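For a quick health check of a busy cluster, it can help to aggregate the records per Airflow component, using the `stream_name` field visible in the sample output above (webserver, scheduler, and so on). A minimal sketch under the same assumptions about the record shape; `summarize` and the inline sample are illustrative, not part of the yc CLI:

```python
import json
from collections import Counter

def summarize(records):
    """Count log records per (stream_name, level) pair."""
    counts = Counter((r.get("stream_name", "unknown"), r.get("level", "INFO"))
                     for r in records)
    return dict(counts)

# Stand-in for the output of:
#   yc logging read --group-name=airflow-log-group --format=json
sample = json.loads("""
[
  {"level": "INFO",  "stream_name": "webserver"},
  {"level": "INFO",  "stream_name": "webserver"},
  {"level": "ERROR", "stream_name": "scheduler"}
]
""")

# Prints one line per (component, level) pair with its record count.
for (stream, level), count in sorted(summarize(sample).items()):
    print(f"{stream:10} {level:6} {count}")
```

A spike in ERROR or FATAL counts for a single component (for example, the scheduler) is usually a faster signal than reading individual messages.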

© 2025 Direct Cursus Technology L.L.C.