© 2025 Direct Cursus Technology L.L.C.

In this article:

  • Transferring data to the default log group
  • Sending data to a custom log group

Transferring Managed Service for Apache Airflow™ cluster logs to Yandex Cloud Logging

Written by
Yandex Cloud
Updated on September 12, 2025

You can set up regular collection of Managed Service for Apache Airflow™ cluster performance logs. Logs are delivered to a log group in Cloud Logging. You can choose between two types of log group:

  • Default log group of the cluster folder.
  • Custom log group.

Transferring data to the default log group

  1. Assign the managed-airflow.integrationProvider role to the cluster service account.

  2. Configure logging in the Managed Service for Apache Airflow™ cluster:

Management console
    1. Navigate to the folder dashboard and select Managed Service for Apache Airflow™.

    2. Select the cluster and click Edit in the top panel.

    3. Under Logging, enable Write logs.

    4. To write logs to the default log group, select Folder in the Destination field.

    5. Specify the folder whose log group you want to use.

    6. Select the minimum logging level.

      The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

CLI

Specify the following logging parameters in the cluster update command:

    yc managed-airflow cluster update \
       ...
       --log-enabled \
       --log-folder-id <folder_ID> \
       --log-min-level <logging_level>
    

    Specify the folder whose log group you want to use.

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

Terraform

Specify the following parameters in the configuration file with the cluster description:

    resource "yandex_airflow_cluster" "<cluster_name>" {
      ...
      logging = {
        enabled   = true
        folder_id = "<folder_ID>"
        min_level = "<logging_level>"
      }
    }
    

    Specify the folder whose log group you want to use.

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

API

In the body of the cluster update request (Cluster.Update in the REST API or ClusterService.Update in the gRPC API), specify the following parameters:

    {
       ...
       "logging": {
          "enabled": true,
          "minLevel": "<logging_level>",
          "folderId": "<folder_ID>"
       }
    }
    

    Specify the folder whose log group you want to use.

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

  3. Test the transfer of cluster logs to the log group:

Management console
    1. In the management console, navigate to the relevant folder.
    2. Select Cloud Logging.
    3. Click the row with the default log group.

    The page that opens will show the log group entries.

CLI

To view the entries in JSON format, run this command:

    yc logging read --group-name=default --format=json
    

    Result:

    [
      {
        "uid": "3:74********",
        "resource": {
          "type": "managed-airflow.cluster",
          "id": "c9qv4tnjqdpa********"
        },
        "timestamp": "2024-10-31T11:14:53.740223Z",
        "ingested_at": "2024-10-31T11:14:55.633Z",
        "saved_at": "2024-10-31T11:14:57.231685Z",
        "level": "INFO",
        "message": "10.253.244.40 - - \"GET /health HTTP/1.1\" 200 283 \"-\" \"kube-probe/1.25\"",
        "json_payload": {
          "file": "/home/airflow/.local/lib/python3.8/site-packages/gunicorn/glogging.py",
          "instance": "airflow-c9qv4tnjqdpa********-webserver-68********-q5***",
          "line": 363,
          "resource_id": "c9qv4tnjqdpa********",
          "stream_name": "webserver",
          "thread": "MainThread"
        },
        "stream_name": "webserver"
      }
    ]
    

API

To view log group entries, use the LogReadingService.Read gRPC API call.

    For more information, see Reading records.
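Entries retrieved with `--format=json` can be post-processed locally. Below is a minimal sketch, not part of the official tooling: it filters records by minimum level and by Airflow component, using the field names shown in the sample result above (`level`, `json_payload.stream_name`); the embedded record is a trimmed copy of that sample.

```python
import json

# One entry in the shape returned by `yc logging read --format=json`
# (trimmed from the sample result above; IDs are masked placeholders).
raw = '''
[
  {
    "resource": {"type": "managed-airflow.cluster", "id": "c9qv4tnjqdpa********"},
    "timestamp": "2024-10-31T11:14:53.740223Z",
    "level": "INFO",
    "message": "10.253.244.40 - - \\"GET /health HTTP/1.1\\" 200 283",
    "json_payload": {"stream_name": "webserver"}
  }
]
'''

# Logging levels in ascending order of severity, as listed in this article.
LEVELS = ["TRACE", "DEBUG", "INFO", "WARN", "ERROR", "FATAL"]

def filter_entries(entries, min_level="INFO", stream=None):
    """Keep entries at or above min_level, optionally from a single stream."""
    threshold = LEVELS.index(min_level)
    result = []
    for e in entries:
        if LEVELS.index(e["level"]) < threshold:
            continue
        if stream and e.get("json_payload", {}).get("stream_name") != stream:
            continue
        result.append(e)
    return result

entries = json.loads(raw)
print(len(filter_entries(entries, "INFO", "webserver")))  # → 1
```

The same filtering can be done server-side with the `--filter` flag of `yc logging read`; this local variant is useful when the output is already saved to a file.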

Sending data to a custom log group

  1. Create a log group named airflow-log-group.

  2. Assign the managed-airflow.integrationProvider role to the cluster service account.

  3. Configure logging in the Managed Service for Apache Airflow™ cluster:

Management console
    1. Navigate to the folder dashboard and select Managed Service for Apache Airflow™.

    2. Select the cluster and click Edit in the top panel.

    3. Under Logging, enable Write logs.

    4. To write logs to a custom log group, select Log group in the Destination field.

    5. Specify the airflow-log-group log group.

    6. Select the minimum logging level.

      The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

CLI

Specify the following logging parameters in the cluster update command:

yc managed-airflow cluster update \
   ...
   --log-enabled \
   --log-group-id <log_group_ID> \
   --log-min-level <logging_level>
    

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

Terraform

Specify the following parameters in the configuration file with the cluster description:

    resource "yandex_airflow_cluster" "<cluster_name>" {
      ...
      logging = {
        enabled      = true
        log_group_id = "<log_group_ID>"
        min_level    = "<logging_level>"
      }
    }
    

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

API

In the body of the cluster update request (Cluster.Update in the REST API or ClusterService.Update in the gRPC API), specify the following parameters:

    {
       ...
       "logging": {
          "enabled": true,
          "minLevel": "<logging_level>",
          "logGroupId": "<log_group_ID>"
       }
    }
    

    The execution log will contain logs of this level or higher. The available levels are TRACE, DEBUG, INFO, WARN, ERROR, and FATAL. The default is INFO.

  4. Test the transfer of cluster logs to the log group:

Management console
    1. In the management console, navigate to the relevant folder.
    2. Select Cloud Logging.
    3. Click the row with the airflow-log-group log group.

    The page that opens will show the log group entries.

CLI

To view the entries in JSON format, run this command:

    yc logging read --group-name=airflow-log-group --format=json
    

    Result:

    [
      {
        "uid": "3:74********",
        "resource": {
          "type": "managed-airflow.cluster",
          "id": "c9qv4tnjqdpa********"
        },
        "timestamp": "2024-10-31T11:14:53.740223Z",
        "ingested_at": "2024-10-31T11:14:55.633Z",
        "saved_at": "2024-10-31T11:14:57.231685Z",
        "level": "INFO",
        "message": "10.253.244.40 - - \"GET /health HTTP/1.1\" 200 283 \"-\" \"kube-probe/1.25\"",
        "json_payload": {
          "file": "/home/airflow/.local/lib/python3.8/site-packages/gunicorn/glogging.py",
          "instance": "airflow-c9qv4tnjqdpa********-webserver-68********-q5***",
          "line": 363,
          "resource_id": "c9qv4tnjqdpa********",
          "stream_name": "webserver",
          "thread": "MainThread"
        },
        "stream_name": "webserver"
      }
    ]
    

API

To view log group entries, use the LogReadingService.Read gRPC API call.

    For more information, see Reading records.
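As with the default log group, entries read back from airflow-log-group can be summarized locally. This sketch (not part of the official tooling) counts records per Airflow component using the `json_payload.stream_name` field from the result shown above; the sample records keep only the fields the function reads.

```python
from collections import Counter

# Minimal records in the shape returned by
# `yc logging read --group-name=airflow-log-group --format=json`;
# only the fields used below are kept.
entries = [
    {"level": "INFO",  "json_payload": {"stream_name": "webserver"}},
    {"level": "INFO",  "json_payload": {"stream_name": "scheduler"}},
    {"level": "ERROR", "json_payload": {"stream_name": "scheduler"}},
]

def count_by_stream(entries):
    """Count log entries per Airflow component (webserver, scheduler, worker, ...)."""
    return Counter(
        e.get("json_payload", {}).get("stream_name", "unknown") for e in entries
    )

print(count_by_stream(entries))  # Counter({'scheduler': 2, 'webserver': 1})
```

A per-component breakdown like this is a quick way to confirm that all cluster components, not just the webserver, are shipping logs to the group.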
