Проект Яндекса
© 2026 ООО «Яндекс.Облако»
yandex_airflow_cluster (Resource)

Article created by Yandex Cloud
Updated February 12, 2026
  • Example usage
  • Arguments & Attributes Reference
  • Import

Manages an Apache Airflow™ cluster in Yandex Cloud Managed Service for Apache Airflow™.

Example usage

//
// Create a new Airflow Cluster.
//
resource "yandex_airflow_cluster" "my_airflow_cluster" {
  name               = "airflow-created-with-terraform"
  subnet_ids         = [yandex_vpc_subnet.a.id, yandex_vpc_subnet.b.id, yandex_vpc_subnet.d.id]
  service_account_id = yandex_iam_service_account.for-airflow.id
  admin_password     = "some-strong-password"

  code_sync = {
    s3 = {
      bucket = "bucket-for-airflow-dags"
    }
  }

  webserver = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  scheduler = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  worker = {
    min_count          = 1
    max_count          = 2
    resource_preset_id = "c1-m4"
  }

  airflow_config = {
    "api" = {
      "auth_backends" = "airflow.api.auth.backend.basic_auth,airflow.api.auth.backend.session"
    }
  }

  pip_packages = ["dbt"]

  lockbox_secrets_backend = {
    enabled = true
  }

  logging = {
    enabled   = true
    folder_id = var.folder_id
    min_level = "INFO"
  }
}
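The example above references subnets and a service account defined elsewhere. A minimal sketch of those prerequisites follows; the network and subnet names, zone, CIDR range, and the folder-level role are illustrative assumptions, not part of this reference:

```hcl
// Hypothetical prerequisites for the example above.
resource "yandex_iam_service_account" "for-airflow" {
  name = "airflow-sa"
}

// The Airflow service account is assumed to need a folder-level role such as
// this one; check the service documentation for the exact role required.
resource "yandex_resourcemanager_folder_iam_member" "airflow-sa-role" {
  folder_id = var.folder_id
  role      = "managed-airflow.integrationProvider"
  member    = "serviceAccount:${yandex_iam_service_account.for-airflow.id}"
}

resource "yandex_vpc_network" "net" {
  name = "airflow-net"
}

resource "yandex_vpc_subnet" "a" {
  name           = "airflow-subnet-a"
  zone           = "ru-central1-a"
  network_id     = yandex_vpc_network.net.id
  v4_cidr_blocks = ["10.1.0.0/24"]
}
// Subnets "b" and "d" would be declared analogously in their own zones.
```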

Arguments & Attributes Reference

  • admin_password (String). Password that is used to log in to Apache Airflow web UI under admin user.
  • airflow_config (Map Of Map Of String). Configuration of the Apache Airflow application itself. The value of this attribute is a two-level map. Keys of the top-level map are the names of configuration sections; keys of the inner maps are the names of configuration options within the corresponding section.
  • airflow_version (String). Apache Airflow version in format <major>.<minor>.
  • code_sync [Block]. Parameters of the location and access to the code that will be executed in the cluster.
    • git_sync [Block]. Git repository that stores DAG files used in the cluster.
      • branch (Required)(String). The name of the branch that stores DAG files used in the cluster.
      • repo (Required)(String). The URL of the Git repository that stores DAG files used in the cluster.
      • ssh_key (Required)(String). The SSH key that is used to access the Git repository.
      • sub_path (Required)(String). The path to the directory in the repository that stores DAG files used in the cluster.
    • s3 [Block]. Currently only Object Storage (S3) is supported as the source of DAG files.
      • bucket (Required)(String). The name of the Object Storage bucket that stores DAG files used in the cluster.
  • created_at (Read-Only) (String). The creation timestamp of the resource.
  • dag_processor [Block]. Configuration of dag-processor instances. Only for Airflow versions 3.x.
    • count (Required)(Number). The number of dag-processor instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • deb_packages (Set Of String). System packages that are installed in the cluster.
  • deletion_protection (Bool). If true, the resource is protected from accidental deletion.
  • description (String). The resource description.
  • folder_id (String). The identifier of the folder that the resource belongs to. If not provided, the default provider folder ID is used.
  • id (Read-Only) (String). The resource identifier.
  • labels (Map Of String). A set of key/value label pairs assigned to the resource.
  • lockbox_secrets_backend [Block]. Configuration of Lockbox Secrets Backend. See documentation for details.
    • enabled (Required)(Bool). Enables usage of Lockbox Secrets Backend.
  • logging [Block]. Cloud Logging configuration.
    • enabled (Required)(Bool). Enables delivery of logs generated by the Airflow components to Cloud Logging.
    • folder_id (String). Logs will be written to default log group of specified folder. Exactly one of the attributes folder_id or log_group_id should be specified.
    • log_group_id (String). Logs will be written to the specified log group. Exactly one of the attributes folder_id or log_group_id should be specified.
    • min_level (String). Minimum level of messages that will be sent to Cloud Logging. Can be either TRACE, DEBUG, INFO, WARN, ERROR or FATAL. If not set then server default is applied (currently INFO).
  • maintenance_window [Block]. Configuration of window for maintenance operations.
    • day (String). Day of week for maintenance window. One of MON, TUE, WED, THU, FRI, SAT, SUN.
    • hour (Number). Hour of day in UTC time zone (1-24) for maintenance window.
    • type (String). Type of maintenance window. Can be either ANYTIME or WEEKLY. If WEEKLY, day and hour must be specified.
  • name (Required)(String). The resource name.
  • pip_packages (Set Of String). Python packages that are installed in the cluster.
  • python_version (String). Version of Python that Airflow will run on. Must be in format <major>.<minor>.
  • scheduler [Block]. Configuration of scheduler instances.
    • count (Required)(Number). The number of scheduler instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • security_group_ids (Set Of String). The list of security groups applied to the resource or its components.
  • service_account_id (Required)(String). The service account linked to the resource. For more information, see the documentation.
  • status (Read-Only) (String). Status of the cluster. Can be either CREATING, STARTING, RUNNING, UPDATING, STOPPING, STOPPED, ERROR or STATUS_UNKNOWN. For more information, see the status field of the JSON representation in the official documentation.
  • subnet_ids (Required)(Set Of String). The list of identifiers of the VPC subnets that the resource is attached to.
  • triggerer [Block]. Configuration of triggerer instances.
    • count (Required)(Number). The number of triggerer instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • webserver [Block]. Configuration of webserver instances.
    • count (Required)(Number). The number of webserver instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • worker [Block]. Configuration of worker instances.
    • max_count (Required)(Number). The maximum number of worker instances in the cluster.
    • min_count (Required)(Number). The minimum number of worker instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
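The example at the top of this page syncs DAGs from Object Storage. As a complement, here is a hedged sketch of the git_sync variant of code_sync combined with a weekly maintenance_window; the repository URL, branch, path, and the referenced variables are illustrative assumptions:

```hcl
// Hypothetical cluster that pulls DAG files from a Git repository.
resource "yandex_airflow_cluster" "from_git" {
  name               = "airflow-dags-from-git"
  subnet_ids         = [yandex_vpc_subnet.a.id]
  service_account_id = yandex_iam_service_account.for-airflow.id
  admin_password     = var.admin_password

  code_sync = {
    git_sync = {
      repo     = "git@example.com:org/airflow-dags.git" // assumed repository
      branch   = "main"
      sub_path = "dags"
      ssh_key  = var.git_ssh_key // private key material; avoid hard-coding it
    }
  }

  webserver = { count = 1, resource_preset_id = "c1-m4" }
  scheduler = { count = 1, resource_preset_id = "c1-m4" }
  worker    = { min_count = 1, max_count = 2, resource_preset_id = "c1-m4" }

  // Per the reference above, WEEKLY requires both day and hour (UTC, 1-24).
  maintenance_window = {
    type = "WEEKLY"
    day  = "SUN"
    hour = 3
  }
}
```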

Import

The resource can be imported using its resource ID. To get the ID, use the Yandex Cloud Web Console or the Yandex Cloud CLI.

# terraform import yandex_airflow_cluster.<resource Name> <resource Id>
terraform import yandex_airflow_cluster.my_airflow_cluster enphq**********cjsw4
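On Terraform 1.5 and later, the same import can also be expressed declaratively with an import block; this is a sketch reusing the masked ID from the command above:

```hcl
// Declarative alternative to the CLI command (Terraform >= 1.5).
// Run "terraform plan" and "terraform apply" to perform the import.
import {
  to = yandex_airflow_cluster.my_airflow_cluster
  id = "enphq**********cjsw4" // replace with the real cluster ID
}
```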

A Yandex project
© 2026 Yandex.Cloud LLC