© 2026 Direct Cursus Technology L.L.C.
yandex_airflow_cluster (Resource)

Written by
Yandex Cloud
Updated at February 12, 2026
  • Example usage
  • Arguments & Attributes Reference
  • Import

Manages a Managed Service for Apache Airflow™ cluster in Yandex Cloud.

Example usage

//
// Create a new Airflow Cluster.
//
resource "yandex_airflow_cluster" "my_airflow_cluster" {
  name               = "airflow-created-with-terraform"
  subnet_ids         = [yandex_vpc_subnet.a.id, yandex_vpc_subnet.b.id, yandex_vpc_subnet.d.id]
  service_account_id = yandex_iam_service_account.for-airflow.id
  admin_password     = "some-strong-password"

  code_sync = {
    s3 = {
      bucket = "bucket-for-airflow-dags"
    }
  }

  webserver = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  scheduler = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  worker = {
    min_count          = 1
    max_count          = 2
    resource_preset_id = "c1-m4"
  }

  airflow_config = {
    "api" = {
      "auth_backends" = "airflow.api.auth.backend.basic_auth,airflow.api.auth.backend.session"
    }
  }

  pip_packages = ["dbt"]

  lockbox_secrets_backend = {
    enabled = true
  }

  logging = {
    enabled   = true
    folder_id = var.folder_id
    min_level = "INFO"
  }
}
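
Instead of syncing DAG files from an Object Storage bucket, `code_sync` can point at a Git repository through the `git_sync` block described below. The following is a hedged sketch of that variant: the repository URL and the `var.git_ssh_key` variable are placeholders, not values from this document.

```hcl
//
// Hypothetical variant: sync DAG files from a Git repository instead of S3.
//
resource "yandex_airflow_cluster" "airflow_from_git" {
  name               = "airflow-dags-from-git"
  subnet_ids         = [yandex_vpc_subnet.a.id]
  service_account_id = yandex_iam_service_account.for-airflow.id
  admin_password     = "some-strong-password"

  code_sync = {
    git_sync = {
      repo     = "https://example.com/my-org/airflow-dags.git" # placeholder URL
      branch   = "main"
      sub_path = "dags"                # directory in the repo holding DAG files
      ssh_key  = var.git_ssh_key       # placeholder variable for the deploy key
    }
  }

  webserver = { count = 1, resource_preset_id = "c1-m4" }
  scheduler = { count = 1, resource_preset_id = "c1-m4" }
  worker    = { min_count = 1, max_count = 2, resource_preset_id = "c1-m4" }
}
```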

Arguments & Attributes Reference

  • admin_password (String). Password used to log in to the Apache Airflow web UI as the admin user.
  • airflow_config (Map Of Map Of String). Configuration of the Apache Airflow application itself. The value is a two-level map: keys of the top-level map are configuration section names, and keys of the inner maps are the names of options within the corresponding section.
  • airflow_version (String). Apache Airflow version in format <major>.<minor>.
  • code_sync [Block]. Parameters of the location and access to the code that will be executed in the cluster.
    • git_sync [Block]. Git repository that stores DAG files used in the cluster.
      • branch (Required)(String). The name of the branch that stores DAG files used in the cluster.
      • repo (Required)(String). The URL of the Git repository that stores DAG files used in the cluster.
      • ssh_key (Required)(String). The SSH key that is used to access the Git repository.
      • sub_path (Required)(String). The path to the directory in the repository that stores DAG files used in the cluster.
    • s3 [Block]. Currently only Object Storage (S3) is supported as the source of DAG files.
      • bucket (Required)(String). The name of the Object Storage bucket that stores DAG files used in the cluster.
  • created_at (Read-Only) (String). The creation timestamp of the resource.
  • dag_processor [Block]. Configuration of dag-processor instances. Only for airflow version 3.*.
    • count (Required)(Number). The number of dag-processor instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • deb_packages (Set Of String). System packages that are installed in the cluster.
  • deletion_protection (Bool). When true, the resource is protected from accidental deletion.
  • description (String). The resource description.
  • folder_id (String). The identifier of the folder the resource belongs to. If not provided, the default provider folder-id is used.
  • id (Read-Only) (String). The resource identifier.
  • labels (Map Of String). A set of key/value label pairs assigned to the resource.
  • lockbox_secrets_backend [Block]. Configuration of Lockbox Secrets Backend. See documentation for details.
    • enabled (Required)(Bool). Enables usage of Lockbox Secrets Backend.
  • logging [Block]. Cloud Logging configuration.
    • enabled (Required)(Bool). Enables delivery of logs generated by the Airflow components to Cloud Logging.
    • folder_id (String). Logs will be written to the default log group of the specified folder. Exactly one of the attributes folder_id or log_group_id must be specified.
    • log_group_id (String). Logs will be written to the specified log group. Exactly one of the attributes folder_id or log_group_id must be specified.
    • min_level (String). Minimum level of messages that will be sent to Cloud Logging. Can be either TRACE, DEBUG, INFO, WARN, ERROR or FATAL. If not set then server default is applied (currently INFO).
  • maintenance_window [Block]. Configuration of window for maintenance operations.
    • day (String). Day of week for maintenance window. One of MON, TUE, WED, THU, FRI, SAT, SUN.
    • hour (Number). Hour of day in UTC time zone (1-24) for maintenance window.
    • type (String). Type of maintenance window. Can be either ANYTIME or WEEKLY. If WEEKLY, day and hour must be specified.
  • name (Required)(String). The resource name.
  • pip_packages (Set Of String). Python packages that are installed in the cluster.
  • python_version (String). Version of Python that Airflow will run on. Must be in format <major>.<minor>.
  • scheduler [Block]. Configuration of scheduler instances.
    • count (Required)(Number). The number of scheduler instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • security_group_ids (Set Of String). The list of security group identifiers applied to the resource or its components.
  • service_account_id (Required)(String). The service account linked to the resource. For more information, see documentation.
  • status (Read-Only) (String). Status of the cluster. Can be either CREATING, STARTING, RUNNING, UPDATING, STOPPING, STOPPED, ERROR or STATUS_UNKNOWN. For more information see status field of JSON representation in the official documentation.
  • subnet_ids (Required)(Set Of String). Identifiers of the VPC subnets the resource is attached to.
  • triggerer [Block]. Configuration of triggerer instances.
    • count (Required)(Number). The number of triggerer instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • webserver [Block]. Configuration of webserver instances.
    • count (Required)(Number). The number of webserver instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
  • worker [Block]. Configuration of worker instances.
    • max_count (Required)(Number). The maximum number of worker instances in the cluster.
    • min_count (Required)(Number). The minimum number of worker instances in the cluster.
    • resource_preset_id (Required)(String). The identifier of the preset for computational resources available to an instance (CPU, memory etc.).
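
Several of the attributes above combine in practice. As a hedged illustration (the values below are examples chosen for this sketch, not documented defaults), a WEEKLY maintenance window and the two-level airflow_config map could look like:

```hcl
# Illustrative fragment only: example values, not defaults.
maintenance_window = {
  type = "WEEKLY" # for WEEKLY, day and hour must be specified
  day  = "SAT"
  hour = 3        # 1-24, UTC
}

airflow_config = {
  # top-level key: configuration section; inner key: option within that section
  "core" = {
    "load_examples" = "False"
  }
}
```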

Import

The resource can be imported using its resource ID. You can obtain the ID from the Yandex Cloud Web Console or the Yandex Cloud CLI.

# terraform import yandex_airflow_cluster.<resource Name> <resource Id>
terraform import yandex_airflow_cluster.my_airflow_cluster enphq**********cjsw4
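
With Terraform 1.5 or later, the same import can alternatively be declared in configuration with an import block instead of running the CLI command. A sketch, reusing the masked cluster ID from the example above:

```hcl
# Terraform >= 1.5: declarative import; apply then runs the import.
import {
  to = yandex_airflow_cluster.my_airflow_cluster
  id = "enphq**********cjsw4" # masked ID from the example above
}
```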

© 2026 Direct Cursus Technology L.L.C.