yandex_airflow_cluster (Resource)
Article created
Updated February 12, 2026
Managed Airflow cluster.
Example usage
//
// Create a new Airflow Cluster.
//
resource "yandex_airflow_cluster" "my_airflow_cluster" {
  name               = "airflow-created-with-terraform"
  subnet_ids         = [yandex_vpc_subnet.a.id, yandex_vpc_subnet.b.id, yandex_vpc_subnet.d.id]
  service_account_id = yandex_iam_service_account.for-airflow.id
  admin_password     = "some-strong-password"

  code_sync = {
    s3 = {
      bucket = "bucket-for-airflow-dags"
    }
  }

  webserver = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  scheduler = {
    count              = 1
    resource_preset_id = "c1-m4"
  }

  worker = {
    min_count          = 1
    max_count          = 2
    resource_preset_id = "c1-m4"
  }

  airflow_config = {
    "api" = {
      "auth_backends" = "airflow.api.auth.backend.basic_auth,airflow.api.auth.backend.session"
    }
  }

  pip_packages = ["dbt"]

  lockbox_secrets_backend = {
    enabled = true
  }

  logging = {
    enabled   = true
    folder_id = var.folder_id
    min_level = "INFO"
  }
}
Arguments & Attributes Reference
- `admin_password` (String) Password used to log in to the Apache Airflow web UI under the `admin` user.
- `airflow_config` (Map of Map of String) Configuration of the Apache Airflow application itself. The value of this attribute is a two-level map: keys of the top-level map are the names of configuration sections, and keys of the inner maps are the names of configuration options within the corresponding section.
- `airflow_version` (String) Apache Airflow version in the format `<major>.<minor>`.
- `code_sync` (Block) Parameters of the location of, and access to, the code that is executed in the cluster.
  - `git_sync` (Block) Git repository that stores the DAG files used in the cluster.
    - `branch` (Required, String) The name of the branch that stores the DAG files used in the cluster.
    - `repo` (Required, String) The URL of the Git repository that stores the DAG files used in the cluster.
    - `ssh_key` (Required, String) The SSH key used to access the Git repository.
    - `sub_path` (Required, String) The path to the directory in the repository that stores the DAG files used in the cluster.
  - `s3` (Block) Currently only Object Storage (S3) is supported as the source of DAG files.
    - `bucket` (Required, String) The name of the Object Storage bucket that stores the DAG files used in the cluster.
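The usage example above syncs DAG files from an Object Storage bucket; the `git_sync` variant of `code_sync` pulls them from a Git repository instead. A minimal sketch, where the repository URL, branch, path, and key file are placeholder values:

```terraform
# Sketch: sync DAGs from a Git repository instead of an S3 bucket.
# The repository URL, branch, sub_path, and key path are placeholders.
code_sync = {
  git_sync = {
    repo     = "ssh://git@example.com/my-org/airflow-dags.git"
    branch   = "main"
    sub_path = "dags"
    ssh_key  = file("~/.ssh/airflow_dags_key")
  }
}
```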
- `created_at` (Read-only, String) The creation timestamp of the resource.
- `dag_processor` (Block) Configuration of dag-processor instances. Only for Airflow version 3.*.
  - `count` (Required, Number) The number of dag-processor instances in the cluster.
  - `resource_preset_id` (Required, String) The identifier of the preset for computational resources available to an instance (CPU, memory, etc.).
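The usage example above targets Airflow 2.x and therefore has no `dag_processor` block; for an Airflow 3.x cluster the block might look like the following sketch (the count and preset are illustrative, not recommendations):

```terraform
# Sketch: dag-processor instances, applicable only to Airflow 3.* clusters.
# The count and resource preset below are illustrative values.
dag_processor = {
  count              = 1
  resource_preset_id = "c1-m4"
}
```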
- `deb_packages` (Set of String) System packages installed in the cluster.
- `deletion_protection` (Bool) The `true` value means that the resource is protected from accidental deletion.
- `description` (String) The resource description.
- `folder_id` (String) The identifier of the folder the resource belongs to. If not provided, the default provider `folder-id` is used.
- `id` (Read-only, String) The resource identifier.
- `labels` (Map of String) A set of key/value label pairs assigned to the resource.
- `lockbox_secrets_backend` (Block) Configuration of the Lockbox Secrets Backend. See the documentation for details.
  - `enabled` (Required, Bool) Enables usage of the Lockbox Secrets Backend.
- `logging` (Block) Cloud Logging configuration.
  - `enabled` (Required, Bool) Enables delivery of logs generated by the Airflow components to Cloud Logging.
  - `folder_id` (String) Logs are written to the default log group of the specified folder. Exactly one of `folder_id` or `log_group_id` must be specified.
  - `log_group_id` (String) Logs are written to the specified log group. Exactly one of `folder_id` or `log_group_id` must be specified.
  - `min_level` (String) Minimum level of messages sent to Cloud Logging. One of `TRACE`, `DEBUG`, `INFO`, `WARN`, `ERROR`, or `FATAL`. If not set, the server default is applied (currently `INFO`).
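Since `folder_id` and `log_group_id` are mutually exclusive, a `logging` block that targets a specific log group omits `folder_id` entirely. A sketch, with a placeholder log group identifier:

```terraform
# Sketch: deliver logs to a specific log group rather than a folder's
# default group. The log group ID is a placeholder.
logging = {
  enabled      = true
  log_group_id = "<log_group_id>"
  min_level    = "WARN"
}
```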
- `maintenance_window` (Block) Configuration of the window for maintenance operations.
  - `day` (String) Day of week for the maintenance window. One of `MON`, `TUE`, `WED`, `THU`, `FRI`, `SAT`, `SUN`.
  - `hour` (Number) Hour of day (1-24) in the UTC time zone for the maintenance window.
  - `type` (String) Type of the maintenance window. Either `ANYTIME` or `WEEKLY`. If `WEEKLY`, `day` and `hour` must be specified.
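Because `WEEKLY` requires both `day` and `hour`, a weekly window is specified with all three attributes together. A sketch pinning maintenance to Sundays at 02:00 UTC (the day and hour are illustrative choices):

```terraform
# Sketch: restrict maintenance to a weekly window.
# Sunday at hour 2 (UTC) is an illustrative choice.
maintenance_window = {
  type = "WEEKLY"
  day  = "SUN"
  hour = 2
}
```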
- `name` (Required, String) The resource name.
- `pip_packages` (Set of String) Python packages installed in the cluster.
- `python_version` (String) Version of Python that Airflow runs on. Must be in the format `<major>.<minor>`.
- `scheduler` (Block) Configuration of scheduler instances.
  - `count` (Required, Number) The number of scheduler instances in the cluster.
  - `resource_preset_id` (Required, String) The identifier of the preset for computational resources available to an instance (CPU, memory, etc.).
- `security_group_ids` (Set of String) The list of security groups applied to the resource or its components.
- `service_account_id` (Required, String) Service account linked to the resource. For more information, see the documentation.
- `status` (Read-only, String) Status of the cluster. One of `CREATING`, `STARTING`, `RUNNING`, `UPDATING`, `STOPPING`, `STOPPED`, `ERROR`, or `STATUS_UNKNOWN`. For more information, see the `status` field of the JSON representation in the official documentation.
- `subnet_ids` (Required, Set of String) The list of identifiers of the VPC subnets the resource is attached to.
- `triggerer` (Block) Configuration of triggerer instances.
  - `count` (Required, Number) The number of triggerer instances in the cluster.
  - `resource_preset_id` (Required, String) The identifier of the preset for computational resources available to an instance (CPU, memory, etc.).
- `webserver` (Block) Configuration of webserver instances.
  - `count` (Required, Number) The number of webserver instances in the cluster.
  - `resource_preset_id` (Required, String) The identifier of the preset for computational resources available to an instance (CPU, memory, etc.).
- `worker` (Block) Configuration of worker instances.
  - `max_count` (Required, Number) The maximum number of worker instances in the cluster.
  - `min_count` (Required, Number) The minimum number of worker instances in the cluster.
  - `resource_preset_id` (Required, String) The identifier of the preset for computational resources available to an instance (CPU, memory, etc.).
Import
The resource can be imported using its resource ID, which you can obtain from the Yandex Cloud Web Console.
# terraform import yandex_airflow_cluster.<resource_name> <resource_id>
terraform import yandex_airflow_cluster.my_airflow_cluster enphq**********cjsw4