yandex_mdb_kafka_connector (Resource)
Manages a connector of a Kafka cluster within the Yandex Cloud. For more information, see the official documentation.
Example usage
//
// Create a new MDB Kafka Connector.
//
resource "yandex_mdb_kafka_connector" "my_conn" {
  cluster_id = yandex_mdb_kafka_cluster.my_cluster.id
  name       = "replication"
  tasks_max  = 3

  properties = {
    "refresh.topics.enabled" = "true"
  }

  connector_config_mirrormaker {
    topics             = "data.*"
    replication_factor = 1

    source_cluster {
      alias = "source"
      external_cluster {
        bootstrap_servers = "somebroker1:9091,somebroker2:9091"
        sasl_username     = "someuser"
        sasl_password     = "somepassword"
        sasl_mechanism    = "SCRAM-SHA-512"
        security_protocol = "SASL_SSL"
      }
    }

    target_cluster {
      alias = "target"
      this_cluster {}
    }
  }
}
resource "yandex_mdb_kafka_connector" "connector" {
cluster_id = yandex_mdb_kafka_cluster.my_cluster.id
name = "s3-sink"
tasks_max = 3
properties = {
"key.converter" = "org.apache.kafka.connect.storage.StringConverter"
"value.converter" = "org.apache.kafka.connect.json.JsonConverter"
"value.converter.schemas.enable" = "false"
"format.output.type" = "jsonl"
"file.name.template" = "dir1/dir2/{{topic}}-not_var{{partition:padding=true}}-not_var{{start_offset:padding=true}}.gz"
"timestamp.timezone" = "Europe/Moscow"
}
connector_config_s3_sink {
topics = "data.*"
file_compression_type = "gzip"
file_max_records = 100
s3_connection {
bucket_name = "somebucket"
external_s3 {
endpoint = "storage.yandexcloud.net"
access_key_id = "some_access_key_id"
secret_access_key = "some_secret_access_key"
}
}
}
}
resource "yandex_mdb_kafka_cluster" "my_cluster" {
name = "foo"
network_id = "c64vs98keiqc7f24pvkd"
config {
version = "2.8"
zones = ["ru-central1-a"]
kafka {
resources {
resource_preset_id = "s2.micro"
disk_type_id = "network-hdd"
disk_size = 16
}
}
}
}
Schema
Required
- cluster_id (String) The ID of the Kafka cluster.
- name (String) The resource name.
Optional
- connector_config_mirrormaker (Block List) Settings for MirrorMaker2 connector. (see below for nested schema)
- connector_config_s3_sink (Block List) Settings for S3 Sink connector. (see below for nested schema)
- properties (Map of String) Additional properties for the connector.
- tasks_max (Number) The number of the connector's parallel working tasks. Default is the number of brokers.
Read-Only
- id (String) The ID of this resource.
Nested Schema for connector_config_mirrormaker
Required:
- replication_factor (Number) Replication factor for topics created in target cluster.
- source_cluster (Block List, Min: 1, Max: 1) Settings for source cluster. (see below for nested schema)
- target_cluster (Block List, Min: 1, Max: 1) Settings for target cluster. (see below for nested schema)
- topics (String) The pattern for topic names to be replicated.
Nested Schema for connector_config_mirrormaker.source_cluster
Optional:
- alias (String) Name of the cluster. Used also as a topic prefix.
- external_cluster (Block List) Connection settings for external cluster. (see below for nested schema)
- this_cluster (Block List) Using this section in the cluster definition (source or target) means it's this cluster. (see below for nested schema)
Nested Schema for connector_config_mirrormaker.source_cluster.external_cluster
Required:
- bootstrap_servers (String) List of bootstrap servers to connect to the cluster.
Optional:
- sasl_mechanism (String) Type of SASL authentication mechanism to use.
- sasl_password (String, Sensitive) Password to use in the SASL authentication mechanism.
- sasl_username (String) Username to use in the SASL authentication mechanism.
- security_protocol (String) Security protocol to use.
Nested Schema for connector_config_mirrormaker.source_cluster.this_cluster
Nested Schema for connector_config_mirrormaker.target_cluster
Optional:
- alias (String) Name of the cluster. Used also as a topic prefix.
- external_cluster (Block List) Connection settings for external cluster. (see below for nested schema)
- this_cluster (Block List) Using this section in the cluster definition (source or target) means it's this cluster. (see below for nested schema)
Nested Schema for connector_config_mirrormaker.target_cluster.external_cluster
Required:
- bootstrap_servers (String) List of bootstrap servers to connect to the cluster.
Optional:
- sasl_mechanism (String) Type of SASL authentication mechanism to use.
- sasl_password (String, Sensitive) Password to use in the SASL authentication mechanism.
- sasl_username (String) Username to use in the SASL authentication mechanism.
- security_protocol (String) Security protocol to use.
Nested Schema for connector_config_mirrormaker.target_cluster.this_cluster
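The this_cluster blocks above contain no attributes: their presence alone marks the managed cluster itself as that side of the replication. As a hedged complement to the first example on this page (which replicates from an external cluster into this one), the sketch below reverses the direction, using this_cluster as the source and an external cluster as the target; all broker addresses and credentials are placeholders.
resource "yandex_mdb_kafka_connector" "mirror_out" {
  cluster_id = yandex_mdb_kafka_cluster.my_cluster.id
  name       = "replication-out"
  tasks_max  = 3

  connector_config_mirrormaker {
    topics             = "data.*"
    replication_factor = 1

    source_cluster {
      alias = "source"
      # The managed cluster itself is the source of replication.
      this_cluster {}
    }

    target_cluster {
      alias = "target"
      # Placeholder connection settings for the external target cluster.
      external_cluster {
        bootstrap_servers = "targetbroker1:9091,targetbroker2:9091"
        sasl_username     = "someuser"
        sasl_password     = "somepassword"
        sasl_mechanism    = "SCRAM-SHA-512"
        security_protocol = "SASL_SSL"
      }
    }
  }
}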
Nested Schema for connector_config_s3_sink
Required:
- file_compression_type (String) Compression type for messages. Cannot be changed.
- s3_connection (Block List, Min: 1, Max: 1) Settings for connection to s3-compatible storage. (see below for nested schema)
- topics (String) The pattern for topic names to be copied to the s3 bucket.
Optional:
- file_max_records (Number) Max records per file.
Nested Schema for connector_config_s3_sink.s3_connection
Required:
- bucket_name (String) Name of the bucket in s3-compatible storage.
- external_s3 (Block List, Min: 1) Connection params for external s3-compatible storage. (see below for nested schema)
Nested Schema for connector_config_s3_sink.s3_connection.external_s3
Required:
- endpoint (String) URL of s3-compatible storage.
Optional:
- access_key_id (String) ID of aws-compatible static key.
- region (String) Region of s3-compatible storage. See the documentation for the list of available regions.
- secret_access_key (String, Sensitive) Secret key of aws-compatible static key.
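For illustration only (every value below is a placeholder), an s3_connection block that points at an external s3-compatible storage and sets the region explicitly could look like the following fragment, placed inside connector_config_s3_sink:
s3_connection {
  bucket_name = "somebucket"
  external_s3 {
    endpoint          = "storage.yandexcloud.net"
    region            = "ru-central1"            # example region; check the available region list
    access_key_id     = "some_access_key_id"     # placeholder static key ID
    secret_access_key = "some_secret_access_key" # placeholder secret; avoid committing real keys
  }
}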
Import
The resource can be imported by using its resource ID. To get the resource ID, you can use the Yandex Cloud Web Console.
# terraform import yandex_mdb_kafka_connector.<resource Name> <resource Id>
terraform import yandex_mdb_kafka_connector.my_conn ...
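With Terraform 1.5 or later, the same import can also be declared with an import block; the ID below is only a placeholder for the real resource ID taken from the Yandex Cloud Web Console:
import {
  # Declarative import (Terraform >= 1.5). Replace the placeholder with the real resource ID.
  to = yandex_mdb_kafka_connector.my_conn
  id = "<resource Id>"
}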