© 2026 Direct Cursus Technology L.L.C.
Yandex Managed Service for Apache Kafka®


Connecting to an Apache Kafka® cluster from applications

Written by
Yandex Cloud
Updated at February 6, 2026
  • Command line tools
    • kafkacat
    • Apache Kafka® tools for Linux (Bash)/macOS (Zsh)
    • Apache Kafka® tools for Windows (PowerShell)
  • Before you connect from a Docker container

This section provides settings for connecting to Managed Service for Apache Kafka® cluster hosts using command-line tools and from a Docker container. To learn how to connect from your application code, see Code examples.

You can only connect to public Apache Kafka® cluster hosts using an SSL certificate. The examples below assume that the YandexInternalRootCA.crt certificate is located in this directory:

  • /usr/local/share/ca-certificates/Yandex/ for Ubuntu.
  • $HOME\.kafka\ for Windows.

Connecting without an SSL certificate is only supported for non-public hosts. In this case, traffic within the internal virtual network is not encrypted when connecting to the cluster.

Before connecting, configure security groups for the cluster, if required.

The examples for Linux were tested in the following environment:

  • Yandex Cloud VM running Ubuntu 20.04 LTS.
  • OpenJDK: 11.0.24.
  • Bash: 5.0.16.

The examples for Windows were tested in the following environment:

  • Yandex Cloud virtual machine running Windows Server 2019 Datacenter.
  • Microsoft OpenJDK: 11.0.11.
  • PowerShell: 5.1.17763.1490 Desktop.

Command line tools

To see code examples with the host FQDN filled in, open the cluster page in the management console and click Connect.

kafkacat

kafkacat (also known as kcat) is an open-source tool for producing and consuming messages that does not require a Java Runtime Environment.

Before connecting, install the required dependencies:

sudo apt update && sudo apt install -y kafkacat

Note

On Ubuntu 24.04 or higher, use kcat.

Connecting without SSL
  1. Run the following command to receive messages from the topic:

    kafkacat -C \
             -b <broker_FQDN>:9092 \
             -t <topic_name> \
             -X security.protocol=SASL_PLAINTEXT \
             -X sasl.mechanism=SCRAM-SHA-512 \
             -X sasl.username="<consumer_login>" \
             -X sasl.password="<consumer_password>" -Z
    

    This command will continuously read new messages from the topic.

  2. In a separate terminal, run the following command to send a message to the topic:

    echo "test message" | kafkacat -P \
           -b <broker_FQDN>:9092 \
           -t <topic_name> \
           -k key \
           -X security.protocol=SASL_PLAINTEXT \
           -X sasl.mechanism=SCRAM-SHA-512 \
           -X sasl.username="<producer_login>" \
           -X sasl.password="<producer_password>" -Z
    
Connecting with SSL

  1. Run the following command to receive messages from the topic:

    kafkacat -C \
             -b <broker_FQDN>:9091 \
             -t <topic_name> \
             -X security.protocol=SASL_SSL \
             -X sasl.mechanism=SCRAM-SHA-512 \
             -X sasl.username="<consumer_username>" \
             -X sasl.password="<consumer_password>" \
             -X ssl.ca.location=/usr/local/share/ca-certificates/Yandex/YandexInternalRootCA.crt -Z -K:
    

    This command will continuously read new messages from the topic.

  2. In a separate terminal, run the following command to send a message to the topic:

    echo "test message" | kafkacat -P \
        -b <broker_FQDN>:9091 \
        -t <topic_name> \
        -k key \
        -X security.protocol=SASL_SSL \
        -X sasl.mechanism=SCRAM-SHA-512 \
        -X sasl.username="<producer_login>" \
        -X sasl.password="<producer_password>" \
        -X ssl.ca.location=/usr/local/share/ca-certificates/Yandex/YandexInternalRootCA.crt -Z
    

To learn how to get a broker host FQDN, see this guide.

Make sure the first terminal displays key:test message sent in the second terminal.

Apache Kafka® tools for Linux (Bash)/macOS (Zsh)

The Apache Kafka® binary distribution includes command-line tools for managing the cluster and its entities. The examples below show how to provide user credentials for the connection and then use the tools with them:

  • A message will be sent to the topic using kafka-console-producer.
  • A message will be received from the topic using kafka-console-consumer.

Before connecting:

  1. Install OpenJDK:

    sudo apt update && sudo apt install --yes default-jdk
    
  2. Download the archive with binary files for the Apache Kafka® version running in your cluster. Your Scala version is irrelevant.

  3. Unpack the archive.

Connecting without SSL
  1. Create two configuration files to connect to the cluster: one for the producer and one for the consumer.

    These files have the same content and differ only in user credentials:

    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
      username="<producer_or_consumer_login>" \
      password="<producer_or_consumer_password>";
    security.protocol=SASL_PLAINTEXT
    
  2. Run the following command to receive messages from the topic:

    <path_to_directory_with_Apache_Kafka_files>/bin/kafka-console-consumer.sh \
      --consumer.config <path_to_file_with_consumer_configuration> \
      --bootstrap-server <broker_FQDN>:9092 \
      --topic <topic_name> \
      --property print.key=true \
      --property key.separator=":"
    

    This command will continuously read new messages from the topic.

  3. In a separate terminal, run the following command to send a message to the topic:

    echo "key:test message" | <path_to_directory_with_Apache_Kafka_files>/bin/kafka-console-producer.sh \
      --producer.config <path_to_file_with_producer_configuration> \
      --bootstrap-server <broker_FQDN>:9092 \
      --topic <topic_name> \
      --property parse.key=true \
      --property key.separator=":"
    
Connecting with SSL

  1. Go to the folder where the Java certificate store will reside:

    cd /etc/security
    
  2. Add the SSL certificate to the Java trusted certificate store (Java Key Store) so that the Apache Kafka® driver can use this certificate for secure connections to the cluster hosts. Set a password of at least 6 characters using the -storepass parameter for additional storage protection:

sudo keytool -importcert \
             -alias YandexCA -file /usr/local/share/ca-certificates/Yandex/YandexInternalRootCA.crt \
             -keystore ssl -storepass <certificate_store_password> \
             -noprompt
    
  3. Create two configuration files to connect to the cluster: one for the producer and one for the consumer.

    These files have the same content and differ only in user credentials:

    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
      username="<producer_or_consumer_login>" \
      password="<producer_or_consumer_password>";
    security.protocol=SASL_SSL
    ssl.truststore.location=/etc/security/ssl
    ssl.truststore.password=<certificate_store_password>
    
  4. Run the following command to receive messages from the topic:

    <path_to_directory_with_Apache_Kafka_files>/bin/kafka-console-consumer.sh \
      --consumer.config <path_to_file_with_consumer_configuration> \
      --bootstrap-server <broker_FQDN>:9091 \
      --topic <topic_name> \
      --property print.key=true \
      --property key.separator=":"
    

    This command will continuously read new messages from the topic.

  5. In a separate terminal, run the following command to send a message to the topic:

    echo "key:test message" | <path_to_directory_with_Apache_Kafka_files>/bin/kafka-console-producer.sh \
      --producer.config <path_to_file_with_producer_configuration> \
      --bootstrap-server <broker_FQDN>:9091 \
      --topic <topic_name> \
      --property parse.key=true \
      --property key.separator=":"
    

To learn how to get a broker host FQDN, see this guide.

Make sure the first terminal displays key:test message sent in the second terminal.
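The commands above rely on parse.key=true and key.separator=":" to split each producer input line into a message key and value. As an illustration only, here is a minimal Python sketch of that splitting rule (the function name is ours, not part of Apache Kafka®):

```python
def parse_keyed_message(line: str, separator: str = ":") -> tuple[str, str]:
    # Split console-producer input into (key, value) at the first separator,
    # mirroring parse.key=true with key.separator=":" in the commands above.
    key, _, value = line.partition(separator)
    return key, value

print(parse_keyed_message("key:test message"))  # → ('key', 'test message')
```

The consumer started with print.key=true and the same separator then prints the pair back as key:test message.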

Apache Kafka® tools for Windows (PowerShell)

The Apache Kafka® binary distribution includes command-line tools for managing the cluster and its entities. The examples below show how to provide user credentials for the connection and then use the tools with them:

  • A message will be sent to the topic using kafka-console-producer.
  • A message will be received from the topic using kafka-console-consumer.

Although the tool documentation refers to .sh scripts, it is relevant for Windows as well: the tools are identical across platforms, and only the wrapper scripts that run them differ:

  • bin/kafka-console-producer.sh for Linux (Bash)/macOS (Zsh).
  • bin\windows\kafka-console-producer.bat for Windows (PowerShell).

Before connecting:

  1. Install the latest available version of Microsoft OpenJDK.

  2. Download the archive with binary files for the Apache Kafka® version running in your cluster. Your Scala version is irrelevant.

  3. Unpack the archive.

    Tip

    Unpack the Apache Kafka® files to the disk root folder, e.g., C:\kafka_2.12-2.6.0\.

If the path to the Apache Kafka® executables and batch files is too long, you will get The input line is too long error when trying to run them.

Connecting without SSL
  1. Create two configuration files to connect to the cluster: one for the producer and one for the consumer.

    These files have the same content and differ only in user credentials:

    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
      username="<producer_or_consumer_login>" \
      password="<producer_or_consumer_password>";
    security.protocol=SASL_PLAINTEXT
    
  2. Run the following command to receive messages from the topic:

    <path_to_directory_with_Apache_Kafka_files>\bin\windows\kafka-console-consumer.bat `
        --consumer.config <path_to_file_with_consumer_configuration> `
        --bootstrap-server <broker_FQDN>:9092 `
        --topic <topic_name> `
        --property print.key=true `
        --property key.separator=":"
    

    This command will continuously read new messages from the topic.

  3. In a separate terminal, run the following command to send a message to the topic:

    echo "key:test message" | <path_to_directory_with_Apache_Kafka_files>\bin\windows\kafka-console-producer.bat `
        --producer.config <path_to_file_with_producer_configuration> `
        --bootstrap-server <broker_FQDN>:9092 `
        --topic <topic_name> `
        --property parse.key=true `
        --property key.separator=":"
    
Connecting with SSL

  1. Add the SSL certificate to the Java trusted certificate store (Java Key Store) so that the Apache Kafka® driver can use this certificate for secure connections to the cluster hosts. Set a password of at least 6 characters in the -storepass parameter for additional storage protection:

    keytool.exe -importcert -alias YandexCA `
      -file $HOME\.kafka\YandexInternalRootCA.crt `
      -keystore $HOME\.kafka\ssl `
      -storepass <certificate_store_password> `
      -noprompt
    
  2. Create two configuration files to connect to the cluster: one for the producer and one for the consumer.

    These files have the same content and differ only in user credentials:

    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
      username="<producer_or_consumer_login>" \
      password="<producer_or_consumer_password>";
    security.protocol=SASL_SSL
    ssl.truststore.location=<$HOME_variable_value>\\.kafka\\ssl
    ssl.truststore.password=<certificate_store_password>
    

    Specify the full path to the certificate store as the ssl.truststore.location parameter value. Here is an example:

    ssl.truststore.location=C:\\Users\\Administrator\\.kafka\\ssl
    

The certificate store is located at $HOME\.kafka\ssl, but you cannot use environment variables in the ssl.truststore.location value. To get the expanded value of $HOME, run this command:

    echo $HOME
    

    Warning

    Use \\ instead of \ when specifying the ssl.truststore.location parameter value, otherwise you will not be able to access the certificate store when running commands.

  3. Run the following command to receive messages from the topic:

    <path_to_directory_with_Apache_Kafka_files>\bin\windows\kafka-console-consumer.bat `
        --consumer.config <path_to_file_with_consumer_configuration> `
        --bootstrap-server <broker_FQDN>:9091 `
        --topic <topic_name> `
        --property print.key=true `
        --property key.separator=":"
    

    This command will continuously read new messages from the topic.

  4. In a separate terminal, run the following command to send a message to the topic:

    echo "key:test message" | <path_to_directory_with_Apache_Kafka_files>\bin\windows\kafka-console-producer.bat `
        --producer.config <path_to_file_with_producer_configuration> `
        --bootstrap-server <broker_FQDN>:9091 `
        --topic <topic_name> `
        --property parse.key=true `
        --property key.separator=":"
    

To learn how to get a broker host FQDN, see this guide.

Make sure the first terminal displays key:test message sent in the second terminal.
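As noted in the warning above, each backslash in the ssl.truststore.location value must be doubled. As an illustration only, here is a minimal Python sketch of that escaping rule (the function name is ours, not part of Apache Kafka®):

```python
def escape_properties_path(path: str) -> str:
    # In Java .properties files a backslash starts an escape sequence,
    # so every "\" in a Windows path must be written as "\\".
    return path.replace("\\", "\\\\")

print(escape_properties_path(r"C:\Users\Administrator\.kafka\ssl"))
# → C:\\Users\\Administrator\\.kafka\\ssl
```

Applying this to the output of echo $HOME plus \.kafka\ssl yields a value you can paste directly into the configuration file.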

Before you connect from a Docker container

To connect to a Managed Service for Apache Kafka® cluster from a Docker container, add the following lines to your Dockerfile:

Connecting without SSL
RUN apt-get update && \
    apt-get install kafkacat --yes
Connecting with SSL

RUN apt-get update && \
    apt-get install wget kafkacat --yes && \
    mkdir --parents /usr/local/share/ca-certificates/Yandex/ && \
    wget "https://storage.yandexcloud.net/cloud-certs/CA.pem" \
         --output-document /usr/local/share/ca-certificates/Yandex/YandexInternalRootCA.crt && \
    chmod 0655 /usr/local/share/ca-certificates/Yandex/YandexInternalRootCA.crt
