© 2025 Direct Cursus Technology L.L.C.
Yandex Data Streams

Fluentd

Written by
Yandex Cloud
Updated at October 28, 2024

Note

You can create a trigger that launches a function in Cloud Functions or a container in Serverless Containers whenever data is sent to the stream. Read more about triggers for Data Streams.

  1. Download and install Fluentd.

  2. Install the Fluentd plugin that adds support for the AWS Kinesis Data Streams protocol, which will be used to stream the data.

    sudo td-agent-gem install fluent-plugin-kinesis
    
  3. In the management console, select the folder with the stream.

  4. Select Data Streams.

  5. Select the data stream.

  6. Click Connect and go to the Fluentd tab.

  7. Copy the configuration file example and paste it into the /etc/td-agent/td-agent.conf file.

    Example of the configuration file:

    <system>
      log_level debug
    </system>
    <source>
      @type http
      @id input_http
      port 8888
    </source>
    <match kinesis>
      @type copy
      <store>
        @type stdout
      </store>
      <store>
        @type kinesis_streams
    
        aws_key_id <access_key_ID>
        aws_sec_key <secret_key>
    
        # kinesis stream name
        stream_name /ru-central1/aoegtvhtp8ob********/cc8004q4lbo6********/test
    
        # region
        region ru-central-1
    
        endpoint https://yds.serverless.yandexcloud.net
    
        <buffer>
          flush_interval 5s
        </buffer>
      </store>
    </match>
    

    Where:

    • <access_key_ID>: Static access key ID
    • <secret_key>: Secret part of the static access key
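    The stream_name in the configuration above is not just the stream's short name: it packs the region, cloud folder ID, YDB database ID, and the stream's own name into one slash-separated path. A minimal sketch of splitting such a name into its parts (the field names below are descriptive assumptions, not official API terms):

    ```python
    from typing import NamedTuple

    class StreamName(NamedTuple):
        region: str
        folder_id: str
        database_id: str
        stream: str

    def parse_stream_name(full_name: str) -> StreamName:
        """Split a Data Streams name of the form
        /<region>/<folder_id>/<database_id>/<stream> into its components."""
        parts = full_name.strip("/").split("/")
        if len(parts) != 4:
            raise ValueError(f"unexpected stream name format: {full_name!r}")
        return StreamName(*parts)

    name = parse_stream_name("/ru-central1/aoegtvhtp8ob********/cc8004q4lbo6********/test")
    print(name.region, name.stream)  # → ru-central1 test
    ```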
  8. Send the test data to Fluentd:

    curl \
     --request POST \
     --data 'json={"user_id":"user1", "score": 100}' \
     http://localhost:8888/kinesis
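
    The same test record can be sent from code. Fluentd's http input takes the URL path as the event tag (here, kinesis, matched by the <match kinesis> section) and accepts a form field named json carrying the serialized record. A stdlib-only sketch of building that request (host and port match the config above):

    ```python
    import json
    from urllib import parse, request

    def build_fluentd_request(record: dict, tag: str = "kinesis",
                              base_url: str = "http://localhost:8888") -> request.Request:
        # Fluentd's in_http plugin accepts a form-encoded 'json=<record>' body;
        # the URL path becomes the event tag routed by the <match> sections.
        body = parse.urlencode({"json": json.dumps(record)}).encode()
        return request.Request(f"{base_url}/{tag}", data=body, method="POST")

    req = build_fluentd_request({"user_id": "user1", "score": 100})
    # request.urlopen(req)  # uncomment with Fluentd running locally
    ```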
    

    If the setup is successful, the Fluentd operation log (/var/log/td-agent/td-agent.log) will contain messages confirming that the data was received and sent to Yandex Data Streams over the AWS Kinesis Data Streams protocol:

    kinesis: {"json":"message"}
    DEBUG -- : [Aws::Kinesis::Client 200 0.628973 0 retries] put_records(stream_name:"/ru-central1/aoeu1kuk2dht********/cc8029jgtuab********/fluentd_stream",records:[{data:"{\"message\":\"Write chunk 5c0cf5c556654e99cac84*********** /   2 records /    0 KB\"}\n",partition_key:"6ec03a4e3ba832c85e802***********"},{data:"{\"message\":\"Finish writing chunk\"}\n",partition_key:"8ada32f7373e1ab4c48fb***********"},{data:"{\"json\":\"message\"}\n",partition_key:"70f21f2decfc90b6f1975***********"}])
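
    To double-check delivery end to end, the stream can be read back over the same Kinesis-compatible endpoint, e.g. with boto3 (an assumption: any AWS SDK Kinesis client should work against the endpoint above; the call sequence below is the standard Kinesis consumer API). Record payloads come back as raw bytes, one JSON document per record as written by the kinesis_streams output:

    ```python
    import json

    def decode_records(records: list[dict]) -> list[dict]:
        # Each Kinesis record carries its payload in the 'Data' field as bytes;
        # fluent-plugin-kinesis writes one newline-terminated JSON document per record.
        return [json.loads(r["Data"]) for r in records]

    def read_stream(stream_name: str, region: str = "ru-central-1") -> list[dict]:
        import boto3  # requires: pip install boto3
        client = boto3.client(
            "kinesis",
            endpoint_url="https://yds.serverless.yandexcloud.net",
            region_name=region,
        )
        shard_id = client.describe_stream(StreamName=stream_name)[
            "StreamDescription"]["Shards"][0]["ShardId"]
        it = client.get_shard_iterator(
            StreamName=stream_name, ShardId=shard_id,
            ShardIteratorType="TRIM_HORIZON")["ShardIterator"]
        return decode_records(client.get_records(ShardIterator=it)["Records"])

    # Offline demonstration of the decoding step:
    sample = [{"Data": b'{"user_id": "user1", "score": 100}'}]
    print(decode_records(sample))  # → [{'user_id': 'user1', 'score': 100}]
    ```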
    
