
Logstash

Written by Yandex Cloud
Updated on August 15, 2025

Note

You can create a trigger that will invoke a function in Cloud Functions or run a container in Serverless Containers when data is sent to a stream. Read more about triggers for Data Streams.

  1. Download and install Logstash.

  2. Install the plugin for AWS Kinesis Data Streams protocol support. This protocol will be used for data delivery.

    sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-kinesis
    

    Note

    The plugin uses the Amazon Kinesis Producer Library, which requires the Java Development Kit (JDK) to run. Download and install the JDK for your platform. Before starting Logstash, make sure JDK version 1.8.235 or higher is installed.
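
    For example, you can check the installed JDK version and confirm the plugin is present from the command line:

    # Print the installed JDK version.
    java -version

    # List installed Logstash plugins and filter for the Kinesis output plugin.
    sudo /usr/share/logstash/bin/logstash-plugin list | grep kinesis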

  3. In the management console, select the folder containing your data stream.

  4. Select Data Streams.

  5. Select the data stream.

  6. Click Connect and navigate to the Logstash tab.

  7. Copy the example configuration and paste it into the /usr/share/logstash/bin/mypipeline.conf file.

    Example configuration:

    input {
      http {
        port => 8888
      }
    }
    output {
      stdout { codec => rubydebug }
      kinesis {
        stream_name => "/ru-central1/aoegtvhtp8ob********/cc8004q4lbo6********/test"
        region => "ru-central-1"
        verify_certificate => false
        codec => json_lines
        randomized_partition_key => true
        access_key => "<access_key_ID>"
        secret_key => "<secret_key>"
        metrics_level => "none"
        endpoint => "https://yds.serverless.yandexcloud.net"
      }
    }
    

    Where:

    • <access_key_ID>: Static access key ID.
    • <secret_key>: Secret part of the static access key.
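
    If you do not yet have a static access key, one way to create it is with the YC CLI (a sketch, assuming the YC CLI is installed and a service account with access to the stream already exists; the secret part is shown only once in the response):

    yc iam access-key create --service-account-name <service_account_name>
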
  8. Start data delivery:

    sudo /usr/share/logstash/bin/logstash -f mypipeline.conf
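
    If the pipeline fails to start, you can check the configuration file for syntax errors first using Logstash's standard configuration test mode:

    sudo /usr/share/logstash/bin/logstash -f mypipeline.conf --config.test_and_exit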
    
  9. Send test data to Logstash:

    curl \
      --request PUT 'http://127.0.0.1:8888/kinesis' \
      --header "content-type: application/json" \
      --data '{"user_id":"user1", "score": 100}'
    

    If the setup is successful, the Logstash console will show a message confirming that the data was received and sent to Data Streams over the AWS Kinesis Data Streams protocol:

    {
      "@version" => "1",
      "headers" => {
        "request_path" => "/kinesis",
        "http_version" => "HTTP/1.1",
        "content_type" => "application/json",
        "http_host" => "127.0.0.1:8888",
        "http_accept" => "*/*",
        "request_method" => "PUT",
        "content_length" => "18",
        "http_user_agent" => "curl/7.68.0"
      },
      "host" => "127.0.0.1",
      "json" => "message"
    }
    Stage 1 Triggers: { stream: '/ru-central1/aoeu1kuk2dht********/cc8029jgtuab********/logstash_stream', manual: 0, count: 0, size: 0, matches: 0, timed: 0, UserRecords: 0, KinesisRecords: 0 }
    Stage 2 Triggers: { stream: '/ru-central1/aoeu1kuk2dht********/cc8029jgtuab********/logstash_stream', manual: 0, count: 0, size: 0, matches: 0, timed: 1, KinesisRecords: 1, PutRecords: 1 }
    (test) Average Processing Time: 723 ms
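
    To additionally confirm that the records reached the stream, you can read them back over the same Kinesis-compatible endpoint, for example with the AWS CLI (a sketch, assuming the AWS CLI is installed and configured with the same static access key; the stream path is the one from the example configuration, and the shard ID and iterator values come from the preceding commands' output):

    # List the stream and its shards; note a shard ID from the output.
    aws kinesis describe-stream \
      --endpoint-url https://yds.serverless.yandexcloud.net \
      --stream-name "/ru-central1/aoegtvhtp8ob********/cc8004q4lbo6********/test"

    # Get an iterator for that shard, starting from the oldest record.
    aws kinesis get-shard-iterator \
      --endpoint-url https://yds.serverless.yandexcloud.net \
      --stream-name "/ru-central1/aoegtvhtp8ob********/cc8004q4lbo6********/test" \
      --shard-id <shard_ID> \
      --shard-iterator-type TRIM_HORIZON

    # Read the records using the iterator returned by the previous command.
    aws kinesis get-records \
      --endpoint-url https://yds.serverless.yandexcloud.net \
      --shard-iterator <shard_iterator>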
    
