

Entering data into storage systems

Written by
Yandex Cloud
Updated at April 18, 2025
  • Benefits
    • Receiving data
    • Reliability
    • Batching
    • Rewinding data
    • Multiple storage systems
    • Masking data and processing logs
    • Reading data
  • Configuration

Mobile phones, various smart devices, and external services are increasingly replacing application components as data sources.

Such sources supply data in a massive number of small batches. The communication channels they use are often slow, and their connection time is limited. Under these conditions, you want to save incoming data quickly; processing it can wait. This is why the data first goes to a data streaming bus, from which it is then collected for processing.

As a data streaming bus, Yandex Data Streams provides optimal operation modes for sources and targets:

  • Accepts incoming data with high frequency and speed without blocking the sources.
  • Saves the received data in its own storage.
  • Generates data batches and sends them to target systems, reducing the load on them.

Benefits

When working with external devices or services, you want to quickly save the data you receive. You can fetch the saved data from Data Streams through direct reads or by setting up data delivery to Yandex Cloud storage systems using Yandex Data Transfer.

Receiving data

Data is transmitted to Data Streams over HTTP. Using Yandex API Gateway, you can implement any protocol for incoming data. Data received in API Gateway can be forwarded to Data Streams as well.

Data Streams is highly scalable and can accept data from thousands of data sources at the same time.
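Since Data Streams accepts data over the Amazon Kinesis-compatible HTTP API, a producer can be sketched with the AWS SDK for Python. The endpoint URL and the stream path in the usage note are illustrative; take the real values from the stream's page in the management console.

```python
import json


def encode_event(event: dict) -> bytes:
    """Serialize an event to the bytes payload expected by put_record."""
    return json.dumps(event, separators=(",", ":")).encode("utf-8")


def make_client():
    """Create a Kinesis-compatible client pointed at Data Streams.

    Credentials are picked up from the standard AWS configuration
    (environment variables or ~/.aws/credentials).
    """
    import boto3  # AWS SDK for Python, installed separately

    return boto3.client(
        "kinesis",
        endpoint_url="https://yds.serverless.yandexcloud.net",  # illustrative
        region_name="ru-central1",
    )


def send_event(client, stream_name: str, event: dict, partition_key: str):
    """Write one record; records sharing a partition key go to the same shard."""
    return client.put_record(
        StreamName=stream_name,
        Data=encode_event(event),
        PartitionKey=partition_key,
    )
```

A producer would then call `send_event(make_client(), "/ru-central1/<folder_id>/<db_id>/<stream_name>", {"device": "sensor-1", "t": 21.5}, "sensor-1")`.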

Reliability

A data streaming bus is a critical infrastructure component that must withstand Yandex Cloud failures of any type. To ensure this, data written to Data Streams is saved to at least three Yandex Cloud availability zones.

Batching

Data storage and processing systems perform best when data is written to them in batches. Batches are formed most efficiently at the single point all the data flows through, a role commonly played by data buses.
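Forming batches on the producer side can be sketched as a simple chunker. The limits below (500 records and 5 MB per call) are the ones commonly documented for the Kinesis-style PutRecords request and are given here as assumptions:

```python
def make_batches(records, max_records=500, max_bytes=5 * 1024 * 1024):
    """Split an iterable of bytes payloads into batches that stay within
    per-call limits on record count and total size."""
    batch, batch_bytes = [], 0
    for record in records:
        if batch and (len(batch) >= max_records or batch_bytes + len(record) > max_bytes):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(record)
        batch_bytes += len(record)
    if batch:
        yield batch
```

Each yielded batch can then be written with a single `put_records` call instead of one request per record.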

Rewinding data

Unlike message queues, data buses store data until the retention period expires, without deleting it after it is read. This lets you move through the stored data in either direction, from the oldest records to the most recent. For example, if a new data format appears and is written to the target system incorrectly, you can rewind the stream stored in the bus to the beginning, then reread it and write it to the target system correctly.
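In the Kinesis-compatible API, the reading position is controlled by the shard iterator type passed to `get_shard_iterator`. A small helper (the names follow the standard Kinesis API; the replay scenario in the comment is an assumption) makes the rewinding options explicit:

```python
from datetime import datetime, timezone


def iterator_args(stream_name, shard_id, mode="TRIM_HORIZON", timestamp=None):
    """Build the arguments for a get_shard_iterator call.

    TRIM_HORIZON rewinds to the oldest record still within the retention
    period, LATEST skips to newly arriving data only, and AT_TIMESTAMP
    resumes from a point in time, e.g. just before records that must be
    reread and rewritten.
    """
    args = {
        "StreamName": stream_name,
        "ShardId": shard_id,
        "ShardIteratorType": mode,
    }
    if mode == "AT_TIMESTAMP":
        args["Timestamp"] = timestamp or datetime.now(timezone.utc)
    return args
```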

Multiple storage systems

The same data is often kept in multiple storage systems at once: in ClickHouse® for fast analysis and in Object Storage for long-term storage. Data buses make this easy: since different applications can read the same data concurrently, you can set up delivery of the same stream to both ClickHouse® and Object Storage. This also lets you add a third storage system, such as Greenplum® or Elasticsearch, at any time.

Storing data in multiple systems is also convenient for complying with FZ-152, PCI DSS, and other standards that require storing data for at least a year. In that case, the last month's data can go to one storage system for quick access, while the rest goes to cold long-term storage in Object Storage.

Masking data and processing logs

Some data must not be accessible to all employees, such as records containing users' personal data, access to which must be restricted.

Transmitted data can be sent for processing to Cloud Functions where it can be masked or handled in any other way.

Once processed, the data can be sent to multiple target systems at once: all employees can be granted access to the version with masked personal data, while only administrators can access the full data.
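A masking step of this kind, for example inside a Cloud Functions handler, can be sketched as follows. The set of fields treated as personal data is hypothetical:

```python
import hashlib
import json

# Hypothetical set of fields treated as personal data.
SENSITIVE_FIELDS = {"email", "phone", "full_name"}


def mask_value(value: str) -> str:
    """Replace a value with a short stable hash so records stay joinable."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]


def mask_record(raw: bytes) -> bytes:
    """Mask the personal-data fields of one JSON-encoded record."""
    event = json.loads(raw)
    for field in SENSITIVE_FIELDS & event.keys():
        event[field] = mask_value(str(event[field]))
    return json.dumps(event, separators=(",", ":")).encode("utf-8")
```

The masked copy of the stream can then be delivered to the storage system available to all employees, while the original stream goes to the restricted one.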

Reading data

You can process the data saved in Data Streams programmatically. Data Streams is compatible with the Amazon Kinesis Data Streams API, allowing you to use SDKs for different programming languages: C++, Java, Go, Python, etc.
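A direct read through the Kinesis-compatible API boils down to obtaining a shard iterator and polling `get_records`. This is a sketch: the client is a boto3 Kinesis client pointed at the Data Streams endpoint, and the shard ID format is an assumption.

```python
import json


def decode_records(records):
    """Turn entries returned by get_records back into Python objects."""
    return [json.loads(r["Data"]) for r in records]


def read_shard(client, stream_name, shard_id="shard-000000"):
    """Read one shard from the oldest retained record to the current end."""
    iterator = client.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    events = []
    while iterator:
        response = client.get_records(ShardIterator=iterator, Limit=1000)
        events.extend(decode_records(response["Records"]))
        if not response["Records"]:
            break  # caught up; a long-running consumer would keep polling
        iterator = response.get("NextShardIterator")
    return events
```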

Configuration

To set up data input to storage systems:

  1. Create a Data Streams data stream.

  2. Configure the AWS SDK.

  3. Set up Yandex Data Transfer to transfer data to the selected storage system.

    An example of setting up data delivery from Data Streams is given in the tutorial on how to save data to ClickHouse®.

  4. Connect an arbitrary data processing function to Yandex Data Transfer. This GitHub example illustrates the function code. Alternatively, you can use the SDK to read data directly from Data Streams:

    • Go
    • C++
    • Java
    • JavaScript
    • Python
    • HTTP Kinesis Data Streams API

ClickHouse® is a registered trademark of ClickHouse, Inc.

Yandex project
© 2025 Yandex.Cloud LLC