Sending data to a stream in the AWS SDK
Updated at January 30, 2024
Note
You can create a trigger that will launch a function in Cloud Functions or a container in Serverless Containers when data is sent to the stream. Read more about triggers for Data Streams.
Python
To send data to a data stream, use the put_record / put_records method. When you invoke this method, specify the following parameters:

- Name of the stream, e.g., `example-stream`.
- ID of the cloud the stream is located in, e.g., `b1gi1kuj2dht********`.
- YDB database ID with the stream, e.g., `cc8028jgtuab********`.
- Data being sent, e.g., `message`.
You also need to configure the AWS SDK and assign the `yds.writer` role to the service account.
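boto3 reads static credentials from standard AWS environment variables, so one way to wire in a service account key is the sketch below. The key values are placeholders, and the `ru-central1` region is an assumption; exporting the same variables in the shell or running `aws configure` works equally well.

```python
import os

# boto3 picks these variables up automatically when a client is created.
# Use the ID and secret of a static access key issued for the service
# account that holds the yds.writer role (placeholder values below).
os.environ['AWS_ACCESS_KEY_ID'] = '<access key ID>'
os.environ['AWS_SECRET_ACCESS_KEY'] = '<secret key>'
os.environ['AWS_DEFAULT_REGION'] = 'ru-central1'
```

Set these before the first `boto3.client(...)` call; credentials passed this way take effect for every client created in the process.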
To send data to the stream with the parameters specified above:
- Create the `stream_put_record.py` file and copy the following code into it:

  ```python
  import boto3
  from pprint import pprint

  def put_record(cloud, database, stream_name, message):
      client = boto3.client(
          'kinesis',
          endpoint_url="https://yds.serverless.yandexcloud.net")
      response = client.put_record(
          StreamName="/ru-central1/{cloud}/{database}/{stream}".format(
              cloud=cloud, database=database, stream=stream_name),
          Data=message,
          PartitionKey=message)
      return response

  if __name__ == '__main__':
      put_record_response = put_record(
          cloud="b1gi1kuj2dht********",
          database="cc8028jgtuab********",
          stream_name="example-stream",
          message="message")
      print("The record has been sent successfully")
      pprint(put_record_response)
  ```
- Run the program:

  ```bash
  python3 stream_put_record.py
  ```

  Result:

  ```text
  The record has been sent successfully
  {'EncryptionType': 'NONE',
   'ResponseMetadata': {'HTTPHeaders': {'connection': 'keep-alive',
                                        'content-length': '81',
                                        'content-type': 'application/json',
                                        'date': '... GMT',
                                        'server': 'nginx/1.19.5'},
                        'HTTPStatusCode': 200,
                        'RetryAttempts': 0},
   'SequenceNumber': '0',
   'ShardId': 'shard-000000'}
  ```
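The section mentions `put_records` as well; for sending several records in one call, a batch variant might look like the sketch below. The `stream_path` helper and the per-record `PartitionKey` choice (here simply the message itself, as in the example above) are assumptions for illustration, not part of the original.

```python
def stream_path(cloud, database, stream_name):
    # Data Streams exposes a stream to the Kinesis API under this path.
    return "/ru-central1/{cloud}/{database}/{stream}".format(
        cloud=cloud, database=database, stream=stream_name)

def put_records(cloud, database, stream_name, messages):
    # Imported here so stream_path stays usable without boto3 installed.
    import boto3
    client = boto3.client(
        'kinesis',
        endpoint_url="https://yds.serverless.yandexcloud.net")
    # Each entry carries its own Data and PartitionKey; the Kinesis
    # PutRecords API accepts up to 500 entries per call.
    records = [{'Data': m, 'PartitionKey': m} for m in messages]
    return client.put_records(
        StreamName=stream_path(cloud, database, stream_name),
        Records=records)
```

Call it with the same identifiers as above, e.g. `put_records(cloud="b1gi1kuj2dht********", database="cc8028jgtuab********", stream_name="example-stream", messages=["message-1", "message-2"])`, and check `FailedRecordCount` in the response: `0` means every record was accepted.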