
AWS SDK for JavaScript

Written by Yandex Cloud
Updated on April 1, 2025
  • Getting started
  • Configuring a project
    • Preparing authentication data
    • Preparing a project directory
  • Code examples

The AWS SDK for JavaScript is a software development kit for working with AWS services; it is also compatible with Yandex Object Storage.

In this guide, you will use the AWS SDK for JavaScript in Node.js to create a bucket, upload objects to it, get a list of the objects, download a single object, empty the bucket, and delete it.

Getting started

  1. Create a service account.

  2. Assign the service account the roles required for your project, e.g., storage.editor for a particular bucket (to work with that bucket only) or for a folder (to work with all buckets in that folder). For more information about roles, see Access management with Yandex Identity and Access Management.

    To work with objects in an encrypted bucket, a user or service account must have the following roles for the encryption key in addition to the storage.configurer role:

    • kms.keys.encrypter: To read the key, encrypt and upload objects.
    • kms.keys.decrypter: To read the key, decrypt and download objects.
    • kms.keys.encrypterDecrypter: This role includes the kms.keys.encrypter and kms.keys.decrypter permissions.

    For more information, see Key Management Service service roles.

  3. Create a static access key.

    As a result, you will get the static access key data. To authenticate in Object Storage, you will need the following:

    • key_id: Static access key ID
    • secret: Secret key

    Save key_id and secret: you will not be able to retrieve the secret key value again.

Note

A service account can only list the buckets in the folder it was created in.

A service account can work with objects in buckets created in other folders. To enable this, assign the service account the appropriate roles for that folder or for a specific bucket in it.

Configuring a project

Preparing authentication data

  1. Create a directory to store the authentication data in and navigate to it:

    For macOS and Linux:

    mkdir ~/.aws/
    

    For Windows:

    mkdir C:\Users\<username>\.aws\
    
  2. In the .aws directory, create a file named credentials and paste the credentials you obtained earlier into it:

    [default]
    aws_access_key_id = <static_key_ID>
    aws_secret_access_key = <secret_key>
    
  3. In the same directory, create a file named config with the default region and endpoint settings:

    [default]
    region = ru-central1
    endpoint_url = https://storage.yandexcloud.net
    

    Note

    Some applications designed for Amazon S3 do not let you specify a custom region, which is why Object Storage also accepts the us-east-1 value.

To access Object Storage, use the https://storage.yandexcloud.net endpoint.
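As a sketch, the two files above can also be generated programmatically. The snippet below is illustrative only: it writes to a temporary directory rather than the real ~/.aws directory, and the key values are placeholders.

```javascript
import { mkdtempSync, writeFileSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

// Illustrative: a real setup would target ~/.aws (or C:\Users\<username>\.aws\);
// a temp directory is used here so the example is safe to run.
const awsDir = mkdtempSync(join(tmpdir(), "aws-"));

// The credentials file from step 2 (placeholder values).
writeFileSync(join(awsDir, "credentials"), [
  "[default]",
  "aws_access_key_id = <static_key_ID>",
  "aws_secret_access_key = <secret_key>",
  "",
].join("\n"));

// The config file from step 3.
writeFileSync(join(awsDir, "config"), [
  "[default]",
  "region = ru-central1",
  "endpoint_url = https://storage.yandexcloud.net",
  "",
].join("\n"));

const creds = readFileSync(join(awsDir, "credentials"), "utf8");
const config = readFileSync(join(awsDir, "config"), "utf8");
console.log(config.includes("endpoint_url = https://storage.yandexcloud.net")); // true
```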

Preparing a project directory

  1. Install Node.js.

  2. Create a directory for the code example and navigate to it:

    mkdir app
    cd app
    
  3. Initialize the Node.js project and use the command below to install the @aws-sdk/client-s3 library:

    npm init -y && npm i @aws-sdk/client-s3
    
  4. The previous command creates a package.json file with the project's basic Node.js settings in the directory. Add the "type": "module" line to it to use the ESM (ECMAScript Modules) syntax in the project.

    The final package.json file will appear as follows:

    {
        "name": "check",
        "version": "1.0.0",
        "main": "index.js",
        "scripts": {
            "test": "echo \"Error: no test specified\" && exit 1"
        },
        "keywords": [],
        "author": "",
        "license": "ISC",
        "description": "",
        "dependencies": {
            "@aws-sdk/client-s3": "^3.726.1"
        },
        "type": "module"
    }
    
  5. Create a file named index.js to store the code using the AWS SDK.
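Before moving on to the main example, you can sanity-check the project configuration. The helper below is hypothetical (it is not part of the SDK); it only checks that a package.json enables ESM and lists the @aws-sdk/client-s3 dependency:

```javascript
// Hypothetical helper: given package.json contents as a string, check that
// the project is set up for ESM and has the SDK listed as a dependency.
function checkProject(pkgJsonText) {
  const pkg = JSON.parse(pkgJsonText);
  return pkg.type === "module" && "@aws-sdk/client-s3" in (pkg.dependencies ?? {});
}

// A sample matching the package.json shown above.
const sample = JSON.stringify({
  type: "module",
  dependencies: { "@aws-sdk/client-s3": "^3.726.1" },
});
console.log(checkProject(sample)); // true
```

In a real project you would call it with `readFileSync("package.json", "utf8")` from the project directory.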

Code examples

Below we describe how to perform basic operations with a bucket using the AWS SDK for JavaScript in Node.js.

  1. Add the following code to index.js:

    import { readFileSync } from "node:fs";
    import {
        S3Client,
        PutObjectCommand,
        CreateBucketCommand,
        DeleteObjectCommand,
        DeleteBucketCommand,
        paginateListObjectsV2,
        GetObjectCommand,
        ListObjectsV2Command,
    } from "@aws-sdk/client-s3";
    
    (async function ()
    {
        // Creating an S3 client to interact with Object Storage.
        // Authentication data is taken from your environment, but you can specify it explicitly, e.g.:
        // `new S3Client({ region: 'ru-central1', credentials: {...} })`
        const s3Client = new S3Client({});
    
        const bucketName = `test-bucket-${Date.now()}`;
        // Creating a new bucket
        console.log(`Creating the bucket ${bucketName}.`);
        await s3Client.send(
            new CreateBucketCommand({
                Bucket: bucketName,
            }),
        );
        console.log(`The bucket ${bucketName} was created.\n\n`);
    
        // Uploading objects to a bucket
        // From a string
        console.log('Creating an object from a string.');
        await s3Client.send(
            new PutObjectCommand({
                Bucket: bucketName,
                Key: "bucket-text",
                Body: 'Hello bucket!',
            }),
        );
        console.log('The object from the string was created.\n');
        // From files
        console.log('Creating the first object from a local file.');
        await s3Client.send(
            new PutObjectCommand({
                Bucket: bucketName,
                Key: "my-package.json",
                Body: readFileSync('package.json'),
            }),
        );
        console.log('The first object was created.\nCreating the second object from a local file.');
        await s3Client.send(
            new PutObjectCommand({
                Bucket: bucketName,
                Key: "my-package-lock.json",
                Body: readFileSync('package-lock.json'),
            }),
        );
        console.log('The second object was created.\n');
    
        // Getting a list of objects
        console.log('Getting bucket objects list.');
        const command = new ListObjectsV2Command({ Bucket: bucketName });
        const { Contents } = await s3Client.send(command);
        const contentsList = Contents.map((c) => ` • ${c.Key}`).join("\n");
        console.log("Here's a list of files in the bucket:");
        console.log(`${contentsList}\n`);
    
        // Deleting multiple objects
        console.log('Deleting objects.');
        await s3Client.send(
            new DeleteObjectCommand({ Bucket: bucketName, Key: "my-package.json" }),
        );
        await s3Client.send(
            new DeleteObjectCommand({ Bucket: bucketName, Key: "my-package-lock.json" }),
        );
        console.log('The objects were deleted.\n');
    
        // Getting an object
        console.log('Getting your "bucket-text" object');
        const { Body } = await s3Client.send(
            new GetObjectCommand({
                Bucket: bucketName,
                Key: "bucket-text",
            }),
        );
        console.log('Your "bucket-text" content:');
        console.log(await Body.transformToString(), '\n');
    
        // Deleting bucket objects and the bucket itself
        // Getting a list of objects page by page
        const paginator = paginateListObjectsV2(
            { client: s3Client },
            { Bucket: bucketName },
        );
        for await (const page of paginator)
        {
            const objects = page.Contents;
            if (objects)
            {
                // Running the delete command for each object by iterating pages with objects
                for (const object of objects)
                {
                    // Sending the delete command
                    await s3Client.send(
                        new DeleteObjectCommand({ Bucket: bucketName, Key: object.Key }),
                    );
                }
            }
        }
    
        // Deleting the previously created bucket
        await s3Client.send(new DeleteBucketCommand({ Bucket: bucketName }));
        console.log('Your bucket was emptied and deleted.');
    })()
    

    In this code snippet, the script is wrapped in an IIFE (Immediately Invoked Function Expression) so the asynchronous operations run as soon as the file is executed.
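Note that the Body field in the GetObjectCommand response is a readable stream, and transformToString() collects it into a string. The same idea can be illustrated with plain Node.js streams, no network involved (top-level await works here because the project uses "type": "module"):

```javascript
import { Readable } from "node:stream";
import { text } from "node:stream/consumers";

// A readable stream standing in for the Body of a GetObjectCommand response.
const body = Readable.from(["Hello ", "bucket!"]);

// text() concatenates all chunks into a single string,
// much like Body.transformToString() does in the SDK.
const content = await text(body);
console.log(content); // prints: Hello bucket!
```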

  2. Run the application:

    node index.js
    

    In the console output, you will see a step-by-step description of the operation results.

Learn more about using the AWS SDK for JavaScript in the AWS documentation.

Yandex project
© 2025 Yandex.Cloud LLC