AWS SDK for JavaScript
With the AWS SDK for JavaScript in Node.js, you will create a bucket, upload objects to it, get a list of objects, download an object, clear the bucket contents, and delete the bucket.
Getting started
- Create a service account.
- Assign the service account the roles required for your project, e.g., `storage.editor`. For more information about roles, see Access management with Yandex Identity and Access Management.
- Create a static access key.
Note
A service account can only view the list of buckets in the folder where it was created.
A service account can perform actions on objects in buckets located in other folders. To enable this, assign the service account the appropriate roles for that folder or its bucket.
Preparing a project
Preparing authentication data
- Create a directory to store the authentication data and navigate to it:

  For macOS and Linux:

  ```bash
  mkdir ~/.aws/
  ```

  For Windows:

  ```bash
  mkdir C:\Users\<username>\.aws\
  ```
- In the `.aws` directory, create a file named `credentials` with credentials for Object Storage and copy the following data into it:

  ```ini
  [default]
  aws_access_key_id = <static_key_ID>
  aws_secret_access_key = <secret_key>
  ```
- Create a file named `config` with the default region settings and copy the following information to it:

  ```ini
  [default]
  region = ru-central1
  endpoint_url = https://storage.yandexcloud.net
  ```
Note
Some apps designed to work with Amazon S3 do not allow you to specify the region; this is why Object Storage may also accept the `us-east-1` value.

To access Object Storage, use the `https://storage.yandexcloud.net` endpoint.
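You can also configure the client explicitly in code instead of relying on the shared `credentials` and `config` files. The snippet below is a minimal sketch of that approach, assuming the same static key, region, and endpoint values as above; `region`, `endpoint`, and `credentials` are standard `S3Client` constructor options.

```js
import { S3Client } from "@aws-sdk/client-s3";

// A minimal sketch: configure the client explicitly instead of using the ~/.aws files.
// Replace the placeholders with your static access key values.
const s3Client = new S3Client({
  region: "ru-central1",
  endpoint: "https://storage.yandexcloud.net",
  credentials: {
    accessKeyId: "<static_key_ID>",
    secretAccessKey: "<secret_key>",
  },
});
```

Keep in mind that hard-coding keys is only suitable for local experiments; otherwise, prefer the shared files or environment variables.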
Preparing a project directory
- Install Node.js.
- Create a directory for the code example and navigate to it:

  ```bash
  mkdir app
  cd app
  ```
- Initialize a Node.js project and use the command below to install the `@aws-sdk/client-s3` library:

  ```bash
  npm init -y && npm i @aws-sdk/client-s3
  ```

  A `package.json` file with the project's basic Node.js settings will be created in the directory.
- Add the `"type": "module"` line to the `package.json` file to use the ESM (ECMAScript Modules) syntax in the project.

  The resulting `package.json` file will look as follows:

  ```json
  {
    "name": "check",
    "version": "1.0.0",
    "main": "index.js",
    "scripts": {
      "test": "echo \"Error: no test specified\" && exit 1"
    },
    "keywords": [],
    "author": "",
    "license": "ISC",
    "description": "",
    "dependencies": {
      "@aws-sdk/client-s3": "^3.726.1"
    },
    "type": "module"
  }
  ```
- Create a file named `index.js` for the code using the AWS SDK.
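Before moving on to the full example, you can optionally check that the SDK and the authentication files are picked up correctly. The following is a minimal sketch, assuming the `credentials` and `config` files created earlier (including `endpoint_url`) are in place; it only prints the names of buckets available to the service account.

```js
import { S3Client, ListBucketsCommand } from "@aws-sdk/client-s3";

// A quick sanity check: the client reads credentials and the endpoint
// from the ~/.aws files created earlier.
const s3Client = new S3Client({});

const { Buckets } = await s3Client.send(new ListBucketsCommand({}));
console.log((Buckets ?? []).map((bucket) => bucket.Name));
```

If this prints a list (possibly empty) without errors, the project is set up correctly.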
Code snippets
Below we describe how to perform basic operations with a bucket using the AWS SDK for JavaScript in Node.js.
- Add the following code to `index.js`:

  ```js
  import { readFileSync } from "node:fs";
  import {
    S3Client,
    PutObjectCommand,
    CreateBucketCommand,
    DeleteObjectCommand,
    DeleteBucketCommand,
    paginateListObjectsV2,
    GetObjectCommand,
    ListObjectsV2Command,
  } from "@aws-sdk/client-s3";

  (async function () {
    // Creating an S3 client.
    // Authentication data is taken from your environment, but you can specify it explicitly, for example:
    // `new S3Client({ region: 'ru-central1', credentials: {...} })`
    const s3Client = new S3Client({});

    const bucketName = `test-bucket-${Date.now()}`;

    // Creating a new bucket
    console.log(`Creating the bucket ${bucketName}.`);
    await s3Client.send(
      new CreateBucketCommand({
        Bucket: bucketName,
      }),
    );
    console.log(`The bucket ${bucketName} was created.\n\n`);

    // Uploading objects into the bucket
    // From a string
    console.log('Creating an object from a string.');
    await s3Client.send(
      new PutObjectCommand({
        Bucket: bucketName,
        Key: "bucket-text",
        Body: 'Hello bucket!',
      }),
    );
    console.log('The object from the string was created.\n');

    // From local files
    console.log('Creating the first object from a local file.');
    await s3Client.send(
      new PutObjectCommand({
        Bucket: bucketName,
        Key: "my-package.json",
        Body: readFileSync('package.json'),
      }),
    );
    console.log('The first object was created.\nCreating the second object from a local file.');
    await s3Client.send(
      new PutObjectCommand({
        Bucket: bucketName,
        Key: "my-package-lock.json",
        Body: readFileSync('package-lock.json'),
      }),
    );
    console.log('The second object was created.\n');

    // Getting a list of objects
    console.log('Getting the list of bucket objects.');
    const command = new ListObjectsV2Command({ Bucket: bucketName });
    const { Contents } = await s3Client.send(command);
    const contentsList = Contents.map((c) => ` • ${c.Key}`).join("\n");
    console.log("Here's a list of files in the bucket:");
    console.log(`${contentsList}\n`);

    // Deleting multiple objects
    console.log('Deleting objects.');
    await s3Client.send(
      new DeleteObjectCommand({ Bucket: bucketName, Key: "my-package.json" }),
    );
    await s3Client.send(
      new DeleteObjectCommand({ Bucket: bucketName, Key: "my-package-lock.json" }),
    );
    console.log('The objects were deleted.\n');

    // Getting an object
    console.log('Getting your "bucket-text" object.');
    const { Body } = await s3Client.send(
      new GetObjectCommand({
        Bucket: bucketName,
        Key: "bucket-text",
      }),
    );
    console.log('Your "bucket-text" content:');
    console.log(await Body.transformToString(), '\n');

    // Deleting the bucket objects and the bucket itself
    // Getting the list of objects page by page
    const paginator = paginateListObjectsV2(
      { client: s3Client },
      { Bucket: bucketName },
    );
    for await (const page of paginator) {
      const objects = page.Contents;
      if (objects) {
        // Running the delete command for each object while iterating over the pages of objects
        for (const object of objects) {
          // Sending the delete command
          await s3Client.send(
            new DeleteObjectCommand({ Bucket: bucketName, Key: object.Key }),
          );
        }
      }
    }

    // Deleting the previously created bucket
    await s3Client.send(new DeleteBucketCommand({ Bucket: bucketName }));
    console.log('Your bucket was emptied and deleted.');
  })();
  ```
  In this code snippet, we added an IIFE (Immediately Invoked Function Expression) so that the script runs as soon as the file is executed.

- Run the application:

  ```bash
  node index.js
  ```

  In the console output, you will see a step-by-step description of the operation results.
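The main example reads the downloaded object into memory with `Body.transformToString()`. For larger objects, you may prefer to stream the response straight to disk instead. The sketch below illustrates this, assuming a bucket and the `bucket-text` key from the example above; the `downloaded.txt` file name is just an illustration.

```js
import { createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const s3Client = new S3Client({});

// Download an object and stream it into a local file
// instead of reading it fully into memory.
const { Body } = await s3Client.send(
  new GetObjectCommand({
    Bucket: "<your_bucket_name>", // replace with an existing bucket
    Key: "bucket-text",
  }),
);
await pipeline(Body, createWriteStream("downloaded.txt"));
```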
To learn more about using the AWS SDK for JavaScript, see the AWS documentation.