Backing up to Yandex Object Storage with AWS S3 Sync
In this tutorial, you will set up backup of local files to Yandex Object Storage using AWS S3 Sync.
AWS S3 Sync is a standard AWS CLI command for syncing content between a local folder and a bucket. This is a simple and reliable way to back up your files to S3-compatible cloud object storage.
Features of AWS S3 Sync:
- Requires no additional software other than the AWS CLI.
- Supported on Linux, Windows, and macOS.
- Easy setup and direct access to the S3 API.
- Simple automation with a task scheduler or scripts.
To configure backup with AWS S3 Sync:
- Get your cloud ready.
- Create a bucket.
- Create a service account.
- Create a static access key.
- Install the AWS CLI.
- Synchronize the local folder with the bucket.
If you no longer need the resources you created, delete them.
Getting started
Sign up for Yandex Cloud and create a billing account:
- Navigate to the management console and log in to Yandex Cloud or create a new account.
- On the Yandex Cloud Billing page, make sure you have a billing account linked and it has the `ACTIVE` or `TRIAL_ACTIVE` status. If you do not have a billing account, create one and link a cloud to it.

If you have an active billing account, you can navigate to the cloud page.

Learn more about clouds and folders here.
Required paid resources
The cost of maintaining the bucket includes fees for data storage and for operations with data (see Yandex Object Storage pricing).
Create a bucket
Note
To protect your backups from accidental file deletion, enable S3 bucket versioning. This way, deleted or overwritten files will be saved as previous versions you can restore if needed. For more information about S3 bucket versioning, see this guide.
Without versioning, you will not be able to restore files deleted from S3, even if previously copied.
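As a sketch, versioning can also be enabled from the command line through the S3 API once the bucket exists and the AWS CLI is configured; `<bucket_name>` is a placeholder for your bucket:

```shell
# Enable versioning on an existing bucket
aws s3api put-bucket-versioning \
  --endpoint-url=https://storage.yandexcloud.net \
  --bucket <bucket_name> \
  --versioning-configuration Status=Enabled

# Check the current versioning status
aws s3api get-bucket-versioning \
  --endpoint-url=https://storage.yandexcloud.net \
  --bucket <bucket_name>
```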
- In the management console, navigate to the relevant folder.
- Select Object Storage.
- Click Create bucket.
- Enter a name for the bucket according to the naming requirements.
- In the Read objects, Read object list, and Read settings fields, select With authorization.
- Click Create bucket.
- If you do not have the AWS CLI yet, install and configure it.
- Create a bucket by entering its name following the naming requirements:

  ```
  aws --endpoint-url=https://storage.yandexcloud.net \
    s3 mb s3://<bucket_name>
  ```

  Result:

  ```
  make_bucket: backup-bucket
  ```
Use the create REST API method for the Bucket resource, the BucketService/Create gRPC API call, or the create S3 API method.
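Whichever method you choose, you can check that the bucket was created by listing the buckets available to your access key (a sketch; assumes the AWS CLI is already installed and configured with a static key):

```shell
# List buckets through the Object Storage endpoint;
# the new bucket should appear in the output
aws --endpoint-url=https://storage.yandexcloud.net s3 ls
```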
Create a service account
Create a service account to be used for backups.
- In the management console, select Identity and Access Management.
- Click Create service account.
- In the Name field, specify `sa-backup-to-s3`.
- Click Add role and select the `storage.editor` role.
- Click Create.
If you do not have the Yandex Cloud CLI installed yet, install and initialize it.
By default, the CLI uses the folder specified when creating the profile. To change the default folder, use the yc config set folder-id <folder_ID> command. You can also set a different folder for any specific command using the --folder-name or --folder-id parameter.
- Create a service account:

  ```
  yc iam service-account create --name sa-backup-to-s3 \
    --folder-name <folder_name>
  ```

  Result:

  ```
  id: ajeab0cnib1p********
  folder_id: b0g12ga82bcv********
  created_at: "2025-10-03T09:44:35.989446Z"
  name: sa-backup-to-s3
  ```

- Assign the `storage.editor` role for the folder to the service account:

  ```
  yc resource-manager folder add-access-binding <folder_name> \
    --service-account-name sa-backup-to-s3 \
    --role storage.editor \
    --folder-name <folder_name>
  ```

  Result:

  ```
  effective_deltas:
    - action: ADD
      access_binding:
        role_id: storage.editor
        subject:
          id: ajeab0cnib1p********
          type: serviceAccount
  ```
- Create a service account named `sa-backup-to-s3` using the create REST API method for the ServiceAccount resource or the ServiceAccountService/Create gRPC API call.
- Assign the `storage.editor` role for the current folder to the service account using the setAccessBindings REST API method for the Folder resource or the FolderService/SetAccessBindings gRPC API call.
Note
To work with objects in an encrypted bucket, a user or service account must have the following roles for the encryption key in addition to the storage.configurer role:
- `kms.keys.encrypter`: To read the key, encrypt and upload objects.
- `kms.keys.decrypter`: To read the key, decrypt and download objects.
- `kms.keys.encrypterDecrypter`: This role includes the `kms.keys.encrypter` and `kms.keys.decrypter` permissions.
For more information, see Yandex Key Management Service service roles.
Create a static access key
- In the management console, select Identity and Access Management.
- In the left-hand panel, select Service accounts.
- Select the `sa-backup-to-s3` service account.
- In the top panel, click Create new key and select Create static access key.
- Enter a description for the key and click Create.
- Save the ID and secret key; you will need them later to configure the AWS CLI.
Alert
After you close this dialog, the key value will no longer be available.
- Run this command:

  ```
  yc iam access-key create \
    --service-account-name sa-backup-to-s3
  ```

  Where `--service-account-name` is the name of the service account you are creating the key for.

  Result:

  ```
  access_key:
    id: aje726ab18go********
    service_account_id: ajecikmc374i********
    created_at: "2024-11-28T14:16:44.936656476Z"
    key_id: YCAJEOmgIxyYa54LY********
  secret: YCMiEYFqczmjJQ2XCHMOenrp1s1-yva1********
  ```

- Save the ID (`key_id`) and secret key (`secret`); you will need them later to configure the AWS CLI.
To create an access key, use the create REST API method for the AccessKey resource or the AccessKeyService/Create gRPC API call.
Save the ID (`key_id`) and secret key (`secret`); you will need them later to configure the AWS CLI.
Install the AWS CLI
If you do not have the AWS CLI yet, install and configure it.
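As a sketch of the configuration step, you can store the static key in the default AWS CLI profile non-interactively with `aws configure set`. The `<key_id>` and `<secret>` placeholders are the values you saved earlier; `ru-central1` is assumed as the Object Storage region:

```shell
# Write the static access key to the default AWS CLI profile
aws configure set aws_access_key_id <key_id>
aws configure set aws_secret_access_key <secret>
aws configure set region ru-central1

# Verify access by listing buckets through the Object Storage endpoint
aws --endpoint-url=https://storage.yandexcloud.net s3 ls
```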
Synchronize the local folder with the bucket
To complete the backup setup, configure manual or automatic synchronization of the local folder with the bucket.
Manual synchronization
For a one-off synchronization, run this command:
```
aws s3 sync <local_folder_path> s3://<bucket_name> \
  --endpoint-url=https://storage.yandexcloud.net \
  --delete
```
Where:
- `--endpoint-url`: Object Storage endpoint.
- `--delete`: Flag to delete files from the bucket when they are deleted from the local folder.
This command copies the entire contents of your local folder to the S3 bucket, transferring only new and modified files.
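Before the first real run, it can be useful to preview what the command would copy or delete. The AWS CLI supports a `--dryrun` flag for this:

```shell
# Show the operations sync would perform without changing anything
aws s3 sync <local_folder_path> s3://<bucket_name> \
  --endpoint-url=https://storage.yandexcloud.net \
  --delete \
  --dryrun
```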
Tip
To avoid running the command manually each time, you can create a BAT file:

- Open Notepad and add the following contents:

  ```
  @echo off
  aws s3 sync "<local_folder_path>" s3://<bucket_name> ^
    --endpoint-url=https://storage.yandexcloud.net ^
    --delete
  ```

  Where `--delete` is a flag to delete files from the bucket when they are deleted from the local folder.

- Save the file, e.g., as `sync_to_s3.bat`.
- To synchronize folders, run the BAT file.
Automatic synchronization
To automatically synchronize your local folder with the bucket:
- Make sure the user who will schedule the cron job has access to the local folder.
- Open the current user's scheduler file by running this command:

  ```
  crontab -e
  ```

- Add a line to the file to trigger autosync, e.g., every 10 minutes:

  ```
  */10 * * * * aws s3 sync <local_folder_path> s3://<bucket_name> --endpoint-url=https://storage.yandexcloud.net --delete >> <log_file_path> 2>&1
  ```

  Where:
  - `--delete`: Flag to delete files from the bucket when they are deleted from the local folder.
  - `>> <log_file_path> 2>&1`: Optional output redirect for writing logs. Specify the full path.
Note
Specify the full absolute path to folders without using `~`, e.g., `/home/user/`.
The job will run at the specified frequency and synchronize the folders.
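Instead of putting the full command on the crontab line, you can wrap it in a small script and call that script from cron; this keeps the endpoint setting and logging in one place. A minimal sketch, with `<local_folder_path>`, `<bucket_name>`, and `<log_file_path>` as placeholders:

```shell
#!/bin/sh
# sync_to_s3.sh: one-way sync of a local folder to the bucket, with logging.
# Use absolute paths; cron jobs do not expand ~.
aws s3 sync <local_folder_path> s3://<bucket_name> \
  --endpoint-url=https://storage.yandexcloud.net \
  --delete >> <log_file_path> 2>&1
```

The crontab entry then reduces to, e.g., `*/10 * * * * /home/user/sync_to_s3.sh`.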
For automatic synchronization, set up a task in the Windows Task Scheduler:
- Open the Windows Task Scheduler:
  - Start Menu → Task Scheduler.
  - Or start it via Run → `taskschd.msc`.
- Click Create task....
- In the Actions tab, add a new action by specifying the absolute path to the executable script, e.g., a BAT file, under Program or script.
- In the Triggers tab, add a schedule.
- Click OK.
How to delete the resources you created
To stop paying for the resources you created:
- Delete the objects from the bucket.
- Delete the bucket.
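If you prefer the command line, both steps can be done with the AWS CLI (a sketch; note that with versioning enabled, the remaining object versions must also be removed before the bucket can be deleted):

```shell
# Delete all current objects from the bucket
aws s3 rm s3://<bucket_name> --recursive \
  --endpoint-url=https://storage.yandexcloud.net

# Delete the now-empty bucket
aws s3 rb s3://<bucket_name> \
  --endpoint-url=https://storage.yandexcloud.net
```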