Backing up to Yandex Object Storage with rclone
In this tutorial, you will set up backup of local files to Yandex Object Storage using rclone.
Rclone
Features of rclone:
- Connects directly to the S3 API so you do not need to mount a bucket.
- Supported on Linux, Windows, and macOS.
- Supports copy, sync, filtering, checks, and automation with scripts.
- Easy setup and integration with task schedulers.
To set up backup with rclone:
- Get your cloud ready.
- Create a bucket.
- Create a service account.
- Create a static access key.
- Set up your environment.
- Synchronize the local folder with the bucket.
If you no longer need the resources you created, delete them.
Getting started
Sign up for Yandex Cloud and create a billing account:
- Navigate to the management console and log in to Yandex Cloud or create a new account.
- On the Yandex Cloud Billing page, make sure you have a billing account linked and that its status is `ACTIVE` or `TRIAL_ACTIVE`. If you do not have a billing account, create one and link a cloud to it.

If you have an active billing account, you can navigate to the cloud page.

Learn more about clouds and folders here.
Required paid resources
The cost of this setup includes the fee for storing data in the bucket and for operations with that data (see Yandex Object Storage pricing).
Create a bucket
Note
To protect your backups from accidental file deletion, enable S3 bucket versioning. This way, deleted or overwritten files will be saved as previous versions you can restore if needed. For more information about S3 bucket versioning, see this guide.
Without versioning, you will not be able to restore files deleted from S3, even if previously copied.
- In the management console, navigate to the relevant folder.
- Select Object Storage.
- Click Create bucket.
- Enter a name for the bucket according to the naming requirements.
- In the Read objects, Read object list, and Read settings fields, select With authorization.
- Click Create bucket.
- If you do not have the AWS CLI yet, install and configure it.

- Create a bucket by entering its name following the naming requirements:

  ```bash
  aws --endpoint-url=https://storage.yandexcloud.net \
    s3 mb s3://<bucket_name>
  ```

  Result:

  ```text
  make_bucket: backup-bucket
  ```
Use the create REST API method for the Bucket resource, the BucketService/Create gRPC API call, or the create S3 API method.
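The note above recommends enabling bucket versioning; with the AWS CLI this is a single standard `s3api` call. Below is a minimal sketch: the bucket name follows this tutorial's example, and `AWS_BIN` is a hypothetical override variable added here for illustration, not part of the AWS CLI.

```shell
# Sketch: enable versioning so overwritten or deleted objects keep
# previous versions that you can restore later.
# AWS_BIN is a hypothetical override; by default the aws binary is used.
AWS_BIN="${AWS_BIN:-aws}"

enable_versioning() {
  # put-bucket-versioning is a standard AWS CLI s3api call; Yandex Object
  # Storage accepts it through its S3-compatible endpoint.
  "$AWS_BIN" --endpoint-url=https://storage.yandexcloud.net \
    s3api put-bucket-versioning \
    --bucket "$1" \
    --versioning-configuration Status=Enabled
}
```

After the bucket exists, you would call it as `enable_versioning backup-bucket`.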
Create a service account
Create a service account to be used for backups.
- In the management console, select Identity and Access Management.
- Click Create service account.
- In the Name field, specify `sa-backup-to-s3`.
- Click Add role and select the `storage.editor` role.
- Click Create.
If you do not have the Yandex Cloud CLI installed yet, install and initialize it.
By default, the CLI uses the folder specified when creating the profile. To change the default folder, use the yc config set folder-id <folder_ID> command. You can also set a different folder for any specific command using the --folder-name or --folder-id parameter.
- Create a service account:

  ```bash
  yc iam service-account create --name sa-backup-to-s3 \
    --folder-name <folder_name>
  ```

  Result:

  ```text
  id: ajeab0cnib1p********
  folder_id: b0g12ga82bcv********
  created_at: "2025-10-03T09:44:35.989446Z"
  name: sa-backup-to-s3
  ```

- Assign the `storage.editor` role for the folder to the service account:

  ```bash
  yc resource-manager folder add-access-binding <folder_name> \
    --service-account-name sa-backup-to-s3 \
    --role storage.editor
  ```

  Result:

  ```text
  effective_deltas:
  - action: ADD
    access_binding:
      role_id: storage.editor
      subject:
        id: ajeab0cnib1p********
        type: serviceAccount
  ```
- Create a service account named `sa-backup-to-s3`. Do it by using the create REST API method for the ServiceAccount resource or the ServiceAccountService/Create gRPC API call.
- Assign the `storage.editor` role for the current folder to the service account. Do it by using the setAccessBindings REST API method for the Folder resource or the FolderService/SetAccessBindings gRPC API call.
Note
To work with objects in an encrypted bucket, a user or service account must have the following roles for the encryption key in addition to the `storage.configurer` role:

- `kms.keys.encrypter`: To read the key, encrypt, and upload objects.
- `kms.keys.decrypter`: To read the key, decrypt, and download objects.
- `kms.keys.encrypterDecrypter`: This role includes the `kms.keys.encrypter` and `kms.keys.decrypter` permissions.
For more information, see Yandex Key Management Service service roles.
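If your bucket is KMS-encrypted, the role from the note above can also be granted with the yc CLI. A sketch, mirroring the `--service-account-name` usage from the CLI tab; the key identifier argument is a placeholder, and `YC_BIN` is a hypothetical override variable added here for illustration.

```shell
# Sketch: grant kms.keys.encrypterDecrypter on a specific KMS key to the
# backup service account (only needed when the bucket is KMS-encrypted).
# YC_BIN is a hypothetical override; by default the yc binary is used.
YC_BIN="${YC_BIN:-yc}"

grant_kms_role() {
  # add-access-binding assigns a role on the key to a subject.
  "$YC_BIN" kms symmetric-key add-access-binding "$1" \
    --role kms.keys.encrypterDecrypter \
    --service-account-name sa-backup-to-s3
}
```

You would call it with your key's ID or name, e.g., `grant_kms_role <key_ID>`.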
Create a static access key
- In the management console, select Identity and Access Management.
- In the left-hand panel, select Service accounts.
- Select the `sa-backup-to-s3` service account.
- In the top panel, click Create new key and select Create static access key.
- Enter a description for the key and click Create.
- Save the ID and secret key for later when you set up the connection to the bucket.

Alert

After you close this dialog, the key value will no longer be available.
- Run this command:

  ```bash
  yc iam access-key create \
    --service-account-name sa-backup-to-s3
  ```

  Where `--service-account-name` is the name of the service account you are creating the key for.

  Result:

  ```text
  access_key:
    id: aje726ab18go********
    service_account_id: ajecikmc374i********
    created_at: "2024-11-28T14:16:44.936656476Z"
    key_id: YCAJEOmgIxyYa54LY********
  secret: YCMiEYFqczmjJQ2XCHMOenrp1s1-yva1********
  ```

- Save the ID (`key_id`) and secret key (`secret`) for later when you set up the connection to the bucket.
To create an access key, use the create REST API method for the AccessKey resource or the AccessKeyService/Create gRPC API call.
Save the ID (`key_id`) and secret key (`secret`) for later when you set up the connection to the bucket.
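If you also use the AWS CLI tab from the bucket-creation step, the saved key pair is exactly what `aws configure` expects. A sketch using the standard `aws configure set` subcommand; the key values passed in are placeholders, and `AWS_BIN` is a hypothetical override variable added here for illustration.

```shell
# Sketch: store the static access key in the AWS CLI configuration so the
# aws commands in this tutorial can authenticate against Object Storage.
# AWS_BIN is a hypothetical override; by default the aws binary is used.
AWS_BIN="${AWS_BIN:-aws}"

configure_aws_cli() {
  # aws configure set writes to ~/.aws/credentials and ~/.aws/config.
  "$AWS_BIN" configure set aws_access_key_id "$1"      # key_id from the result
  "$AWS_BIN" configure set aws_secret_access_key "$2"  # secret from the result
  "$AWS_BIN" configure set region ru-central1
}
```

You would call it as `configure_aws_cli <key_id> <secret>`.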
Set up your environment
Install rclone
- Install the latest version of rclone using the following command:

  ```bash
  sudo -v ; curl https://rclone.org/install.sh | sudo bash
  ```

  Result:

  ```text
  ...
  rclone v1.71.1 has successfully installed.
  Now run "rclone config" for setup. Check https://rclone.org/docs/ for more details.
  ```

  For more information about the command, see this rclone guide.

- Make sure rclone is installed:

  ```bash
  rclone version
  ```

  Result:

  ```text
  rclone v1.71.1
  - os/version: ubuntu 24.04 (64 bit)
  - os/kernel: 6.14.0-29-generic (x86_64)
  - os/type: linux
  - os/arch: amd64
  - go/version: go1.25.1
  - go/linking: static
  - go/tags: none
  ```
- Download the rclone archive from the vendor website and unpack it to a local folder.

- Add the folder to the `PATH` variable to make the rclone CLI accessible from anywhere. To do this:

  - Click Start and type Change system environment variables in the Windows search bar.
  - Click Environment variables... at the bottom right.
  - In the window that opens, find the `PATH` parameter and click Edit.
  - Add your folder path to the list.
  - Click OK.

- Make sure rclone is installed:

  ```bash
  rclone version
  ```

  Result:

  ```text
  rclone v1.71.1
  - os/version: Microsoft Windows 10 Pro 22H2 (64 bit)
  - os/kernel: 10.0.19045.4046 (x86_64)
  - os/type: windows
  - os/arch: amd64
  - go/version: go1.25.1
  - go/linking: static
  - go/tags: cmount
  ```
Set up a connection to Object Storage
- Run a configuration session for rclone using this command:

  ```bash
  rclone config
  ```

- Follow the prompts to create a new connection profile:

  - Start creating a new profile by entering `n` in the terminal.
  - Enter a connection name, e.g., `yandex-s3`.
  - Select the storage type by entering `4` (Amazon S3 Compliant Storage) in the terminal.
  - Select the provider by entering `1` (Amazon Web Services S3) in the terminal.
  - Select manual entry of credentials by entering `1` in the terminal.
  - In the terminal, enter the secret key ID you got previously.
  - In the terminal, enter the secret key value you got previously.
  - Specify the region by entering `ru-central1` in the terminal.
  - Specify the endpoint by entering `storage.yandexcloud.net` in the terminal.
  - You can leave all other settings at their defaults by pressing Enter to skip them; at the last item, select `q` (Quit config).

  Note

  You can perform advanced connection setup if required. To do this, type `y` at the `Edit advanced config?` step. For more information about advanced settings, see this rclone guide.

  Also, you can configure connections by manually editing the `rclone.conf` configuration file. To get the configuration file location, run this command:

  ```bash
  rclone config file
  ```

- Check your connection to the bucket by running this command:

  ```bash
  rclone ls <connection_name>:<bucket_name>
  ```

  If your configuration is correct, you will see a list of objects in the bucket in your terminal.
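Once the session completes, the profile written to `rclone.conf` should look roughly like the fragment below. This is an illustrative sketch based on the answers given in the steps above; the key values are placeholders for the static key you created earlier.

```
[yandex-s3]
type = s3
provider = AWS
access_key_id = <static_key_ID>
secret_access_key = <secret_key>
region = ru-central1
endpoint = storage.yandexcloud.net
```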
Synchronize the local folder with the bucket
To complete the backup setup, configure manual or automatic synchronization of the local folder with the bucket.
Note
Files deleted from your local folder will also be deleted from the bucket. If you do not want to delete files from the bucket when they are deleted from the local folder, run the copy command instead of sync.
Manual synchronization
For a one-off synchronization, run this command:

```bash
rclone sync <local_folder_path> <connection_name>:<bucket_name>
```

Where `sync` creates an exact copy, including deletion of files from the bucket when they are deleted from the local folder. To copy files without deletion, use the `copy` command.
Tip
To avoid running the command manually each time, you can create a BAT file:

- Open Notepad and add the following contents:

  ```text
  @echo off
  rclone sync <local_folder_path> <connection_name>:<bucket_name>
  ```

  Where `sync` creates an exact copy, including deletion of files from the bucket when they are deleted from the local folder. To copy files without deletion, use the `copy` command.

- Save the file, e.g., as `sync_to_s3.bat`.

- To synchronize folders, run the BAT file.
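Before the first real run, it can help to preview what `sync` would change. A sketch using rclone's standard `--dry-run` and `--exclude` flags; the exclude pattern is an example, and `RCLONE_BIN` is a hypothetical override variable added here for illustration.

```shell
# Sketch: preview a sync without touching the bucket, skipping temp files.
# --dry-run and --exclude are standard rclone flags.
# RCLONE_BIN is a hypothetical override; by default the rclone binary is used.
RCLONE_BIN="${RCLONE_BIN:-rclone}"

preview_sync() {
  "$RCLONE_BIN" sync "$1" "$2" --dry-run --exclude '*.tmp'
}
```

You would call it as `preview_sync <local_folder_path> <connection_name>:<bucket_name>` and review the listed operations before running the real `sync`.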
Automatic synchronization
To automatically synchronize your local folder with the bucket:
- Make sure the user who will schedule the `cron` job has access to the local folder.

- Open the current user's scheduler file by running this command:

  ```bash
  crontab -e
  ```

- Add a line to the file to trigger autosync, e.g., every 10 minutes:

  ```text
  */10 * * * * rclone sync <local_folder_path> <connection_name>:<bucket_name> --log-file=<log_file_path>
  ```

  Where:

  - `sync`: Command for an exact copy, including deletion of files from the bucket when they are deleted from the local folder. To copy files without deletion, use the `copy` command.
  - `--log-file`: Optional parameter for writing logs. Specify the full path.

  Note

  Specify the full absolute path to folders without using `~`, e.g., `/home/user/`.
The job will run at the specified frequency and synchronize the folders.
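Cron does not prevent a slow sync from overlapping with the next scheduled run; a common pattern is to wrap the command in `flock`. A sketch, assuming a Linux host with util-linux `flock` available; the lock file path, log path, and `RCLONE_BIN` override are assumptions for illustration.

```shell
# Sketch: run the sync under an exclusive lock so an overlapping cron
# invocation exits immediately instead of racing the previous one.
# RCLONE_BIN is a hypothetical override; by default the rclone binary is used.
RCLONE_BIN="${RCLONE_BIN:-rclone}"

locked_sync() {
  # flock -n fails at once if a previous sync still holds the lock.
  flock -n /tmp/rclone-backup.lock \
    "$RCLONE_BIN" sync "$1" "$2" --log-file="$HOME/rclone-sync.log"
}
```

In the crontab line above, you would then call a small script that runs `locked_sync <local_folder_path> <connection_name>:<bucket_name>`.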
For auto sync, set up a task in the Windows Task Scheduler:

- Open the Task Scheduler:

  - Start menu → Task Scheduler.
  - Or start it from Run → `taskschd.msc`.

- Click Create task....

- In the Actions tab, add a new action by specifying the absolute path to the executable script, e.g., a BAT file, under Program or script.

- In the Triggers tab, add a schedule.

- Click OK.
How to delete the resources you created
To stop paying for the resources you created:
- Delete the objects from the bucket.
- Delete the bucket.