boto3 and boto
Getting started
- Create a service account.
- Assign the service account the roles required for your project. For more information about roles, see the Identity and Access Management documentation.
- Create a static access key (a sketch of passing the key to boto3 directly follows the note below).
Note
A service account is only allowed to view a list of buckets in the folder it was created in.
A service account can perform actions with objects in buckets that are created in folders different from the service account folder. To enable this, assign the service account roles for the appropriate folder or its bucket.
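Once you have the static key, you can either store it in configuration files (see Setup below) or pass it to boto3 explicitly. Below is a minimal sketch of the explicit variant, assuming the placeholders are replaced with the key ID and secret key you obtained:
import boto3
# Credentials are passed directly instead of being read from ~/.aws/credentials
session = boto3.session.Session(
    aws_access_key_id='<static_key_ID>',
    aws_secret_access_key='<secret_key>',
    region_name='ru-central1',
)
s3 = session.client(service_name='s3', endpoint_url='https://storage.yandexcloud.net')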
Installation
To install the library, follow the instructions in the developer's repository: boto3
Setup
- Go to the ~/.aws/ directory (for macOS and Linux) or C:\Users\<username>\.aws\ (for Windows).
- Create a file named credentials with the Object Storage credentials and copy the following information to it:
  [default]
  aws_access_key_id = <static_key_ID>
  aws_secret_access_key = <secret_key>
- Create a file named config with the default region settings and copy the following information to it (a quick check of the resulting setup follows this list):
  [default]
  region = ru-central1
  endpoint_url = https://storage.yandexcloud.net
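To check that boto3 picks up these files, you can print the resolved credentials and region; get_credentials() and region_name are standard boto3 Session attributes:
import boto3
session = boto3.session.Session()
credentials = session.get_credentials()
print(credentials.access_key)  # the static key ID from ~/.aws/credentials
print(session.region_name)     # ru-central1, from ~/.aws/config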
Note
Some applications designed to work with Amazon S3 do not let you specify the region; for compatibility, Object Storage also accepts the us-east-1 value.
To access Object Storage, use the https://storage.yandexcloud.net endpoint.
To use the library from a function in Cloud Functions, add the following environment variables to the function:
- AWS_ACCESS_KEY_ID: ID of the service account's static key.
- AWS_SECRET_ACCESS_KEY: Secret key.
- AWS_DEFAULT_REGION: Region ID (ru-central1).
To access Object Storage, use the storage.yandexcloud.net address.
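Inside the function, boto3 then picks up the key and region from these variables, so only the endpoint needs to be set explicitly. A minimal sketch of such a handler (handler(event, context) is the standard entry point signature for Python functions in Cloud Functions; this one lists the buckets available to the service account):
import boto3

def handler(event, context):
    # The key ID, secret key, and region are read from the AWS_* environment variables
    s3 = boto3.client('s3', endpoint_url='https://storage.yandexcloud.net')
    buckets = s3.list_buckets()['Buckets']
    return {'statusCode': 200, 'body': [bucket['Name'] for bucket in buckets]}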
Example
boto3:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import boto3
session = boto3.session.Session()
s3 = session.client(
    service_name='s3',
    endpoint_url='https://storage.yandexcloud.net'
)
# Creating a new bucket
s3.create_bucket(Bucket='bucket-name')
# Uploading objects into the bucket
## From a string
s3.put_object(Bucket='bucket-name', Key='object_name', Body='TEST', StorageClass='COLD')
## From a file
s3.upload_file('this_script.py', 'bucket-name', 'py_script.py')
s3.upload_file('this_script.py', 'bucket-name', 'script/py_script.py')
# Getting a list of objects in the bucket
for key in s3.list_objects(Bucket='bucket-name')['Contents']:
    print(key['Key'])
# Deleting multiple objects
forDeletion = [{'Key':'object_name'}, {'Key':'script/py_script.py'}]
response = s3.delete_objects(Bucket='bucket-name', Delete={'Objects': forDeletion})
# Retrieving an object
get_object_response = s3.get_object(Bucket='bucket-name', Key='py_script.py')
print(get_object_response['Body'].read())
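The same client can also issue presigned URLs that grant temporary access to an object without sharing the key. A short sketch using the standard generate_presigned_url method and the object uploaded above:
# Generating a link to py_script.py that stays valid for one hour
presigned_url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'bucket-name', 'Key': 'py_script.py'},
    ExpiresIn=3600
)
print(presigned_url)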
boto
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
from boto.s3.key import Key
from boto.s3.connection import S3Connection
os.environ['S3_USE_SIGV4'] = 'True'
conn = S3Connection(
    host='storage.yandexcloud.net'
)
conn.auth_region_name = 'ru-central1'
# Create a new bucket
conn.create_bucket('bucket-name')
bucket = conn.get_bucket('bucket-name')
# Uploading objects into the bucket
## From a string
bucket.new_key('test-string').set_contents_from_string('TEST')
## From a file
file_key_1 = Key(bucket)
file_key_1.key = 'py_script.py'
file_key_1.set_contents_from_filename('this_script.py')
file_key_2 = Key(bucket)
file_key_2.key = 'script/py_script.py'
file_key_2.set_contents_from_filename('this_script.py')
# Getting a list of objects in the bucket
keys_list = bucket.list()
for key in keys_list:
    print(key.key)
# Deleting multiple objects
response = bucket.delete_keys(['test-string', 'py_script.py'])
# Retrieving an object
key = bucket.get_key('script/py_script.py')
print(key.get_contents_as_string())
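A bucket must be empty before it can be removed. If you want to clean up after running this example, here is a minimal sketch using the same connection and the standard delete_keys and delete_bucket calls:
# Deleting all remaining objects, then the bucket itself
bucket.delete_keys([key.key for key in bucket.list()])
conn.delete_bucket('bucket-name')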
For an example, see this video conversion guide.