
Connecting to AWS S3 with Python

Hello readers, in this tutorial, we will perform some basic operations in AWS S3 using the boto3 library in the Python programming language.

1. What is AWS S3?

Amazon Simple Storage Service, commonly known as AWS S3, is an online object storage service. It is cheap, easy to set up, and the user only pays for what they use. It can be used for:

  • Hosting static web content and data, or even dynamic pages
  • Data storage for analytics
  • Backup and archival of data
  • Disaster recovery solutions

1.1 S3 Bucket

Amazon S3 has two primary entities, the bucket and the object, where objects are stored inside buckets. It provides high availability and durability by replicating a bucket's data across multiple data centers. Each AWS account allows up to 100 buckets by default; this limit can be increased by submitting a request to the support center. Amazon S3 features:

  • Each object in a bucket is identified by a unique key
  • S3 allows a developer to upload, delete, or read an object via the REST API
  • S3 offers read-after-write and eventual consistency models to ensure that every change committed to the system is visible to all participants
  • Objects stored in a bucket never leave its region unless the user transfers them out
  • Objects can be made private or public and rights can be granted to specific users

1.2 S3 Object

S3 objects are the basic entities stored in an Amazon S3 bucket; each object is a simple key-value store. An object in S3 consists of the following (a short sketch follows this list):

  • Key – Represents the name assigned to the object
  • VersionId – The key and version ID together uniquely identify an object. The version ID is a string that S3 generates when we add an object to a versioned bucket
  • Value – The content we are storing in the bucket, which can range up to 5 TB in size
  • Metadata – Represents the name-value pairs with which we can store information regarding the object
  • Access Control Information – Through this we control access to the objects stored in Amazon S3
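
To make these entities concrete, here is a minimal sketch (the bucket name my-example-bucket and the key are placeholders of mine, not part of the original tutorial) that uploads an object with user-defined metadata and then reads its attributes back:

import boto3

# low-level S3 client (assumes credentials are already configured)
client = boto3.client('s3')

bucket = 'my-example-bucket'  # hypothetical bucket name, replace with your own
key = 'notes/hello.txt'       # the object key (its name inside the bucket)

# the value (body) plus optional user-defined metadata (name-value pairs)
client.put_object(
    Bucket=bucket,
    Key=key,
    Body=b'Hello, S3!',
    Metadata={'author': 'tutorial'}
)

# head_object returns the object's attributes without downloading the body
head = client.head_object(Bucket=bucket, Key=key)
print(head['Metadata'])               # {'author': 'tutorial'}
print(head.get('VersionId', 'null'))  # present only if bucket versioning is enabled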

1.3 Setting up Python

If someone needs to go through the Python installation on Windows, please refer to this link. To start with this tutorial, I assume that readers already have Python installed.
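
You will also need the boto3 library. A quick sanity check (assuming boto3 was installed with pip, e.g. pip install boto3) is to print its version:

import boto3

# confirms that the boto3 SDK is importable and shows which version is installed
print(boto3.__version__)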

2. Connecting to AWS S3 using Python

I am using JetBrains PyCharm as my preferred IDE. Readers are free to use the IDE of their choice.

2.1 Application Pre-requisite

To proceed with this tutorial, we need an IAM user with programmatic (CLI) access. If someone needs to go through the process of creating an IAM user and attaching the AdministratorAccess policy, please watch this video. Readers are free to attach the AmazonS3FullAccess policy instead if they want to restrict the CLI user to the AWS S3 service only.

2.2 S3 Operations using Python

Add the CLI user's aws_access_key_id, aws_secret_access_key, and region to the AWS credentials file before performing these operations. In our case, we have selected ap-south-1 as the default region in which we will perform these operations.
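
If you prefer not to rely on the shared credentials file, boto3 also lets you supply these values explicitly through a session. A minimal sketch (with placeholder key values that you must replace) looks like this:

import boto3

# a session carrying explicit credentials and region (placeholders, replace with your own)
session = boto3.session.Session(
    aws_access_key_id='YOUR_ACCESS_KEY_ID',
    aws_secret_access_key='YOUR_SECRET_ACCESS_KEY',
    region_name='ap-south-1'
)

# clients created from this session use the credentials above
client = session.client('s3')
print(client.list_buckets()['Buckets'])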

2.2.1 List S3 Buckets

To perform the list buckets operation we will use the following Python script.

list_buckets_s3.py

import boto3
 
# getting the s3 client
# the client provides a low-level interface to the S3 service
client = boto3.client('s3')
 
response = client.list_buckets()
 
for bucket in response['Buckets']:
    print('Bucket name= {}'.format(bucket['Name']))
    print('Creation date= {}'.format(bucket['CreationDate']))
    print('============')

If everything goes well, the following output will be shown in the IDE console.

Console logs

Bucket name= aws-javahome-json
Creation date= 2020-09-12 18:15:09+00:00
============
Bucket name= awstraining-june2020-s3-website-hosting-assignment
Creation date= 2020-07-25 13:51:08+00:00
============
Bucket name= boto3-s3-bucket-2020
Creation date= 2020-09-15 10:57:09+00:00
============
Bucket name= elasticbeanstalk-us-east-2-976836360448
Creation date= 2020-03-21 11:28:45+00:00
============
Bucket name= s3-aws-training-bucket-2020
Creation date= 2020-06-23 18:01:39+00:00
============

2.2.2 Create S3 Bucket

To create the S3 bucket we will use the following Python script.

create_bucket.py

import boto3
 
# getting the s3 client
# the client provides a low-level interface to the S3 service
client = boto3.client('s3')
 
response = client.create_bucket(
    ACL='private',
    Bucket='boto3-s3-bucket-2020',
    CreateBucketConfiguration={
        'LocationConstraint': 'ap-south-1'
    }
)
 
print(response)

If everything goes well, the S3 bucket named boto3-s3-bucket-2020 will be created in the ap-south-1 region. The same can be verified via the S3 console as shown in Fig. 1.

Fig. 1: Creation of S3 bucket
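
The creation can also be verified programmatically. A minimal sketch using boto3's built-in bucket_exists waiter (my addition, not part of the original scripts) is shown below:

import boto3

client = boto3.client('s3')

# the bucket_exists waiter polls head_bucket until the bucket is reachable (or times out)
waiter = client.get_waiter('bucket_exists')
waiter.wait(Bucket='boto3-s3-bucket-2020')
print('Bucket boto3-s3-bucket-2020 is available.')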

2.2.3 Upload file to S3 object

To perform the put object operation we will use the following Python script.

upload_file.py

import boto3
 
# getting the s3 client
# the client provides a low-level interface to the S3 service
client = boto3.client('s3')
 
file_name = 'create_bucket.py'
# read the file contents in binary mode and close the handle automatically
with open(file_name, 'rb') as file_obj:
    file_body = file_obj.read()
 
response = client.put_object(
    ACL='private',
    Bucket='boto3-s3-bucket-2020',
    Key=file_name,
    Body=file_body
)
 
# print(response)
if response['ResponseMetadata']['HTTPStatusCode'] == 200:
    print('{} uploaded successfully to the s3 bucket.'.format(file_name))
else:
    print('{} could not be uploaded.'.format(file_name))

If everything goes well, the following output will be shown in the IDE console.

Console logs

create_bucket.py uploaded successfully to the s3 bucket.
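
For larger files, boto3 also provides the higher-level upload_file transfer method, which reads the file from disk itself and handles multipart uploads automatically. A minimal sketch (same bucket as above) is:

import boto3

client = boto3.client('s3')

# upload_file streams the file and splits large files into multipart uploads
client.upload_file(
    Filename='create_bucket.py',
    Bucket='boto3-s3-bucket-2020',
    Key='create_bucket.py'
)
print('create_bucket.py uploaded via upload_file.')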

2.2.4 List S3 Bucket objects

To perform the list objects operation we will use the following Python script.

list_objects_s3.py

import boto3
 
# getting the s3 client
# the client provides a low-level interface to the S3 service
client = boto3.client('s3')
 
response = client.list_objects(
    Bucket='boto3-s3-bucket-2020'
)
 
# print(response)
 
if 'Contents' in response:
    print('Printing the s3 bucket contents.\n')
    for content in response['Contents']:
        print(content['Key'])
else:
    print('s3 bucket is empty.')

If everything goes well, the following output will be shown in the IDE console.

Console logs

Printing the s3 bucket contents.
 
2weh5y8rFtE.jpg
3AN011g6dk0.jpg
QZT1WVWS9nk.jpg
Vv7ZwB5CYws.jpg
agon9FwjDxM.jpg
create_bucket.py
data.json
file-sample_150kB.pdf
file_example_CSV_5000.csv
files/
files/employees.csv
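
Note that list_objects returns at most 1,000 keys per call. For larger buckets, the usual approach is a paginator over list_objects_v2; a minimal sketch (same bucket name) is shown below:

import boto3

client = boto3.client('s3')

# the paginator transparently issues follow-up requests when the bucket holds more than 1,000 keys
paginator = client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='boto3-s3-bucket-2020'):
    for content in page.get('Contents', []):
        print(content['Key'])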

2.2.5 Delete S3 Bucket object

To perform the delete object operation we will use the following Python script.

delete_object_s3.py

import boto3
 
# getting the s3 client
# the client provides a low-level interface to the S3 service
client = boto3.client('s3')
 
file_name = 'create_bucket.py'
response = client.delete_object(
    Bucket='boto3-s3-bucket-2020',
    Key=file_name,
)
 
# print(response)
# a successful delete_object call returns HTTP 204 (No Content), not 200
if response['ResponseMetadata']['HTTPStatusCode'] == 204:
    print('{} deleted successfully from the s3 bucket.'.format(file_name))
else:
    print('{} could not be deleted.'.format(file_name))

If everything goes well, the following output will be shown in the IDE console.

Console logs

create_bucket.py deleted successfully from the s3 bucket.

In case the item could not be deleted, the message "create_bucket.py could not be deleted." will be shown in the IDE console. That is all for this tutorial, and I hope the article gave you what you were looking for. Happy learning, and do not forget to share!
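
As a final clean-up step (not covered in the original scripts above), you can remove the remaining objects and then delete the now-empty bucket. A minimal sketch is:

import boto3

client = boto3.client('s3')
bucket = 'boto3-s3-bucket-2020'

# delete every remaining object (delete_bucket only works on an empty bucket)
response = client.list_objects_v2(Bucket=bucket)
for content in response.get('Contents', []):
    client.delete_object(Bucket=bucket, Key=content['Key'])

# now the empty bucket itself can be removed
client.delete_bucket(Bucket=bucket)
print('Bucket {} deleted.'.format(bucket))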

3. Summary

In this tutorial, we learned:

  • An introduction to AWS S3
  • Python scripts to perform the basic operations in the S3
  • Readers interested in further details can refer to the boto3 documentation for S3, available at this link

You can download the source code of this tutorial from the Downloads section.

4. Download the Project

This was an example of interacting with AWS S3 using Python programming language.

Download
You can download the full source code of this example here: Connecting to AWS S3 with Python

Yatin

An experienced full-stack engineer well versed in Core Java, Spring/Spring Boot, MVC, Security, AOP, frontend (Angular & React), and cloud technologies (such as AWS, GCP, Jenkins, Docker, K8s).