
How to Create AWS S3 Bucket using Terraform

Hello. In this tutorial, we will explain Terraform, a popular open-source infrastructure automation tool. We will also create an S3 bucket on AWS using Terraform.

1. Introduction

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. It is used to manage the infrastructure of popular cloud service providers as well as custom in-house solutions. It helps manage both low-level components (compute, storage, networking, etc.) and high-level components (such as SaaS offerings, DNS, etc.). Terraform's approach to deployment automation rests on a few key concepts, i.e. –

  • IaC – Infrastructure as Code, wherein the infrastructure is described using a high-level configuration syntax. This gives a blueprint of the infrastructure that can be deployed, versioned, and shared for re-use
  • Execution Plans – Terraform has a planning step where it generates an execution plan. The execution plan tells the administrator what Terraform will do once applied and helps avoid surprises when it creates the infrastructure
  • Resource Graph – Terraform builds a graph of all the resources and parallelizes the creation and modification of non-dependent resources. This offers insight into the dependencies within the infrastructure
  • Change Automation – Terraform allows complex changesets to be applied to the infrastructure with minimal human intervention

1.1 Configuration language

Terraform has its own configuration language designed to meet infrastructure automation requirements. The main purpose of this language is to declare resources; a group of resources (gathered into a module) represents a larger unit of configuration. The language syntax consists of a few elements, i.e. Blocks, Arguments, and Expressions, illustrated in the snippet after this list.

  • Blocks – Containers for other content; a block represents the configuration of an object
  • Arguments – Assign a value to a name and appear within blocks
  • Expressions – Represent a single value, either literally or by referencing and combining other values
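
As a minimal illustration (the resource type, AMI ID, and variable name here are hypothetical placeholders), the following snippet shows all three elements together:

# block: a "resource" block type with two labels
resource "aws_instance" "example" {
  ami           = "ami-0abcdef1234567890" # argument: the name "ami" assigned a literal expression
  instance_type = var.instance_type       # argument: the value is a variable reference expression
}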

1.2 Steps

To create infrastructure via Terraform scripts, the following commands need to be executed.

  • terraform init – Initializes a new or existing Terraform configuration
  • terraform plan – Generates the execution plan from the resources specified in the files
  • terraform apply – Creates the infrastructure from the resources specified in the files
  • terraform destroy – Destroys the created infrastructure

2. Practice

Let us dive into the practical part from here. You’re free to choose the IDE of your choice. I am using Visual Studio Code as my preferred IDE for development, with the HashiCorp Terraform extension installed. The extension offers syntax highlighting and other editing features for Terraform files using the Terraform language server.

2.1 Prerequisites

To proceed, we will need an AWS CLI user with the right set of permissions for creating the infrastructure. I am using an existing user with a full-access policy attached (for this tutorial, a policy granting full S3 access, such as the AWS-managed AmazonS3FullAccess policy, is sufficient) so that the user can successfully create the required infrastructure. The access and secret keys generated for the user will be used in the variables.tf file.
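
If you prefer the command line, a policy can be attached to an existing user as shown below (the user name demo-user is a placeholder; adjust it to your own):

# attach the AWS-managed S3 full-access policy to the CLI user
aws iam attach-user-policy \
  --user-name demo-user \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess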

2.2 Creating an S3 module

Create a module named bucketcreation and add to it the files required for creating the bucket via Terraform code.

2.2.1 Variables File

The file contains the variable declarations required to create the S3 bucket. Add the following code to the file, containing information related to the bucket name, versioning, ACL, etc. You’re free to change the values as per your need.

var.tf

variable "bucket_name" {
  type        = string
  description = "specify the bucket name. should be unique and does not contain underscore or upper-case letters"
  default     = "mybucket"
}
variable "acl" {
  type        = string
  description = "defaults to private"
  default     = "private"
}
variable "versioning" {
  type        = bool
  description = "enable versioning or not"
  default     = false
}
variable "tags" {
  type        = map(string)
  description = "mapping of tags to assign to the bucket"
  default = {
    terraform = "true"
  }
}
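
Optionally, assuming Terraform 0.13 or newer, the naming rule mentioned in the description can be enforced at plan time with a validation block; a minimal sketch:

variable "bucket_name" {
  type        = string
  description = "specify the bucket name. should be unique and does not contain underscore or upper-case letters"
  default     = "mybucket"
  # reject upper-case letters, underscores, and other disallowed characters
  validation {
    condition     = can(regex("^[a-z0-9][a-z0-9.-]*$", var.bucket_name))
    error_message = "Bucket name must contain only lowercase letters, digits, hyphens, and dots."
  }
}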

2.2.2 Bucket File

The file contains the resource that will be used to create the S3 bucket. The file gets the variable values from the var.tf file. Add the following code to it.

bucket.tf

resource "random_uuid" "uuid" {}

resource "aws_s3_bucket" "demos3" {
  # as the bucket name is universally global. append the uuid to avoid the duplicate error while creation
  bucket = format("%s-%s", var.bucket_name, random_uuid.uuid.result)
  acl    = var.acl
  versioning {
    enabled = var.versioning
  }
  tags          = var.tags
  force_destroy = true
}
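
Note that the acl argument and the inline versioning block used above are deprecated in version 4 and later of the AWS provider, where they are split into dedicated resources. The following is an equivalent sketch under that assumption; the aws_s3_bucket_ownership_controls resource is included because AWS disables ACLs on new buckets by default (since April 2023):

resource "aws_s3_bucket" "demos3" {
  bucket        = format("%s-%s", var.bucket_name, random_uuid.uuid.result)
  tags          = var.tags
  force_destroy = true
}

# allow ACLs to be applied to the bucket
resource "aws_s3_bucket_ownership_controls" "demos3" {
  bucket = aws_s3_bucket.demos3.id
  rule {
    object_ownership = "ObjectWriter"
  }
}

resource "aws_s3_bucket_acl" "demos3" {
  depends_on = [aws_s3_bucket_ownership_controls.demos3]
  bucket     = aws_s3_bucket.demos3.id
  acl        = var.acl
}

resource "aws_s3_bucket_versioning" "demos3" {
  bucket = aws_s3_bucket.demos3.id
  versioning_configuration {
    status = var.versioning ? "Enabled" : "Suspended"
  }
}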

2.3 Variables File

The file contains the declarations to be used across the Terraform configuration. Add the following code to the file, containing information related to the region and the CLI user access keys. You’re free to change the values as per your need.

variables.tf

variable "cli_usr_access_key" {
  type    = string
  default = "usr_access_key" # specify the access key
}
variable "cli_usr_secret_key" {
  type    = string
  default = "usr_secret_key" # specify the secret key
}
variable "selected_region" {
  type    = string
  default = "ap-south-1" # specify the aws region
}
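
Hardcoding keys as defaults is convenient for a demo but risky in practice. As an alternative, values can be supplied at run time via -var flags or TF_VAR_-prefixed environment variables, for example (placeholder values shown):

# supply credentials at run time instead of committing them to the file
export TF_VAR_cli_usr_access_key="AKIA..."
export TF_VAR_cli_usr_secret_key="your-secret-key"
terraform plan -var="selected_region=us-east-1"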

2.4 Provider File

The provider file lists the plugin that allows for the full lifecycle management of cloud resources. In our case, we will be using the AWS provider. The block consists of the details that will be used to connect with the AWS cloud.

  • region – The AWS region in which the infrastructure will be created. This is a mandatory field that cannot be skipped and can be referenced via the variables
  • access_key and secret_key – The CLI user credentials used to set up the infrastructure. Remember that the user must have the required policies assigned to set up the infrastructure. These are optional fields; developers often prefer the profile attribute when they do not want to hardcode credentials in the Terraform files (see the sketch after the provider.tf listing)

Add the following code to the file.

provider.tf

provider "aws" {
  region = var.selected_region
  # user should have the administrator policy, or a policy following the least-privilege principle
  access_key = var.cli_usr_access_key
  secret_key = var.cli_usr_secret_key
}
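
As an alternative sketch, assuming the credentials have been configured via aws configure under the default profile in ~/.aws/credentials, the provider can reference a named profile instead of raw keys:

provider "aws" {
  region  = var.selected_region
  # read credentials from the local AWS shared credentials file
  profile = "default"
}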

2.5 Main File

The file contains the module reference. Add the following to it.

main.tf

module "name" {
  # s3 module path
  source = "./bucketcreation"
}
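
The module above relies on the defaults declared in var.tf. The inputs can also be overridden from the root module; the values below are purely illustrative:

module "bucketcreation" {
  # s3 module path
  source = "./bucketcreation"

  # override the module defaults declared in var.tf
  bucket_name = "demo-bucket"
  versioning  = true
  tags = {
    terraform = "true"
    env       = "dev"
  }
}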

3. Code Run

Navigate to the project directory containing the above scripts and open the terminal. Execute the commands below, in order, within the directory.

Commands

# step 1: initialize the new or existing terraform configuration
terraform init

# step 2: generate the execution plan
terraform plan

# step 3: build the infrastructure
# the -auto-approve flag skips interactive approval of the plan before applying
terraform apply -auto-approve

# step 4: destroy the infrastructure
# the -auto-approve flag skips interactive approval of the plan before destroying
terraform destroy -auto-approve

4. Demo

Once the Terraform script is successfully executed, head over to the AWS console to confirm that the bucket has been created on the AWS S3 dashboard. You can refer to the bucket.tf file to understand the bucket naming convention.

Fig. 1: AWS S3 bucket created via Terraform
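
The bucket can also be verified from the terminal, assuming the AWS CLI is configured with the same credentials:

# list the buckets and filter for the generated name prefix
aws s3 ls | grep mybucket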

That is all for this tutorial, and I hope the article served you with whatever you were looking for. Happy Learning and do not forget to share!

5. Summary

In this tutorial, we had an introduction to Terraform and created a simple module to provision an S3 bucket on AWS. You can download the source code from the Downloads section.

6. Download the Project

This was a tutorial on learning and implementing Terraform to create an S3 bucket on AWS.

Download
You can download the full source code of this example here: How to Create AWS S3 Bucket using Terraform

Yatin

An experienced full-stack engineer well versed in Core Java, Spring/Spring Boot, MVC, Security, AOP, frontend (Angular & React), and cloud technologies (such as AWS, GCP, Jenkins, Docker, K8s).