Terraform Over A Weekend — Deployments to AWS

Ryan Rafferty
Dec 10, 2020

I spent the weekend provisioning resources on AWS with Terraform, and thought I’d make some notes which I hope you find valuable.

What did I do and learn? I worked through https://www.udemy.com/course/terraform-associate-prep-course

I implemented concepts such as variables and maps, and went through the documentation, which is excellent!

Advanced concepts were also covered, including Workspaces: their purpose when creating Dev / Test / QA / Staging / Prod environments, the need for consistency, and how to deploy.

How Terraform environments are managed and operated with Workspaces, how to switch between the different environments, and how to tear them down.

Pushing a deployment environment from staging to production using Workspaces.

Further Notes:

Dynamic Blocks
These let you generate repeated nested blocks from a collection — a way to reuse code in Terraform instead of writing the same block over and over.
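As a sketch (the security group name and port list are made up for illustration), a dynamic block can generate several ingress rules from one list:

```hcl
variable "ingress_ports" {
  type    = list(number)
  default = [22, 80, 443]
}

resource "aws_security_group" "web" {
  name = "web-sg"

  # One ingress block is generated per element of var.ingress_ports
  dynamic "ingress" {
    for_each = var.ingress_ports
    content {
      from_port   = ingress.value
      to_port     = ingress.value
      protocol    = "tcp"
      cidr_blocks = ["0.0.0.0/0"]
    }
  }
}
```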

PROVIDER VERSIONS
“~> 2.0” — pessimistic constraint; stays within the 2.x series
“>= 2.9.0” — that version or any later one (so minor increments are allowed)

STRUCTURE TYPES
These are tuples and objects — structural types whose elements can each have a different type (unlike lists and maps, where every element must share one type).
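A quick sketch of both (the variable names and values are made up):

```hcl
# An object type: each named attribute has its own type
variable "db_config" {
  type = object({
    name     = string
    port     = number
    replicas = list(string)
  })
  default = {
    name     = "appdb"
    port     = 5432
    replicas = ["replica-a", "replica-b"]
  }
}

# A tuple fixes both the length and the type at each position
variable "subnet_pair" {
  type    = tuple([string, number])
  default = ["10.0.1.0/24", 1]
}
```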

DATA SOURCES
https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/instance

Data sources are a way for Terraform to query AWS for information — they don’t actually set anything up, but return details such as the DB servers in a specific region. (Good for auditing purposes, or for feeding values into the things you are setting up.)
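For example, a common pattern is querying for the latest AMI instead of hard-coding an ID (the name filter below is the usual Amazon Linux 2 pattern, shown as a sketch):

```hcl
# Query AWS for the most recent Amazon Linux 2 AMI — nothing is created here
data "aws_ami" "amazon_linux" {
  most_recent = true
  owners      = ["amazon"]

  filter {
    name   = "name"
    values = ["amzn2-ami-hvm-*-x86_64-gp2"]
  }
}

# Feed the query result into a resource
resource "aws_instance" "example" {
  ami           = data.aws_ami.amazon_linux.id
  instance_type = "t2.micro"
}
```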

BUILT-IN FUNCTIONS
“file” reads a file from disk — this is how you’d pass user data scripts in to EC2 instances.
“element” returns the value at a given position in a list.
“values” takes a map and returns a list containing the values of its elements.
“flatten” is for when you have multiple nested lists — it squashes them down so you have one.
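Sketched with locals (the results in the comments are what these expressions evaluate to; the AMI ID and script path are placeholders):

```hcl
locals {
  letters = ["a", "b", "c"]
  ports   = { http = 80, https = 443 }

  second = element(local.letters, 1)   # "b" — element() is zero-indexed
  plist  = values(local.ports)         # [80, 443]
  merged = flatten([[1, 2], [3, 4]])   # [1, 2, 3, 4]
}

resource "aws_instance" "web" {
  ami           = "ami-12345678"   # placeholder AMI ID
  instance_type = "t2.micro"

  # file() reads the bootstrap script off disk and passes it as user data
  user_data = file("${path.module}/bootstrap.sh")
}
```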

VERSIONING
“=” (or no operator) — exact version equality
“!=” — version not equal
“>”, “>=”, “<”, “<=” — greater than, less than, etc.
“~>” — the pessimistic constraint operator: only the rightmost version component may increment
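Putting the constraints into context, a minimal required_providers block (Terraform 0.13+ syntax) using the pessimistic operator might look like:

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 2.0"   # any 2.x release; will never jump to 3.0
    }
  }
}
```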

MODULES
Terraform modules allow you to create blocks of reusable code. In a nutshell, they are folders with Terraform code in them that Terraform scans to see what it needs. For instance, a VPC could take hundreds of lines of code, so instead you reference the module in your main.tf file. Both local and remote modules are supported.

Note — you don’t need to add a provider in the module’s db.tf file, as this is already set up in the root folder alongside main.tf.
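A sketch of a local module call (the folder name and input variable are made up):

```hcl
# main.tf in the root module — the provider is configured here, not inside the module
provider "aws" {
  region = "eu-west-1"
}

# Pulls in the Terraform code from the local ./moduledemo/vpc folder
module "vpc" {
  source   = "./moduledemo/vpc"
  vpc_name = "demo-vpc"
}
```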

API + PROVIDERS
If an API is available, you can create your own provider.
Providers use plugins, which interact with the underlying APIs.

MULTI-PROVIDER SETUP
To download the Terraform provider plugins and make them available, you need to run “terraform init”.
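When you need the same provider configured more than once (say, two AWS regions), provider aliases handle it — a sketch, with the regions and bucket name chosen just for illustration:

```hcl
provider "aws" {
  region = "eu-west-1"   # default aws provider
}

provider "aws" {
  alias  = "us"
  region = "us-east-1"   # second configuration of the same provider
}

# This bucket is created through the aliased us-east-1 provider
resource "aws_s3_bucket" "us_bucket" {
  provider = aws.us
  bucket   = "my-example-us-bucket"
}
```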

CUSTOM PLUGINS

For plugins that aren’t available from the registry, you can manually put them into these folders to make them available:
Linux/Other = ~/.terraform.d/plugins
Windows = %APPDATA%\terraform.d\plugins

LOCAL VS REMOTE EXEC — For when providers simply don’t do the job, these allow you to execute something from your machine. If user data fails, for example, Terraform will still report success, as it doesn’t check what the script did. Instead you can use local-exec with Ansible from your local machine: when Terraform deploys, local-exec runs a command locally to make the changes you want on the underlying machine. (More fine-grained.)
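A sketch of the local-exec pattern described above (the AMI ID and playbook name are made up; the command runs on the machine where you ran terraform apply):

```hcl
resource "aws_instance" "web" {
  ami           = "ami-12345678"   # placeholder AMI ID
  instance_type = "t2.micro"

  # Runs locally after the instance is created; self refers to this resource
  provisioner "local-exec" {
    command = "ansible-playbook -i '${self.public_ip},' site.yml"
  }
}
```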

TERRAFORM REGISTRY
This is where you find a wide range of modules (many of these have been verified by HashiCorp and can be trusted).

https://registry.terraform.io/modules/terraform-aws-modules

MODULE INPUTS

1. If you’re creating local variable module inputs, create the “moduledemo” folder (or whatever you choose to call it), containing a db.tf file and a main.tf file. (Do the same for any other module you want to create — i.e. VPC etc.)

2. Declare the module’s input variables inside db.tf.

3. Reference the module, passing the inputs, in the root main.tf file.
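A sketch of the two files (the variable, engine and credentials are invented; never hard-code a real password like this):

```hcl
# --- moduledemo/db.tf — the module declares an input variable ---
variable "db_name" {
  type = string
}

resource "aws_db_instance" "db" {
  identifier        = var.db_name
  engine            = "mariadb"
  instance_class    = "db.t2.micro"
  allocated_storage = 20
  username          = "admin"
  password          = "change-me"   # placeholder only — see the Vault notes further down
}

# --- main.tf (root module) — pass the value in when calling the module ---
module "db" {
  source  = "./moduledemo"
  db_name = "demo-db"
}
```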

MODULE OUTPUTS
In here you can group together all the outputs for your module.
Gives a central place to look…
Attributes are referenced like “.private_ip”.

1 — Create an outputs.tf file inside the module and declare your outputs in it.

2 — Next, navigate to the root main.tf and reference the module’s outputs there.

3 — Then when you run terraform apply, the outputs are printed at the end.
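The steps above sketched in code (this assumes the module contains an aws_instance named “db” — adjust to whatever the module actually creates):

```hcl
# --- moduledemo/outputs.tf (inside the module) ---
output "private_ip" {
  value = aws_instance.db.private_ip
}

# --- outputs.tf (root module) — surface the module's output at the top level ---
output "db_private_ip" {
  value = module.db.private_ip
}
```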

CHILD MODULES
These are essentially modules nested under the parent.

1. Create your subfolders within the db folder (i.e. if you are wanting to create 2 different databases, mariadb and mysql).

2. Next, give the entry in main.tf a specific “tunnel vision” update pointing at the child you want terraform apply to launch. (Terraform looks inside all the files to see what it needs to set up, so by specifying the child directly you are telling it exactly what to look for.)
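A sketch of that layout and the “tunnel vision” module call (folder names follow the mariadb/mysql example above):

```hcl
# Hypothetical folder layout:
#   moduledemo/db/mariadb/main.tf
#   moduledemo/db/mysql/main.tf

# Root main.tf — point directly at the one child module you want to launch
module "mariadb" {
  source = "./moduledemo/db/mariadb"
}
```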

HOW TO PASS IN ENVIRONMENT VARIABLES
First, add an input variable to main.tf.

Once that is done, from the CLI enter “export TF_VAR_vpcname=envvpc”. (On Windows the equivalent command is “setx”.)

Environment variables are useful for passing in values or secrets etc. that you don’t want in the code.
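The matching variable declaration might look like this (the VPC resource and CIDR are just for illustration) — with no default, Terraform requires a value, which TF_VAR_vpcname supplies:

```hcl
# main.tf — no default, so a value must come from somewhere (e.g. TF_VAR_vpcname)
variable "vpcname" {
  type = string
}

resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
  tags = {
    Name = var.vpcname   # picked up from the environment if exported
  }
}
```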

CLI VARIABLES
Another way to pass in variables is directly on the command line: terraform plan -var="vpcname=cliname"

ORDER OF VARIABLES — Priority (lowest to highest; later sources override earlier ones)

  • Environment variables
  • terraform.tfvars
  • terraform.tfvars.json
  • *.auto.tfvars (or *.auto.tfvars.json)
  • Any -var or -var-file options

USING TFVARS FILE
(terraform.tfvars is loaded automatically — if a variable isn’t supplied another way, Terraform essentially defaults to the value found here.)

TO UNSET ENV VARIABLES USING THE CLI
Type in the CLI “unset TF_VAR_vpcname”

SETTING AUTO TFVARS
Any file whose name ends in .auto.tfvars (e.g. dev.auto.tfvars) is also loaded automatically, and takes precedence over terraform.tfvars.

MULTIPLE VALUE FILES
Running terraform plan -var-file=prod.tfvars will pick up the values from the prod.tfvars file (this is useful for setting different values when creating prod / non-prod environments etc.).
It’s good to use multiple tfvars files for different AMIs, prod / non-prod environments, and so on.
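A sketch of such a file (the variable names and AMI ID are hypothetical):

```hcl
# --- prod.tfvars ---
vpcname = "prod-vpc"
ami     = "ami-0prod00000example"   # placeholder AMI ID

# Select it at plan/apply time:
#   terraform plan -var-file=prod.tfvars
```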

TERRAFORM WORKFLOW
Write — use Terraform Cloud as your dev environment (it can store variables and state for you).

Plan — in Terraform Cloud, when a PR is raised, terraform plan is run.

Create — before merging, a second plan is run and must be approved before the create happens.

TO VALIDATE YOUR CODE BEFOREHAND
Type in “terraform validate”

TERRAFORM EXTRA COMMANDS
terraform fmt — this formats your code to the canonical style, nicely readable within your IDE.

TERRAFORM TAINTING
If you’re launching a massive infrastructure and some things fail, you can mark resources as tainted so they are destroyed and recreated on the next apply — e.g. run “terraform taint aws_vpc.myvpc2”

TERRAFORM UNTAINT
Run “terraform untaint aws_vpc.myvpc2”

TERRAFORM IMPORT
Bring your click-ops resources under Terraform by copying the resource ID and typing in the command “terraform import aws_vpc.vpcimport vpc-xxxxxxxxxxxxxxxxx”
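One thing to note: terraform import needs a matching resource block in your code to import into — a sketch (the CIDR here is a placeholder you’d replace with the real VPC’s value):

```hcl
# A stub resource block for the existing VPC — import attaches the real VPC to it
resource "aws_vpc" "vpcimport" {
  cidr_block = "10.0.0.0/16"   # set this to the existing VPC's actual CIDR
}

# Then run:
#   terraform import aws_vpc.vpcimport vpc-xxxxxxxxxxxxxxxxx
```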

TERRAFORM WORKSPACES
  • To list your workspaces (the current one is starred), type in “terraform workspace list”
  • To see the current workspace you are in, type in “terraform workspace show”
  • To create a new workspace (i.e. dev, test, staging, prod etc.), type in “terraform workspace new dev”
  • To swap the env you are in, type in “terraform workspace select dev”
  • To delete a workspace, type in “terraform workspace delete dev”
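Inside your code, the current workspace name is available as terraform.workspace, so one configuration can adapt per environment — a sketch:

```hcl
# Tag resources with the active workspace so dev/test/prod are distinguishable
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
  tags = {
    Name = "vpc-${terraform.workspace}"   # e.g. vpc-dev, vpc-prod
  }
}
```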

TERRAFORM STATE LIST
To show a list of the resources tracked inside your state, type in “terraform state list”

TERRAFORM STATE PULL
You are not limited to having your state on your local machine and CAN have it stored in the cloud (S3 bucket etc.).
To pull it down, type in “terraform state pull”. (Note — if you do this with state on your local machine there probably isn’t any point, as you can just read the file anyway.)

RENAMING RESOURCES INSIDE YOUR STATE FILE
Be careful with this command, as messing around with it can rename resources and introduce a world of pain (-backup-out will make a backup) — type in “terraform state mv aws_vpc.myvpc2 aws_vpc.2myvpc -backup-out=

DELETING FROM THE AWS CONSOLE AFTER YOU DO AN IMPORT TO TERRAFORM
Doing this will leave the resource still listed in your state file — to remove it, type in “terraform state rm aws_vpc.importvpc”

DEBUGGING TERRAFORM
To enable debug logging, type in “export TF_LOG=“ followed by a log level.
(This produces verbose output which you can attach when asking for help on the forums etc.)

The levels are ERROR, WARN, INFO, DEBUG and TRACE (the most verbose logging you can create).

SECURING TERRAFORM KEYS
DO NOT hard-code access keys in your configuration — rather use environment variables, the CLI, or the Vault provider.

WHAT IS SENTINEL
Security policy as code — it defines policy about what may be created. When creating something in production, for example, you don’t want port 22 open; Sentinel would essentially make sure the code adheres to that policy.
https://www.hashicorp.com/sentinel/

PASSING IN SECRETS, CREDENTIALS ETC.
To do this you can use secrets injection.
Vault can store secrets and provides access to retrieve them.
https://www.vaultproject.io/

TO WRITE A SECRET IN VAULT
“vault kv put secret/hello foo=world”

TO GET / RETRIEVE A SECRET IN VAULT (grabs it behind the scenes via an API)
“vault kv get secret/hello”

TO REMOVE AND CLEAN UP SECURITY PASSWORDS ETC. FROM TERRAFORM FILES
Hard-coding credentials in your .tf files is a big NO NO……

You need to pull in Vault as a provider so Terraform can talk to it.

To eliminate the need to hard-code the username and password, remove the values and replace them with data-source references — e.g. “data.dbuser.data” for the username and “data.dbpassword.data” for the password.
Data sources are the queries used to ask Vault for the secrets.

The default key in Vault is “value”, and this is how the secret is pulled.
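A hedged sketch of the pattern using the Vault provider’s vault_generic_secret data source (the secret path and key names are assumptions — match them to whatever you stored with vault kv put):

```hcl
provider "vault" {
  # Address and token are usually supplied via VAULT_ADDR / VAULT_TOKEN env vars
}

# Query Vault for the secret at this path — nothing is created, only read
data "vault_generic_secret" "db" {
  path = "secret/db"
}

resource "aws_db_instance" "db" {
  identifier        = "demo-db"
  engine            = "mariadb"
  instance_class    = "db.t2.micro"
  allocated_storage = 20

  # No hard-coded credentials — values come from Vault's data map
  username = data.vault_generic_secret.db.data["username"]
  password = data.vault_generic_secret.db.data["password"]
}
```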

TERRAFORM STATE FILE
Keeps track of everything you create, so Terraform knows what it is managing.

TERRAFORM REFRESH
Used for drift — e.g. when you delete something outside Terraform, the old version is still stored in the state file.
To reconcile, type in “terraform refresh” and then “terraform plan” — this will get state and reality back in sync.
