
Terraform Remote State on GCP

Michael Hannecke

Step-by-step guide to set up GCP and Terraform to store remote state information in a GCP bucket.



This is the first of three posts about Terraform remote state on Azure, AWS, and GCP.


Feel free to have a look at the other posts as well if you're interested in the other cloud provider setups:


Terraform Remote State on Azure



You can find the source code here: https://github.com/bluetuple/terraform-gcp


. . .


Introduction

Harnessing the power of infrastructure-as-code through Terraform is transformative, but when working in a team, managing consistent state files becomes paramount.

Google Cloud Platform (GCP) offers an integrated solution for this very challenge: Terraform remote state storage.

In this blog post, we’ll walk you through the step-by-step process of setting up Terraform’s remote state on GCP, ensuring your infrastructure projects are both collaborative and coherent.

Whether you’re new to Terraform or looking to optimize your current workflows, this guide will be your roadmap to effective state management on GCP.

. . .

A. Initial gcloud Configuration


Set the GCP Cloud Environment


1. Create Environment Variable File

As we have to reference a couple of environment variables, it is recommended to create a hidden file containing all required variables.

Don't forget to add that secrets file to your `.gitignore` if you're planning to push the code to a git repo!


We'll directly use the Terraform notation for environment variables, so that our Terraform code will be able to read the information as well later on.

Every environment variable starting with `TF_VAR_` will be available to the Terraform code.

Within your local terminal:

Export environment variables
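A minimal sketch of such a file; the file name and all values are assumptions, so adapt them to your project:

```bash
# .env-gcp-sandbox (hypothetical name) -- remember to list it in .gitignore!
# Everything prefixed with TF_VAR_ is picked up by Terraform automatically.
export TF_VAR_project_id="my-sandbox-project"     # your GCP project ID
export TF_VAR_region="europe-west3"               # a region close to you
export TF_VAR_bucket_name="my-sandbox-tfstate"    # globally unique bucket name
```

Load the variables into your current shell with `source .env-gcp-sandbox`.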


Next, we have to initialize the gcloud CLI with the correct settings:

Set gcloud parameter
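Assuming the variables from the file above, this could look like:

```bash
# Point the gcloud CLI at the project we want to work in
gcloud config set project "$TF_VAR_project_id"
```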


We now have to set our own user credentials to access the required APIs in the following steps. You'll be forwarded to the GCP login screen.



Log in to your GCP project
Set a region close to you
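A sketch of these two steps; the region value is an assumption:

```bash
# Log in with your own user; a browser window opens with the GCP login screen
gcloud auth login

# Application Default Credentials, which Terraform's Google provider will use
gcloud auth application-default login

# Set a default region close to you
gcloud config set compute/region "$TF_VAR_region"
```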


2. Create a service account for the current project

Follow your company's naming conventions; we recommend a pattern like:

sa-<project_name>-<stage>-tf


Create a service account in your GCP project
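For example, for a project named "sandbox" in the "dev" stage (account name, description, and display name are assumptions):

```bash
gcloud iam service-accounts create sa-sandbox-dev-tf \
  --description="Service account for Terraform" \
  --display-name="sa-sandbox-dev-tf"
```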


Note the two leading dashes `--` in front of `description` and `display-name`.

The newly created account will be listed in the Google Cloud console under IAM & Admin -> Service Accounts.

Add the service account name to the environment variable file; this will come in handy later.



Export service account name
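The service account's e-mail address follows the pattern `<name>@<project-id>.iam.gserviceaccount.com`; a sketch with the assumed names from above:

```bash
export TF_VAR_service_account="sa-sandbox-dev-tf@${TF_VAR_project_id}.iam.gserviceaccount.com"
```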


3. Enable required APIs


When using a freshly created GCP project, we have to enable a couple of APIs. You can easily do this via the Google Cloud console under "APIs & Services":

At a minimum, we have to enable the **IAM Service Account Credentials API** and the **Cloud Resource Manager API** within the project.
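If you prefer the terminal over the console, the same two APIs can be enabled with gcloud:

```bash
gcloud services enable \
  iamcredentials.googleapis.com \
  cloudresourcemanager.googleapis.com
```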


4. Add necessary roles for the newly created service account

We now need to assign the necessary roles and permissions. For ease of this example, we will go with the Editor role for the service account. In a production environment it is recommended to follow the least-privilege principle and reduce permissions as far as possible. We will elaborate on this a bit more in a later blog post.


Provide privileges to the account
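A sketch using the environment variables set up earlier; the broad Editor role is for this sandbox only:

```bash
gcloud projects add-iam-policy-binding "$TF_VAR_project_id" \
  --member="serviceAccount:${TF_VAR_service_account}" \
  --role="roles/editor"
```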


For this sandbox example we will impersonate the service account to make our Terraform changes. In a production environment you should use service account credentials and store them in a credentials vault on GCP. We will dig into this in another article.

For impersonation, we need to fetch the existing IAM policy for the service account and store it in a local policy.json file.


Get existing policy for the service account
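Again assuming the service account variable from above:

```bash
# Dump the current IAM policy of the service account to a local file
gcloud iam service-accounts get-iam-policy "$TF_VAR_service_account" \
  --format=json > policy.json
```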


Modify policy.json to add yourself as a member of the role `roles/iam.serviceAccountTokenCreator`. Remember to keep the bindings that already exist:



policy.json
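A sketch of the modified file; the e-mail address, etag, and version are placeholders, and any bindings your file already contains must stay in place:

```json
{
  "bindings": [
    {
      "members": [
        "user:you@example.com"
      ],
      "role": "roles/iam.serviceAccountTokenCreator"
    }
  ],
  "etag": "BwYB1234abc=",
  "version": 1
}
```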


Now we must update the policies:
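Write the modified policy back to the service account:

```bash
gcloud iam service-accounts set-iam-policy "$TF_VAR_service_account" policy.json
```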




B. Terraform Configuration

After completing the initial GCP configuration, we now need to set up the Terraform files defining the infrastructure we're planning to deploy on GCP.


We will use the following general terraform file structure:


Folder structure
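A layout along these lines, based on the files discussed below:

```
gcp/
└── sandbox/
    ├── .backend-config   # backend values, excluded via .gitignore
    ├── main.tf           # provider and backend configuration
    ├── variables.tf      # variable definitions
    └── bucket.tf         # state bucket definition
```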


For this example we will use `../gcp/sandbox` as our project folder.


main.tf

The main.tf is, at least from a reader's perspective, the entry point of our Terraform setup. Terraform itself reads all files ending with `.tf` in no particular order and, while processing, figures out the correct order in which to apply the configuration.

In the main.tf file we will initialize the Google provider. Use the following as a starting point.

This is the first version of main.tf; we will add the backend configuration later:


Initial main.tf
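A minimal sketch, assuming the variables defined in variables.tf below:

```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.0"
    }
  }
}

# Initialize the Google provider; we impersonate the service account
# created during the gcloud configuration above.
provider "google" {
  project                     = var.project_id
  region                      = var.region
  impersonate_service_account = var.service_account
}
```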


variables.tf

For all variable definitions we use the file `variables.tf` as follows:


variables.tf
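A sketch matching the TF_VAR_ environment variables exported earlier (the variable names are assumptions):

```hcl
variable "project_id" {
  description = "GCP project ID"
  type        = string
}

variable "region" {
  description = "Default GCP region"
  type        = string
}

variable "service_account" {
  description = "Service account to impersonate"
  type        = string
}

variable "bucket_name" {
  description = "Globally unique name of the state bucket"
  type        = string
}
```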


bucket.tf

This declaration file will hold all storage bucket definitions. For our use case we will define the bucket we want to use for storing the remote state.

We will enable versioning and prohibit public access (`versioning { enabled = true }` and `public_access_prevention = "enforced"`).


Terraform GCP bucket definition
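A sketch of the bucket resource with the two settings mentioned above; the resource label and the uniform access setting are assumptions:

```hcl
resource "google_storage_bucket" "tfstate" {
  name                        = var.bucket_name
  location                    = var.region
  public_access_prevention    = "enforced"
  uniform_bucket_level_access = true

  # Keep older state versions so we can roll back if something goes wrong
  versioning {
    enabled = true
  }
}
```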


We must first create the bucket before we can use it for storing the remote state. Of course, we could have created the bucket via the Google Cloud console or CLI, but the idea is to use as much infrastructure as code as possible.

Now we must initialize Terraform and create the bucket. Run the following commands in the terminal and make sure no errors are thrown.




Terraform init to create bucket
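The usual three-step workflow:

```bash
terraform init    # download the Google provider
terraform plan    # review the plan: one bucket to be created
terraform apply   # create the bucket (state is still stored locally)
```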


We should now have a bucket. You can check in the Google Cloud console under Cloud Storage -> Buckets.



Define bucket to be used as backend

We now have to tell Terraform where to find and store the backend state.

For production scenarios you might want to place the backend information and version constraints in separate files (backend.tf, version.tf), but for simplicity in our sandbox environment we'll keep all settings dealing with remote state in `main.tf`.

Add the following lines to your main.tf:


Bucket definition
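A sketch with assumed values; note that a backend block cannot reference variables, so the values are hard-coded here:

```hcl
terraform {
  backend "gcs" {
    bucket = "my-sandbox-tfstate"   # the bucket created above
    prefix = "terraform/state"      # path inside the bucket
  }
}
```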


With the backend block added, your main.tf should now look like this:



main.tf final
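Merging the provider setup and the backend block, a sketch of the full file:

```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.0"
    }
  }

  backend "gcs" {
    bucket = "my-sandbox-tfstate"
    prefix = "terraform/state"
  }
}

provider "google" {
  project                     = var.project_id
  region                      = var.region
  impersonate_service_account = var.service_account
}
```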


For security reasons we prefer to put this information in a separate file, which can then be excluded from being pushed to a GitHub repository.

Create a file named `.backend-config` or whatever suits you; only the leading dot matters, as it marks the file as hidden. Put the following declaration with your individual values into it:

Make sure to add this file to your .gitignore if you don't want to push it to the repository.

Backend
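A sketch of `.backend-config` with assumed values; the keys use the same names as in the backend "gcs" block:

```hcl
bucket = "my-sandbox-tfstate"
prefix = "terraform/state"
```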


With the values moved out of main.tf, your final main.tf should now look like this:
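With the values supplied externally, the backend block stays empty (again a sketch):

```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 4.0"
    }
  }

  # Values are injected at init time via -backend-config=.backend-config
  backend "gcs" {}
}

provider "google" {
  project                     = var.project_id
  region                      = var.region
  impersonate_service_account = var.service_account
}
```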




Save everything, then run `terraform fmt` and `terraform validate`. If there are no typos, everything should be fine and ready for the next step.



Before we can run plan/apply to activate our changes, we must run `terraform init` first, which will migrate the state from local to remote storage.


We must provide the backend config as a parameter:
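Assuming the hidden file created above:

```bash
terraform init -backend-config=.backend-config
```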


You must acknowledge the switch from the local to the remote backend; after this, you're done!

From now on, any `terraform plan` or `terraform apply` will be tracked in the remote state, independently of your local client.
