April 5, 2018 | Cloud, DevOps, Hashicorp, Terraform Provider
Google Cloud Functions is the Google Cloud Platform (GCP) function-as-a-service offering. It lets you execute code in response to event triggers: HTTP requests, Pub/Sub messages and Storage changes. While it currently supports only Node.js code for execution, it has proved very useful for running low-frequency operational tasks and other batch jobs in GCP.
When using HashiCorp Terraform to create your infrastructure, it was previously not possible to create Google Cloud Functions. This was painful: you had to create Cloud Functions with the gcloud tool, losing the benefits of Terraform such as declarative syntax and convergence onto state. At best, you ended up with two codebases (Terraform and gcloud) rather than one.
But times have changed! I recently implemented Cloud Functions in the Terraform Google Cloud provider, and will demonstrate here how you can now create Google Cloud Functions using Terraform.
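The examples below assume the Google Cloud provider is already configured. A minimal sketch (Terraform 0.11-style syntax, with placeholder values matching those used later) might look like:

```hcl
# Minimal Google Cloud provider configuration.
# The project and region values are placeholders -- substitute your own.
provider "google" {
  project = "[GCPProjectName]"
  region  = "us-central1"
}
```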
So let’s look at the Terraform Cloud Function resource definition:
resource "google_cloudfunctions_function" "test" {
  name                = "[FunctionName]"
  entry_point         = "helloGET"
  available_memory_mb = 128
  timeout             = 61
  project             = "[GCPProjectName]"
  region              = "us-central1"

  # Only one trigger type may be set per function:
  trigger_http = true
  # trigger_topic  = "[PubSubTopic]"
  # trigger_bucket = "[StorageBucketName]"

  source_archive_bucket = "${google_storage_bucket.bucket.name}"
  source_archive_object = "${google_storage_bucket_object.archive.name}"

  labels {
    deployment_name = "test"
  }
}
As usual with Terraform, this resource definition is straightforward and declarative, but the parameters deserve a closer look.
You specify the function name (name), the function that acts as the entry_point when the Cloud Function is invoked, the memory available to the function in MB (available_memory_mb), how long the Cloud Function has to execute in seconds (timeout), the GCP project name (project) and the GCP region (region).
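For an HTTP-triggered function, the resource also exports the invocation URL, which you can surface as a Terraform output. A small sketch, assuming the resource is named test as above:

```hcl
# Expose the HTTPS endpoint of the deployed function.
# Only meaningful when trigger_http = true.
output "function_url" {
  value = "${google_cloudfunctions_function.test.https_trigger_url}"
}
```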
The less obvious parameters are the trigger settings (trigger_http, trigger_topic and trigger_bucket, of which only one may be set) and the source archive settings, which point to a Storage bucket and object holding the zipped function code. The supporting resources look like this:
resource "google_storage_bucket" "bucket" {
  name = "cloudfunction-deploy-test1"
}

data "archive_file" "http_trigger" {
  type        = "zip"
  output_path = "${path.module}/files/http_trigger.zip"

  source {
    content  = "${file("${path.module}/files/http_trigger.js")}"
    filename = "index.js"
  }
}

resource "google_storage_bucket_object" "archive" {
  name       = "http_trigger.zip"
  bucket     = "${google_storage_bucket.bucket.name}"
  source     = "${path.module}/files/http_trigger.zip"
  depends_on = ["data.archive_file.http_trigger"]
}
Here I am creating a Storage bucket, zipping up the function code with the archive_file data source, and uploading the archive into the GCP Storage bucket.
This sample function, called helloGET (remember to set this as the entry_point), responds to HTTP requests:
/**
 * HTTP Cloud Function.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.helloGET = function helloGET (req, res) {
  res.send(`Hello ${req.body.name || 'World'}!`);
};
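Before deploying, you can smoke-test this handler locally with plain Node.js by passing in mock request and response objects. The mocks below are hypothetical stand-ins for the Express-style objects Cloud Functions supplies at runtime:

```javascript
// Same logic as the helloGET handler above.
const helloGET = function helloGET (req, res) {
  res.send(`Hello ${req.body.name || 'World'}!`);
};

// Mock request/response objects standing in for the real Cloud Functions ones.
const req = { body: { name: 'Terraform' } };
let sent;
const res = { send: (msg) => { sent = msg; } };

helloGET(req, res);
console.log(sent); // Hello Terraform!
```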
Here we showed how to get started creating Google Cloud Functions with Terraform. In general, Terraform makes defining infrastructure as code easy and intuitive, and deploying functions is a great fit for it. Your next steps might be to see what else you can define in Terraform; alternatively, for more information about this resource, please see the documentation.
This blog is written exclusively by the OpenCredo team. We do not accept external contributions.