
Multi-environment service orchestrations

In a previous post, I showed how to use a GitOps approach to manage the deployment lifecycle of a service orchestration. This approach makes it easy to deploy changes to a workflow in a staging environment, run tests against it, and gradually roll out these changes to the production environment. 

While GitOps helps to manage the deployment lifecycle, it's not enough on its own. Sometimes you need to change the workflow itself before deploying it to different environments, which means you need to design workflows with multiple environments in mind.

For example, instead of hardcoding the URLs that the workflow calls, you should substitute staging or production URLs depending on where the workflow is being deployed.
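
For instance, a workflow with environment-specific URLs baked in might look like the following minimal sketch (the function URLs mirror the ones used later in this post), and it would have to be edited by hand before every deployment:

main:
  steps:
    - init:
        assign:
          # Hardcoded prod URLs: deploying this workflow to staging means editing the source
          - url1: "https://us-central1-projectid.cloudfunctions.net/func1-prod"
          - url2: "https://us-central1-projectid.cloudfunctions.net/func2-prod"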

Let’s explore three different ways of replacing URLs in a workflow.

Option 1: Pass URLs as runtime arguments


In the first option, you define URLs as runtime arguments and use them whenever you need to call a service:

main:
  params: [args]
  steps:
    - init:
        assign:
          - url1: ${args.urls.url1}
          - url2: ${args.urls.url2}
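
Later steps can then reference these variables wherever the workflow calls a service. A minimal sketch of how such a call might look (the call_func1 and return_result step names are illustrative and not part of workflow1.yaml):

main:
  params: [args]
  steps:
    - init:
        assign:
          - url1: ${args.urls.url1}
    - call_func1:
        # Workflows standard library HTTP call using the runtime-provided URL
        call: http.get
        args:
          url: ${url1}
        result: func1_result
    - return_result:
        return: ${func1_result.body}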

You can deploy workflow1.yaml as an example:

gcloud workflows deploy multi-env1 --source workflow1.yaml

Run the workflow in the staging environment with staging URLs:

gcloud workflows run multi-env1 --data='{"urls":{"url1": "https://us-central1-projectid.cloudfunctions.net/func1-staging", "url2": "https://us-central1-projectid.cloudfunctions.net/func2-staging"}}'

Then run the workflow in the prod environment with prod URLs:

gcloud workflows run multi-env1 --data='{"urls":{"url1": "https://us-central1-projectid.cloudfunctions.net/func1-prod", "url2": "https://us-central1-projectid.cloudfunctions.net/func2-prod"}}'

Note: These runtime arguments can also be passed when triggering the workflow with the API, client libraries, or scheduled triggers, but not when triggering with Eventarc.
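
For example, a Cloud Scheduler job can trigger the staging workflow with staging URLs on a schedule. A rough sketch, assuming a hypothetical job name, schedule, project ID, and a service account that is allowed to execute workflows:

gcloud scheduler jobs create http multi-env1-staging-job \
  --schedule="0 * * * *" \
  --uri="https://workflowexecutions.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/workflows/multi-env1/executions" \
  --message-body='{"argument": "{\"urls\": {\"url1\": \"https://us-central1-projectid.cloudfunctions.net/func1-staging\", \"url2\": \"https://us-central1-projectid.cloudfunctions.net/func2-staging\"}}"}' \
  --oauth-service-account-email="SERVICE_ACCOUNT@PROJECT_ID.iam.gserviceaccount.com"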

Option 2: Use Cloud Build to deploy multiple versions


In the second option, you use Cloud Build to deploy multiple versions of the workflow with the appropriate staging and prod URLs replaced at deployment time.

Run setup.sh to enable required services and grant necessary roles.
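
If you just want the gist of it, the script boils down to enabling the Cloud Build and Workflows APIs and letting the Cloud Build service account deploy workflows. A sketch of the idea (the actual setup.sh in the repository may differ):

PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')

# Enable the services used in this option
gcloud services enable cloudbuild.googleapis.com workflows.googleapis.com

# Allow the Cloud Build service account to deploy workflows
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role "roles/workflows.admin"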

Define a YAML (see workflow2.yaml for an example) that has placeholder values for URLs:

main:
  steps:
    - init:
        assign:
          - url1: REPLACE_url1
          - url2: REPLACE_url2

Define a cloudbuild.yaml that has a step to replace the placeholder URLs and a deployment step:

steps:
- id: 'replace-urls'
  name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: bash
  args:
    - -c
    - |
      sed -i -e "s~REPLACE_url1~$_URL1~" workflow2.yaml
      sed -i -e "s~REPLACE_url2~$_URL2~" workflow2.yaml
- id: 'deploy-workflow'
  name: 'gcr.io/cloud-builders/gcloud'
  args: ['workflows', 'deploy', 'multi-env2-$_ENV', '--source', 'workflow2.yaml']

Deploy the workflow in the staging environment with staging URLs:

gcloud builds submit --config cloudbuild.yaml --substitutions=_ENV=staging,_URL1="https://us-central1-projectid.cloudfunctions.net/func1-staging",_URL2="https://us-central1-projectid.cloudfunctions.net/func2-staging"

Deploy the workflow in the prod environment with prod URLs:

gcloud builds submit --config cloudbuild.yaml --substitutions=_ENV=prod,_URL1="https://us-central1-projectid.cloudfunctions.net/func1-prod",_URL2="https://us-central1-projectid.cloudfunctions.net/func2-prod"

Now, you have two workflows ready to run in staging and prod environments:

gcloud workflows run multi-env2-staging
gcloud workflows run multi-env2-prod

Option 3: Use Terraform to deploy multiple versions


In the third option, you use Terraform to deploy multiple versions of the workflow with the appropriate staging and prod URLs replaced at deployment time.

Define a YAML (see workflow3.yaml for an example) that has placeholder values for URLs:

main:
  steps:
    - init:
        assign:
          - url1: ${url1}
          - url2: ${url2}

Define main.tf that creates staging and prod workflows:

variable "project_id" {
  type = string
}

variable "url1" {
  type = string
}

variable "url2" {
  type = string
}

locals {
  env = ["staging", "prod"]
}

# Define and deploy staging and prod workflows
resource "google_workflows_workflow" "multi-env3-workflows" {
  for_each = toset(local.env)

  name            = "multi-env3-${each.key}"
  project         = var.project_id
  region          = "us-central1"
  source_contents = templatefile("${path.module}/workflow3.yaml", { url1 : "${var.url1}-${each.key}", url2 : "${var.url2}-${each.key}" })
}
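
The file above assumes the Google provider is configured elsewhere; if it is not, a minimal provider configuration could look like this (shown separately from main.tf purely for illustration):

terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
}

provider "google" {
  project = var.project_id
  region  = "us-central1"
}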

Initialize Terraform:

terraform init

Check the planned changes:

terraform plan -var="project_id=YOUR-PROJECT-ID" -var="url1=https://us-central1-projectid.cloudfunctions.net/func1" -var="url2=https://us-central1-projectid.cloudfunctions.net/func2"

Deploy the workflow in the staging environment with staging URLs and the prod environment with prod URLs:

terraform apply -var="project_id=YOUR-PROJECT-ID" -var="url1=https://us-central1-projectid.cloudfunctions.net/func1" -var="url2=https://us-central1-projectid.cloudfunctions.net/func2"

Now, you have two workflows ready to run in staging and prod environments:

gcloud workflows run multi-env3-staging
gcloud workflows run multi-env3-prod

Pros and cons

At this point, you might be wondering which option is best. 

Option 1 has a simpler setup (a single workflow deployment) but a more complicated execution, since you need to pass in the URLs for every execution. If you have a lot of URLs, executions can get verbose with all the runtime arguments. Also, you can't tell which URLs the workflow will call until you actually execute it.

Option 2 has a more complicated setup, with multiple workflow deployments through Cloud Build. However, each deployed workflow contains the URLs it calls, which makes for a simpler execution and debugging experience.
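
For example, you can check exactly which URLs a deployed workflow will call without executing it; assuming the staging deployment from Option 2:

gcloud workflows describe multi-env2-staging --format="value(sourceContents)"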

Option 3 is pretty much the same as Option 2 but for Terraform users. If you’re already using Terraform, it probably makes sense to also rely on Terraform to replace URLs for different environments. 

This post provided examples of how to implement multi-environment workflows. If you have questions or feedback, feel free to reach out to me on Twitter @meteatamel.

Related Article

GitOps your service orchestrations

This blog post describes how to set up a simple Git-driven development, testing, and deployment pipeline for Workflows using Cloud Build.
