# Optimizing deployments
As a codebase grows, so does the complexity of the deployment process. This document outlines some best practices for optimizing deployments as you run into bottlenecks.
## Terraform and Artifacts
By default, Mach Composer on the Evolve Platform looks for referenced Terraform code in the same repository as the configuration. In smaller codebases this keeps overhead low and makes running Terraform locally straightforward.

However, as a codebase grows, so does the amount of Terraform code, and components may be deployed from different git branches or tags. Mach Composer then needs to check out the repository for each component, which can be slow.
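For reference, in the default setup a component typically points at Terraform code through a git source that Mach Composer checks out per component. The repository URL, path, and ref below are illustrative, not taken from a real configuration:

```yaml
components:
  - name: account-commercetools
    # Illustrative git source; Mach Composer checks this out for each component
    source: git::https://github.com/<your-org>/<your-repo>.git//components/account-commercetools/terraform
    version: <branch-or-tag>
```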
To avoid this bottleneck, we recommend using a CI/CD pipeline to build and store the Terraform artifacts in a cloud storage environment, and having Mach Composer read from there instead. This takes a few steps:
- Add an additional command to your `build.ts` file to zip the Terraform code at build time:

  ```ts
  await buildScripts({
    docker: async () => {
      await buildDocker({
        entryPoint: "./src/index.ts",
        outdir: "dist/",
      });
    },
    terraform: async () => {
      // Bundle the component's terraform/ directory into a single archive
      await bundleTerraform({
        archiveName: "dist/terraform.zip",
        source: "terraform",
      });
    },
  });
  ```
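After a build you can verify locally that the archive contains the expected Terraform files. The `tsx` runner here is an assumption; use whatever you normally run `build.ts` with:

```shell
# Run the build script, then list the archive's contents
npx tsx build.ts
unzip -l dist/terraform.zip
```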
- Push the resulting `terraform.zip` to a cloud storage environment, such as AWS S3 or Google Cloud Storage. This is most easily done as part of the CI/CD pipeline. For example, in GitHub Actions:
  ```yaml
  - name: Configure AWS Credentials
    uses: aws-actions/configure-aws-credentials@v2
    with:
      role-to-assume: <your-role>
      aws-region: eu-central-1
  - name: Upload
    shell: bash
    run: aws s3 cp components/my-service/dist/terraform.zip s3://my-service-terraform/${{ github.sha }}.zip
  ```
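If you use Google Cloud Storage instead, an equivalent upload step might look like the following. The authentication action and bucket name are assumptions for illustration; adapt them to your project's setup:

```yaml
- name: Authenticate to Google Cloud
  uses: google-github-actions/auth@v2
  with:
    workload_identity_provider: <your-provider>
    service_account: <your-service-account>
- name: Upload
  shell: bash
  run: gcloud storage cp components/my-service/dist/terraform.zip gs://my-service-terraform/${{ github.sha }}.zip
```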
- Update the Mach Composer configuration to read from the cloud storage environment:

  ```yaml
  components:
    - name: account-commercetools
      source: s3://my-service-terraform/
  ```
With this in place, Mach Composer downloads the prebuilt artifact instead of checking out the repository, which significantly shortens the init phase.
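Because the upload step above stores each build under the commit SHA, a site can also be pinned to one specific artifact rather than the bucket as a whole. The key layout below is hypothetical; check the Mach Composer documentation for how `source` and `version` are resolved for storage backends:

```yaml
components:
  - name: account-commercetools
    # Hypothetical: point at the artifact uploaded for a specific commit
    source: s3://my-service-terraform/<commit-sha>.zip
```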