How to Set Up CI/CD for Monorepo Using Buildkite and AWS: Complete Guide
By Braincuber Team
Published on April 4, 2026
What You'll Learn:
- What a monorepo is and the CI/CD challenges it presents
- How to deploy Buildkite Elastic CI Stack on AWS with autoscaling EC2 instances
- How to configure GitHub webhooks to trigger Buildkite CI pipelines
- How to use git diff to dynamically trigger appropriate pipelines when sub-projects change
- Pull Request workflow: detecting changes, running tests, and reporting status to GitHub
- Merge workflow: deploying to staging, manual production release approval
- How to automate all infrastructure setup using bash scripts
A monorepo is a single Git repository that holds the code for multiple projects. This setup is flexible: you can manage various services and frontends in one place, and you avoid the hassle of tracking changes across multiple repositories and updating dependencies as projects evolve. Monorepos do, however, come with challenges, particularly around Continuous Integration: as individual sub-projects change, we need to identify which ones changed so we can build and deploy only those. In this tutorial, we will set up a full CI/CD pipeline for a monorepo using Buildkite, GitHub, and AWS. This step-by-step guide shows exactly how to build automated, dynamic pipelines that only build what changed.
Understanding the Buildkite Workflow
The Buildkite workflow consists of Pipelines and Steps. The top-level containers for modeling and defining workflows are called Pipelines. Steps run individual tasks or commands. We will set up three types of pipelines: pull-request, merge, and deploy.
Pull Request Workflow
Creating a new Pull Request in GitHub triggers the pull-request pipeline in Buildkite. This pipeline runs git diff to identify which folders (projects) within the monorepo have changed. If it detects changes, it dynamically triggers the appropriate Pull Request pipeline defined for that project. Buildkite reports the status of each pipeline back to GitHub as status checks.
Merge Workflow
The Pull Request is merged when all status checks in GitHub pass. Merging the Pull Request triggers the merge pipeline in Buildkite. Like the pull-request pipeline, the merge pipeline identifies the projects that have changed and triggers the corresponding merge pipeline for each changed service. That service's merge pipeline first deploys the changes to the staging environment; once the deployment to staging is complete, the production deployment is released manually.
Project Structure
Here is the final project structure we will build. This structure separates CI configuration at the root level and per-service level, with automation scripts in the bin directory.
buildkite-monorepo-example/
├── .buildkite/
│ ├── diff (git diff script)
│ ├── merge.yml (merge pipeline config)
│ ├── pull-request.yml (PR pipeline config)
│ └── pipelines/
│ ├── deploy.json (deploy pipeline template)
│ ├── merge.json (merge pipeline template)
│ └── pull-request.json (PR pipeline template)
├── bin/
│ ├── create-pipeline (pipeline creation script)
│ ├── create-secrets-bucket (S3 + SSH key setup)
│ ├── deploy-ci-stack (CloudFormation deployment)
│ └── stack-config (CI stack parameters)
├── foo-service/
│ ├── .buildkite/
│ │ ├── deploy.yml
│ │ ├── merge.yml
│ │ └── pull-request.yml
│ └── bin/
│ └── deploy
└── bar-service/
├── .buildkite/
│ ├── deploy.yml
│ ├── merge.yml
│ └── pull-request.yml
└── bin/
└── deploy
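The layout above can be scaffolded with a few `mkdir` commands. A minimal sketch (it works in a temporary directory standing in for buildkite-monorepo-example/; directory names match the tree above):

```shell
# Sketch: scaffold the monorepo layout shown above.
REPO_DIR=$(mktemp -d)   # stand-in for buildkite-monorepo-example/
cd "$REPO_DIR"

# Root-level CI config and automation scripts
mkdir -p .buildkite/pipelines bin

# Per-service CI config and deploy scripts
for service in foo-service bar-service; do
  mkdir -p "$service/.buildkite" "$service/bin"
done

ls -d .buildkite/pipelines foo-service/.buildkite bar-service/bin
```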
Prerequisites
AWS Account
AWS account to deploy Buildkite agents and configure AWS CLI to talk to your account.
Buildkite Account
Buildkite account to create continuous integration pipelines and get your agent token.
GitHub Account
GitHub account to host the monorepo source code and configure webhooks.
Bash/CLI Access
Terminal access to run bash scripts for automated infrastructure setup.
Step 1: Set Up the Project
Create a new Git project and push it to GitHub. Run the following commands in the CLI.
mkdir buildkite-monorepo-example
cd buildkite-monorepo-example
git init
echo node_modules/ > .gitignore
git add .
git commit -m "initialize repository"
git remote add origin <your-repository-url>
git push origin master
Step 2: Set Up the Buildkite Infrastructure
Create the Secrets Bucket and SSH Keys
Create a bin directory with executable scripts. The create-secrets-bucket script creates an S3 bucket that stores SSH keys; Buildkite uses the private key to connect to the GitHub repo. The script also generates the SSH key pair and sets its permissions correctly.
#!/bin/bash
set -euo pipefail
CURRENT_DIR=$(pwd)
ROOT_DIR="$( dirname "${BASH_SOURCE[0]}" )"/..
BUCKET_NAME="buildkite-secrets-adikari"
KEY="id_rsa_buildkite"
cd "$ROOT_DIR"
echo "creating bucket $BUCKET_NAME.."
aws s3 mb s3://$BUCKET_NAME
# Generate SSH Key
ssh-keygen -t rsa -b 4096 -f $KEY -N ''
# Copy SSH Keys to S3 bucket
aws s3 cp --acl private --sse aws:kms $KEY "s3://$BUCKET_NAME/private_ssh_key"
aws s3 cp --acl private --sse aws:kms $KEY.pub "s3://$BUCKET_NAME/public_key.pub"
if [[ "$OSTYPE" == "darwin"* ]]; then
  pbcopy < "$KEY.pub"
  echo "public key contents copied to clipboard."
else
  cat "$KEY.pub"
fi
# Move SSH Keys to ~/.ssh directory
mv ./$KEY* ~/.ssh
chmod 600 ~/.ssh/$KEY
chmod 644 ~/.ssh/$KEY.pub
cd $CURRENT_DIR
The script copies the generated public and private keys to the ~/.ssh folder. These keys can be used later to SSH into the EC2 instance running the Buildkite agent for debugging. Navigate to GitHub Settings > SSH Keys, add a new SSH key, then paste in the contents of id_rsa_buildkite.pub.
Deploy AWS Elastic CI CloudFormation Stack
The folks at Buildkite have created the Elastic CI Stack for AWS, which creates a private, autoscaling Buildkite Agent cluster in AWS. Let us deploy the infrastructure to our AWS Account.
#!/bin/bash
set -euo pipefail
[ -z "${BUILDKITE_AGENT_TOKEN:-}" ] && { echo "BUILDKITE_AGENT_TOKEN is not set."; exit 1; }
CURRENT_DIR=$(pwd)
ROOT_DIR="$( dirname "${BASH_SOURCE[0]}" )"/..
PARAMETERS=$(cat ./bin/stack-config | envsubst)
cd $ROOT_DIR
echo "downloading elastic ci stack template.."
curl -s https://s3.amazonaws.com/buildkite-aws-stack/latest/aws-stack.yml -O
aws cloudformation deploy --capabilities CAPABILITY_NAMED_IAM --template-file ./aws-stack.yml --stack-name "buildkite-elastic-ci" --parameter-overrides $PARAMETERS
rm -f aws-stack.yml
cd $CURRENT_DIR
You can get the BUILDKITE_AGENT_TOKEN from the Agents tab in Buildkite's Console. Next, create a new file called bin/stack-config. Configuration in this file overrides the CloudFormation parameters.
BuildkiteAgentToken=$BUILDKITE_AGENT_TOKEN
SecretsBucket=buildkite-secrets-adikari
InstanceType=t2.micro
MinSize=0
MaxSize=3
ScaleUpAdjustment=2
ScaleDownAdjustment=-1
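For reference, this is roughly what the envsubst expansion in deploy-ci-stack produces from stack-config before the values are passed to --parameter-overrides. The token value below is illustrative, and sed stands in for envsubst so the sketch runs without the gettext tools:

```shell
# Sketch: how stack-config lines become --parameter-overrides values.
# "example-token" is illustrative; sed stands in for envsubst here.
export BUILDKITE_AGENT_TOKEN="example-token"

CONFIG='BuildkiteAgentToken=$BUILDKITE_AGENT_TOKEN
MinSize=0
MaxSize=3'

# Substitute the environment variable into the template, as envsubst would
PARAMETERS=$(printf '%s\n' "$CONFIG" | sed "s/\$BUILDKITE_AGENT_TOKEN/$BUILDKITE_AGENT_TOKEN/")
echo "$PARAMETERS"
```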
Run the script in the CLI to deploy the CloudFormation stack: ./bin/deploy-ci-stack. The script will take some time to finish. The CloudFormation stack creates an Autoscaling Group that Buildkite will use to spawn EC2 instances. The Buildkite Agents and the builds run inside those EC2 instances.
Step 3: Create Build Pipelines in Buildkite
At this point, we have the infrastructure ready that is required to run Buildkite. Next, we configure Buildkite and create some Pipelines. Create an API Access Token at https://buildkite.com/user/api-access-tokens and set the scope to write_builds, read_pipelines, and write_pipelines.
#!/bin/bash
set -euo pipefail
export SERVICE="."
export PIPELINE_TYPE=""
export REPOSITORY=git@github.com:adikari/buildkite-docker-example.git
CURRENT_DIR=$(pwd)
ROOT_DIR="$( dirname "${BASH_SOURCE[0]}" )"/..
STATUS_CHECK=false
BUILDKITE_ORG_SLUG=adikari
USAGE="USAGE: $(basename "$0") [-s|--service] service_name [-t|--type] pipeline_type
Eg: create-pipeline --type pull-request
create-pipeline --type merge --service foo-service
create-pipeline --type merge --status-checks
NOTE: BUILDKITE_API_TOKEN must be set in environment
ARGUMENTS:
-t | --type buildkite pipeline type (required)
-s | --service service name (optional, default: deploy root pipeline)
-r | --repository github repository url (optional)
-c | --status-checks enable github status checks (optional, default: false)
-h | --help show this help text"
[ -z "${BUILDKITE_API_TOKEN:-}" ] && { echo "BUILDKITE_API_TOKEN is not set."; exit 1; }
while [ $# -gt 0 ]; do
  if [[ $1 == -* ]]; then
    case $1 in
      --help|-h) echo "$USAGE"; exit ;;
      --service|-s) SERVICE=$2 ;;
      --type|-t) PIPELINE_TYPE=$2 ;;
      --repository|-r) REPOSITORY=$2 ;;
      --status-checks|-c) STATUS_CHECK=${2:-true} ;;
    esac
  fi
  shift
done
[ -z "$PIPELINE_TYPE" ] && { echo "$USAGE"; exit 1; }
export PIPELINE_NAME=$([ $SERVICE == "." ] && echo "" || echo "$SERVICE-")$PIPELINE_TYPE
BUILDKITE_CONFIG_FILE=.buildkite/pipelines/$PIPELINE_TYPE.json
[ ! -f "$BUILDKITE_CONFIG_FILE" ] && { echo "Invalid pipeline type: File not found $BUILDKITE_CONFIG_FILE"; exit; }
BUILDKITE_CONFIG=$(cat $BUILDKITE_CONFIG_FILE | envsubst)
if [ "$STATUS_CHECK" == "false" ]; then
  pipeline_settings='{ "provider_settings": { "trigger_mode": "none" } }'
  BUILDKITE_CONFIG=$( (echo "$BUILDKITE_CONFIG"; echo "$pipeline_settings") | jq -s add )
fi
cd $ROOT_DIR
echo "Creating $PIPELINE_TYPE pipeline.."
RESPONSE=$(curl -s -X POST "https://api.buildkite.com/v2/organizations/$BUILDKITE_ORG_SLUG/pipelines" -H "Authorization: Bearer $BUILDKITE_API_TOKEN" -d "$BUILDKITE_CONFIG")
[[ "$RESPONSE" == *errors* ]] && { echo $RESPONSE | jq; exit 1; }
echo $RESPONSE | jq
WEB_URL=$(echo $RESPONSE | jq -r '.web_url')
WEBHOOK_URL=$(echo $RESPONSE | jq -r '.provider.webhook_url')
echo "Pipeline url: $WEB_URL"
echo "Webhook url: $WEBHOOK_URL"
echo "$PIPELINE_NAME pipeline created."
cd $CURRENT_DIR
The script posts a pipeline configuration, defined as a JSON document, to the Buildkite REST API. Pipeline configurations live in the .buildkite/pipelines folder.
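The `jq -s add` call in the script is what disables webhook triggering when status checks are off: it layers the provider_settings override onto the base config. Here is that merge in isolation, using a dummy config:

```shell
# Sketch: merging provider_settings into a pipeline config with `jq -s add`.
# The config below is a dummy; the real script reads it from a JSON template.
BUILDKITE_CONFIG='{"name":"demo-pipeline","steps":[]}'
pipeline_settings='{ "provider_settings": { "trigger_mode": "none" } }'

# jq -s slurps both objects into an array; `add` merges them into one object
MERGED=$( (echo "$BUILDKITE_CONFIG"; echo "$pipeline_settings") | jq -s add )
echo "$MERGED" | jq -r '.provider_settings.trigger_mode'
```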
Pipeline Configuration Templates
.buildkite/pipelines/pull-request.json:
{
  "name": "$PIPELINE_NAME",
  "description": "Pipeline for $PIPELINE_NAME pull requests",
  "repository": "$REPOSITORY",
  "default_branch": "",
  "steps": [
    {
      "type": "script",
      "name": ":buildkite: $PIPELINE_TYPE",
      "command": "buildkite-agent pipeline upload $SERVICE/.buildkite/$PIPELINE_TYPE.yml"
    }
  ],
  "cancel_running_branch_builds": true,
  "skip_queued_branch_builds": true,
  "branch_configuration": "!master",
  "provider_settings": {
    "trigger_mode": "code",
    "publish_commit_status_per_step": true,
    "publish_blocked_as_pending": true,
    "pull_request_branch_filter_enabled": true,
    "pull_request_branch_filter_configuration": "!master",
    "separate_pull_request_statuses": true
  }
}
.buildkite/pipelines/merge.json:
{
  "name": "$PIPELINE_NAME",
  "description": "Pipeline for $PIPELINE_NAME merge",
  "repository": "$REPOSITORY",
  "default_branch": "master",
  "steps": [
    {
      "type": "script",
      "name": ":buildkite: $PIPELINE_TYPE",
      "command": "buildkite-agent pipeline upload $SERVICE/.buildkite/$PIPELINE_TYPE.yml"
    }
  ],
  "cancel_running_branch_builds": true,
  "skip_queued_branch_builds": true,
  "branch_configuration": "master",
  "provider_settings": {
    "trigger_mode": "code",
    "build_pull_requests": false,
    "publish_blocked_as_pending": true,
    "publish_commit_status_per_step": true
  }
}
.buildkite/pipelines/deploy.json:
{
  "name": "$PIPELINE_NAME",
  "description": "Pipeline for $PIPELINE_NAME deploy",
  "repository": "$REPOSITORY",
  "default_branch": "master",
  "steps": [
    {
      "type": "script",
      "name": ":buildkite: $PIPELINE_TYPE",
      "command": "buildkite-agent pipeline upload $SERVICE/.buildkite/$PIPELINE_TYPE.yml"
    }
  ],
  "provider_settings": {
    "trigger_mode": "none"
  }
}
Now, run the ./bin/create-pipeline command to create the root pipelines:
./bin/create-pipeline --type pull-request --status-checks
./bin/create-pipeline --type merge --status-checks
Copy the Webhook URL from the console output and create a webhook integration in GitHub. Navigate to the GitHub repository Settings > Webhooks and add a webhook. Select Just the push event, then add the webhook. Repeat this for both pipelines.
Next, add GitHub integration to allow Buildkite to send status updates to GitHub. You only need to set up this integration once per account. It is available at Settings > Integrations > GitHub in the Buildkite Console.
Now create the remaining service-specific pipelines:
# foo service pipelines
./bin/create-pipeline --type pull-request --service foo-service
./bin/create-pipeline --type merge --service foo-service
./bin/create-pipeline --type deploy --service foo-service
# bar service pipelines
./bin/create-pipeline --type pull-request --service bar-service
./bin/create-pipeline --type merge --service bar-service
./bin/create-pipeline --type deploy --service bar-service
Step 4: Set Up Buildkite Steps
Now that the pipelines are ready, let us configure the steps that run for each pipeline. Add the following script in .buildkite/diff. It lists every file changed between a given commit and its branch point on master; its output is used to dynamically trigger the respective pipelines.
#!/bin/bash
set -euo pipefail
[ $# -lt 1 ] && { echo "argument is missing."; exit 1; }
COMMIT=$1
BRANCH_POINT_COMMIT=$(git merge-base master $COMMIT)
echo "diff between $COMMIT and $BRANCH_POINT_COMMIT"
git --no-pager diff --name-only $COMMIT..$BRANCH_POINT_COMMIT
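To see what this computes, here is the same logic run end to end in a throwaway repository (paths, identities, and commit messages are illustrative):

```shell
# Sketch: the diff script's logic, demonstrated in a throwaway repo.
REPO=$(mktemp -d)
cd "$REPO"
git init -q
git checkout -q -b master
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "base"

# Branch off and change a file inside foo-service
git checkout -q -b feature
mkdir -p foo-service
echo "change" > foo-service/app.txt
git add .
git -c user.email=ci@example.com -c user.name=ci commit -q -m "change foo-service"

# Same computation as the diff script: branch point, then changed file names
BRANCH_POINT_COMMIT=$(git merge-base master HEAD)
CHANGED=$(git --no-pager diff --name-only HEAD "$BRANCH_POINT_COMMIT")
echo "$CHANGED"
```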
Create a new file .buildkite/pull-request.yml and add the following step configuration. We use the monorepo-diff Buildkite plugin to run the diff script and automatically trigger the respective pipelines.
steps:
  - label: "Triggering pull request pipeline"
    plugins:
      chronotc/monorepo-diff#v1.1.1:
        diff: ".buildkite/diff ${BUILDKITE_COMMIT}"
        wait: false
        watch:
          - path: "foo-service"
            config:
              trigger: "foo-service-pull-request"
          - path: "bar-service"
            config:
              trigger: "bar-service-pull-request"
Now create the configuration for the merge pipeline by adding the following content in .buildkite/merge.yml:
steps:
  - label: "Triggering merge pipeline"
    plugins:
      chronotc/monorepo-diff#v1.1.1:
        diff: "git diff --name-only HEAD~1"
        wait: false
        watch:
          - path: "foo-service"
            config:
              trigger: "foo-service-merge"
          - path: "bar-service"
            config:
              trigger: "bar-service-merge"
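Under the hood, the plugin matches each changed path against the watch list and triggers the configured pipeline for every match. A simplified stand-in for that matching logic (the changed-file list below is illustrative, and this is not the plugin's actual code):

```shell
# Sketch (not the real plugin): map changed paths to pipeline triggers.
CHANGED_FILES="foo-service/src/index.js
README.md"

# A path matches a watched service when it lives under that service's folder
TRIGGERS=$(for service in foo-service bar-service; do
  if printf '%s\n' "$CHANGED_FILES" | grep -q "^$service/"; then
    echo "trigger: ${service}-merge"
  fi
done)
echo "$TRIGGERS"
```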
Step 5: Configure Individual Service Pipelines
At this point, we have configured the topmost level pull-request and merge pipelines. Now we need to configure individual pipelines for each service.
Foo Service Pull Request Pipeline
Create foo-service/.buildkite/pull-request.yml with the following content. When the pull-request pipeline for the foo service runs, it executes the lint and test commands.
steps:
  - label: "Foo service pull request"
    command:
      - "echo linting"
      - "echo testing"
Foo Service Merge Pipeline
Set up a merge pipeline for the foo service by adding the following content in foo-service/.buildkite/merge.yml:
steps:
  - label: "Run sanity checks"
    command:
      - "echo linting"
      - "echo testing"
  - label: "Deploy to staging"
    trigger: "foo-service-deploy"
    build:
      env:
        STAGE: "staging"
  - wait
  - block: ":rocket: Release to Production"
  - label: "Deploy to production"
    trigger: "foo-service-deploy"
    build:
      env:
        STAGE: "production"
When the foo-service-merge pipeline runs, here is what happens:
Run Sanity Checks
The pipeline runs linting and testing commands to verify code quality before deployment.
Deploy to Staging
The foo-service-deploy pipeline is dynamically triggered with the STAGE=staging environment variable to deploy to the staging environment.
Manual Production Approval
Once deployment to staging is complete, the pipeline is blocked. The pipeline can be resumed by pressing the "Release to Production" button.
Deploy to Production
Unblocking the pipeline triggers the foo-service-deploy pipeline again, but this time with STAGE=production.
Foo Service Deploy Pipeline
Add configuration for the foo-service-deploy pipeline by creating foo-service/.buildkite/deploy.yml. The deploy configuration runs a bash script, passing along the STAGE variable received from the foo-service-merge pipeline.
steps:
  - label: "Deploying foo service to ${STAGE}"
    command: "./foo-service/bin/deploy ${STAGE}"
Now, create the deploy script foo-service/bin/deploy and add the following content:
#!/bin/bash
set -euo pipefail
STAGE=$1
echo "Deploying foo service to $STAGE"
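You can sanity-check the script locally before wiring it into Buildkite. The sketch below writes a copy of it to a temporary directory (the path is illustrative) and runs it; since the script only echoes, this is safe to run anywhere:

```shell
# Sketch: exercise the deploy script locally from a temp directory.
WORKDIR=$(mktemp -d)
cat > "$WORKDIR/deploy" <<'EOF'
#!/bin/bash
set -euo pipefail
STAGE=$1
echo "Deploying foo service to $STAGE"
EOF
chmod +x "$WORKDIR/deploy"

# Invoke it the same way deploy.yml does, passing the stage as the argument
OUT=$("$WORKDIR/deploy" staging)
echo "$OUT"
```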
Make the deploy script executable: chmod +x ./foo-service/bin/deploy. Repeat all the above steps to configure pipelines for bar-service.
Step 6: Test the Overall Workflow
We have configured Buildkite and GitHub and we have set up the appropriate infrastructure to run the builds. Next, test the entire workflow and see it in action.
Create Branch and Push Changes
Create a new branch and modify some file in foo-service. Push the changes to GitHub and create a Pull Request.
Verify PR Pipeline Trigger
Pushing changes to GitHub should trigger the pull-request pipeline in Buildkite, which then triggers the foo-service-pull-request pipeline. Buildkite should report the build status back to GitHub checks.
Merge and Deploy
Once all checks have passed in GitHub, merge the Pull Request. This triggers the merge pipeline. The changes in foo-service are detected, and the foo-service-merge pipeline is triggered. After foo-service-deploy runs against the staging environment, the pipeline blocks and waits for the manual production release.
Release to Production
Unblock the pipeline by manually clicking the "Release to Production" button to run deployment against production.
CI/CD Pipeline Architecture Summary
| Component | Purpose |
|---|---|
| Buildkite Elastic CI Stack | Autoscaling EC2 cluster that runs Buildkite agents and build steps |
| monorepo-diff Plugin | Runs git diff to detect changed folders and dynamically triggers appropriate pipelines |
| GitHub Webhooks | Triggers Buildkite pipelines on push events and pull request events |
| GitHub Status Checks | Reports build status back to GitHub for branch protection and merge gating |
| Manual Approval Block | Pauses pipeline before production deployment, requires human approval |
| S3 Secrets Bucket | Stores SSH keys for Buildkite agents to access GitHub and debug EC2 instances |
Improvement: Use Docker for Build Isolation
Consider using the Docker Compose Buildkite plugin (docker-compose-buildkite-plugin) to isolate the builds in Docker containers. This provides consistent build environments and prevents dependency conflicts between different services in your monorepo.
Frequently Asked Questions
What is a monorepo and why is CI/CD challenging for it?
A monorepo is a single Git repository holding all code and multiple projects. CI/CD is challenging because when any sub-project changes, we need to identify which specific projects changed and only build and deploy those, rather than rebuilding everything.
How does Buildkite detect which services changed?
Buildkite uses the monorepo-diff plugin which runs a git diff script comparing the current commit against the branch point on master. The output lists changed files, and the plugin matches file paths to configured service paths to trigger the correct pipelines.
What is the Buildkite Elastic CI Stack?
It is a CloudFormation template provided by Buildkite that creates a private, autoscaling Buildkite Agent cluster in AWS. It provisions an Auto Scaling Group that spawns EC2 instances to run your builds, scaling up and down based on build queue demand.
How does the manual production approval work?
The merge pipeline includes a "block" step that pauses execution after staging deployment. A human must manually click the "Release to Production" button in the Buildkite UI to resume the pipeline and trigger the production deployment.
Can I use this setup with other CI/CD platforms?
The concepts apply broadly. The git diff approach for detecting changed services works with any CI system. Buildkite's monorepo-diff plugin is specific to Buildkite, but similar plugins exist for GitHub Actions, GitLab CI, and Jenkins.
Need Help with CI/CD Infrastructure?
Our experts can help you design and implement automated CI/CD pipelines, monorepo architectures, and cloud infrastructure on AWS.
