Deployment of AWS resources via Jenkins

Hi Team,
Currently in my project we already have AWS resources such as EventBridge, Step Functions, Lambda, and Glue jobs created. Now we are trying to redeploy them via Jenkins.
DEV and QA share one AWS account, and pre-prod and prod share another AWS account.
In the current environment:

  1. EventBridge rules and schedules trigger Step Functions, which in turn trigger Lambda functions and Glue jobs
  2. EventBridge rules and schedules pass payload information (the bucket details for the particular environment) to Step Functions, which is used internally by the Lambda functions and Glue jobs

Right now we are supposed to create a Jenkins pipeline for this.
How do I start with it? Do we need individual pipelines for each resource, or can we have one single pipeline?
How do we parameterize the solution, and how do we pass the details to the Glue or Lambda definitions?
Kindly help me with a solution as soon as possible.

Thank you in advance

Hi @v.susmitha68

How are you currently deploying and managing the resources?

If you’re using Terraform to deploy AWS resources, you’ll need a Terraform pipeline. This pipeline should run terraform plan and terraform apply to deploy resources across different environments using .tfvars files.
You might also need a separate pipeline to deploy Lambda code, and probably another one for deploying Glue jobs.
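For example, here is a minimal sketch of the Terraform part of that pipeline, assuming the Terraform code sits at the repo root with per-environment files under env/ (that layout and the env/dev.tfvars path are assumptions, not from your post):

```groovy
// Minimal sketch of the Terraform pipeline: init, plan against an
// environment-specific .tfvars file, then apply the saved plan.
pipeline {
    agent any

    stages {
        stage('Init') {
            steps {
                sh 'terraform init -input=false'
            }
        }
        stage('Plan') {
            steps {
                // env/dev.tfvars is a placeholder; each environment gets its own file
                sh 'terraform plan -input=false -var-file=env/dev.tfvars -out=tfplan'
            }
        }
        stage('Apply') {
            steps {
                sh 'terraform apply -input=false tfplan'
            }
        }
    }
}
```

The separate Lambda and Glue job pipelines can follow the same shape, with build/package and upload steps in place of the Terraform stages.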

I’d add that you should have two separate Jenkins instances, one for the non-prod account and one for the prod account, so that only people with production authorization can access the prod Jenkins.

Both Jenkins instances should be able to see the project’s Git repo. Adding to what Ray says, here is what I would do, based on the assumption that you’re using Terraform for this (you should be):

  • One Jenkinsfile with conditional logic for non-prod/prod, which may set variables such as proxy addresses based on the target account.
  • A .tfvars file for each environment.
  • A parameter that lets you select the environment name (dev/QA, etc.).
    Again, your account-specific conditional logic in the Jenkinsfile can control which values are offered here. The selected environment would then be used in the Jenkinsfile to pick the correct .tfvars file.
  • Parameters for anything else that can’t be static in the .tfvars files, which you’d pass to Terraform with -var arguments.
  • A parameter for the Terraform action to perform, e.g. plan only, apply, etc.
  • Steps in the Jenkinsfile for init, then plan writing out a tfplan file (you may make the pipeline interactive here so the plan can be inspected first), then apply using the tfplan file generated in the previous step. A sketch of such a Jenkinsfile follows this list.
  • The Terraform backend absolutely must be S3, using a bucket in the appropriate AWS account for the deployment.
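
Pulling those points together, a rough sketch of that Jenkinsfile might look like this. The parameter names, the env/<name>.tfvars layout, the backend bucket names and the bucket_prefix variable are all assumptions you’d adapt to your repo:

```groovy
// Sketch of a parameterized Terraform Jenkinsfile. Bucket names, tfvars
// layout and the bucket_prefix variable are illustrative assumptions.
pipeline {
    agent any

    parameters {
        choice(name: 'ENVIRONMENT', choices: ['dev', 'qa', 'preprod', 'prod'], description: 'Target environment')
        choice(name: 'TF_ACTION', choices: ['Plan Only', 'Apply'], description: 'Terraform action to perform')
        string(name: 'BUCKET_PREFIX', defaultValue: '', description: 'Example of a value that cannot be static in tfvars')
    }

    environment {
        TF_IN_AUTOMATION = 'true'
    }

    stages {
        stage('Init') {
            steps {
                script {
                    // Account-specific conditional logic: dev/QA share the
                    // non-prod account, preprod/prod share the prod account.
                    env.BACKEND_BUCKET = (params.ENVIRONMENT in ['dev', 'qa']) ? 'my-nonprod-tf-state' : 'my-prod-tf-state'
                }
                sh 'terraform init -input=false -backend-config="bucket=${BACKEND_BUCKET}"'
            }
        }
        stage('Plan') {
            steps {
                // The environment parameter picks the matching tfvars file;
                // anything that can't be static is passed with -var.
                sh '''
                    terraform plan -input=false \
                        -var-file=env/${ENVIRONMENT}.tfvars \
                        -var "bucket_prefix=${BUCKET_PREFIX}" \
                        -out=tfplan
                '''
            }
        }
        stage('Approve') {
            when { expression { params.TF_ACTION == 'Apply' } }
            steps {
                input message: 'Review the plan output above. Apply it?'
            }
        }
        stage('Apply') {
            when { expression { params.TF_ACTION == 'Apply' } }
            steps {
                sh 'terraform apply -input=false tfplan'
            }
        }
    }
}
```

Applying the saved tfplan file means that exactly what was reviewed in the plan stage is what gets applied.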

If you’re using CloudFormation, it’s not dissimilar. Your Jenkins runner needs the AWS CLI installed, your template should be in S3 somewhere, and you’ll have to pass the template parameters (which should also be in source control) to the aws cloudformation command.
Sensitive values should be kept in Secrets Manager (or HashiCorp Vault if you use that) and retrieved by code in the Jenkinsfile to pass to CloudFormation, or via data sources in the Terraform configuration if you’re using Terraform.
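
A sketch of that CloudFormation variant, assuming a stack name, bucket names, parameter names and a secret id that are purely illustrative:

```groovy
// Sketch of a CloudFormation deployment stage. Stack name, bucket names,
// parameter names and the secret id are illustrative assumptions.
pipeline {
    agent any

    stages {
        stage('Deploy stack') {
            steps {
                sh '''
                    # Fetch a sensitive value from Secrets Manager at deploy time
                    DB_PASSWORD=$(aws secretsmanager get-secret-value \
                        --secret-id my-app/db-password \
                        --query SecretString --output text)

                    # "deploy" uploads the template to the given S3 bucket and then
                    # creates or updates the stack via a change set.
                    aws cloudformation deploy \
                        --stack-name my-etl-pipeline-dev \
                        --template-file template.yaml \
                        --s3-bucket my-cfn-templates-nonprod \
                        --parameter-overrides \
                            Environment=dev \
                            DataBucketName=my-dev-data-bucket \
                            DbPassword="$DB_PASSWORD" \
                        --capabilities CAPABILITY_NAMED_IAM
                '''
            }
        }
    }
}
```

Fetching the secret inside the sh step keeps it out of the Jenkinsfile and out of source control; the non-sensitive overrides here stand in for the parameter values you’d keep in source control alongside the template.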
