End-to-End DevOps Pipeline Using AWS Services

The objectives of this blog post are to

  1. Demonstrate the construction of an end-to-end DevOps pipeline using the AWS CodePipeline service
  2. Demonstrate the triggering of a build and deployment without human intervention

Why AWS CodePipeline?

  1. It helps you leverage your existing investment in AWS services.
  2. It makes the DevOps pipeline highly scalable, as opposed to setting up Jenkins on EC2 instances and scaling them manually (or via autoscaling) as the number of build jobs grows.
  3. Security comes out of the box, versus hardening an open-source tool like Jenkins yourself.
  4. It comes with the inherent advantages of a serverless environment, compared to managing a Jenkins server and its underlying infrastructure.

We shall be using Atlassian Bitbucket as the source control system for this example.

Java is assumed to be the language in which all code is written, although the steps are almost identical for Python or any other language.

Prerequisites for creating the pipeline

  1. A Bitbucket repository with the artifacts described below.
  2. A default S3 bucket for CodePipeline (one bucket serves all pipelines in a region).
  3. An S3 bucket for artifacts.
  4. IAM roles with policies for stack creation.
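The artifact bucket from prerequisite 3 can be created in the console or via boto3. One subtlety worth a sketch: `create_bucket` must omit `CreateBucketConfiguration` for us-east-1 but requires it elsewhere. The bucket name below mirrors the one used later in the buildspec; the helper itself is hypothetical.

```python
def bucket_request(name: str, region: str) -> dict:
    """Build S3 create_bucket kwargs; us-east-1 is the one region
    that must omit CreateBucketConfiguration entirely."""
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# With boto3 installed and credentials configured, you would then call:
# boto3.client("s3", region_name=region).create_bucket(**bucket_request("module-mvp-artifacts", region))
```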

The bitbucket repo should have the following directory structure:

Projectfolder (folder)
            .git (folder)
            module (folder)
                        pom.xml (file)
                        env-buildspec.yml (where env is dev, qa, or prod) (file)
                        submodule (folder)
                                    pom.xml (file)
                                    env-submodule-cf.template (CloudFormation template) (file)
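Before wiring a repository into the pipeline, it can help to confirm the checkout matches this layout. Below is a small hypothetical helper (not part of the original pipeline), shown for the dev environment; adjust the expected paths for your own modules.

```python
from pathlib import Path

# Expected files, relative to the project folder, for env = dev.
EXPECTED = [
    "module/pom.xml",
    "module/dev-buildspec.yml",
    "module/submodule/pom.xml",
    "module/submodule/dev-submodule-cf.template",
]

def missing_paths(root: str, expected=EXPECTED) -> list:
    """Return the expected files that are absent under root."""
    base = Path(root)
    return [p for p in expected if not (base / p).is_file()]
```

Running `missing_paths(".")` from the project folder returns an empty list when the layout is complete.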

Sample env-buildspec.yml and env-submodule-cf.template files are provided below. For the dev environment, let's call them dev-buildspec.yml and dev-submodule-cf.template.

version: 0.2
phases:
  install:
    runtime-versions:
       java: openjdk8
  build:
    commands:
      - CurrEPOCHTime=$(date +%F_%T) 
      - cd $CODEBUILD_SRC_DIR
      - ls
      - cd $CODEBUILD_SRC_DIR/module/
      - ls
      - echo Build started on `date`
      - mvn clean install
      #- echo override parameter for submodule
      - cd $CODEBUILD_SRC_DIR/module/submodule/target/
      - mv submodule-1.0.0-SNAPSHOT.jar submodule-1.0.0-SNAPSHOT-$CurrEPOCHTime.jar
      - configData='{"Parameters":{"CodeBucketName":"module-mvp-artifacts","CodeObjectName":"submodule-1.0.0-SNAPSHOT-'$CurrEPOCHTime'.jar"}}'
      - echo $configData
      - echo $configData > device-parameter_config.json
      

 
reports:
  SurefireReports: # CodeBuild will create a report group called "SurefireReports"
    files: # store all of the Surefire report files
      - 'submodule/target/surefire-reports/*'
    base-directory: '$CODEBUILD_SRC_DIR/module/'
artifacts:
  files:
      - 'submodule/target/submodule*'
      - 'submodule/env-submodule-cf*'
  base-directory: '$CODEBUILD_SRC_DIR/module/'
  discard-paths: yes
  name: $(date +%Y-%m-%d)
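The build commands above do two things worth noting: they timestamp the jar so every build produces a unique S3 object key, and they write a parameter-override file that the deploy stage feeds to CloudFormation. A sketch of that logic in Python (the bucket name mirrors the buildspec; adjust for your project):

```python
import json
from datetime import datetime

def override_config(bucket: str, stamp: str) -> str:
    """Return the JSON written to device-parameter_config.json:
    CloudFormation parameter overrides pointing at the timestamped jar."""
    jar = f"submodule-1.0.0-SNAPSHOT-{stamp}.jar"
    return json.dumps({"Parameters": {"CodeBucketName": bucket,
                                      "CodeObjectName": jar}})

# Same format as the buildspec's $(date +%F_%T)
stamp = datetime.now().strftime("%Y-%m-%d_%H:%M:%S")
print(override_config("module-mvp-artifacts", stamp))
```

Because the object key changes on every build, CloudFormation sees a changed parameter and redeploys the Lambda code even when the template itself is unchanged.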

Below is the CloudFormation template for the submodule. (This is just a sample, and yours could be totally different; the intent is to showcase the use of variables and other configuration parameters referenced in buildspec.yml.) Note that the ACCESS_KEY_ID and SECRET_ACCESS_KEY values below are placeholders; in practice, prefer IAM roles over hardcoding credentials in a template.

{
	"AWSTemplateFormatVersion": "2010-09-09",
	"Transform": [
		"AWS::Serverless-2016-10-31"
	],
	"Description": "Submodule Template. Stack-Name : Env-Module-Submodule",
	"Globals": {
		"Function": {
			"Runtime": "java8",
			"CodeUri": { "Bucket": {"Ref":"CodeBucketName"}, "Key": {"Ref":"CodeObjectName"} },
            "Timeout": 300,
			"Environment": {
				"Variables": {
					"REGION": {
						"Ref": "AWS::Region"
					},
					"ACCESS_KEY_ID": "XXXXXXXXXXXXXXXXXXXX",
					"SECRET_ACCESS_KEY": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
					"ACCOUNT_ID": {
						"Ref": "AWS::AccountId"
					},
					"CLIENT_END_POINT": "xxxxxxxxxxxxx.amazonaws.com",
					"CORS_URL" : "*",
                    "STCT_TR_SE_VAL" : "max-age=31536000,includeSubdomains,preload"
				}
			},
			"Tags": {
				"Application": {
					"Ref": "Application"
				},
				"Environment": {
					"Ref": "Environment"
				},
				"Owner": {
					"Ref": "Owner"
				},
				"SubDivision": {
					"Ref": "SubDivision"
				}
			}
		},
		"Api": {
			"Auth": {
				"ResourcePolicy":{
                    "CustomStatements": [{
                      	"Effect": "Allow",
                      	"Principal": "*",
                      	"Action": "execute-api:Invoke",
                      	"Resource": "execute-api:/*"
                    }]
                },
				"DefaultAuthorizer": "LambdaCustomAuthorizer",
				"Authorizers": {
					"LambdaCustomAuthorizer": {
						"FunctionArn": {
							"Fn::Sub": "arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:sciex_customized_auth_lambda"
						},
						"Identity": {
							"Header": "AccessToken",
							"ReauthorizeEvery": 0
						}
					}
				},
				"AddDefaultAuthorizerToCorsPreflight": false
			}
		}
	},
	"Parameters": {
		"CodeBucketName" : {
			"Description": "Code Bucket Name",
			"Type": "String",
			"Default": "NA"
		},
		"CodeObjectName" : {
			"Description": "Code Object Name",
			"Type": "String",
			"Default": "NA"
		},
		"Application": {
			"Description": "Application Name",
			"Type": "String",
			"Default": "StatusScope",
			"AllowedPattern": "^[a-zA-Z0-9 ]*$",
			"ConstraintDescription": "Malformed input-Parameter MyParameter must only contain uppercase and lowercase letters and numbers"
		},
		"Environment": {
			"Description": "Environment information",
			"Type": "String",
			"Default": "Dev",
			"AllowedValues": [
				"Dev",
				"Prod",
				"Stag",
				"Test"
			]
		},
		"Owner": {
			"Description": "Owner Name",
			"Type": "String",
			"Default": "Raghu",
			"AllowedPattern": "^[a-zA-Z0-9 ]*$",
			"ConstraintDescription": "Malformed input-Parameter MyParameter must only contain uppercase and lowercase letters and numbers"
		},
		"SubDivision": {
			"Description": "SubDivision Name",
			"Type": "String",
			"AllowedPattern": "^[a-zA-Z0-9 ]*$",
			"Default": "Device Provisioning Module",
			"ConstraintDescription": "Malformed input-Parameter MyParameter must only contain uppercase and lowercase letters and numbers"
		},
		"AWSCertificates" : {
            "Type" : "String",
            "Default" : "arn:aws:acm:eu-east-1:9999999999999999:certificate/xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx"
		},
		"ModuleURL" : {
            "Type" : "String",
            "Default" : "env.module.submodule.sitedomain.com"
		},
		"Route53HostedZoneId" : {
            "Type" : "String",
            "Default" : "XXXXXXXXXXXXXXXXXXXXXX"
        }
	},
	"Resources": {
		"GetFeatureDetails": {
			"Type": "AWS::Serverless::Function",
			"Properties": {
				"Role": null,
				"MemorySize": 512,
				"Description": null,
				"Policies": [
					"AmazonDynamoDBFullAccess",
					"AmazonS3FullAccess"
				],
				"Events": {
					"PostResource": {
						"Type": "Api",
						"Properties": {
							"Path": "/v2/submodule/feature/{paramNo}",
							"Method": "get",
							"RestApiId": {
								"Ref": "SubmoduleAPIGateway"
							},
							"Auth": {
								"Authorizer": "SubmoduleAuthorizer"
							}
						}
					}
				},
				"Timeout": 300,
				"Handler": "com.domain.submodule.function.GetFeatureDetails",
				"Environment": {
					"Variables": {
						"FEATURE_TABLE_NAME": "feature_information"
					}
				}
			}
		},
		"SubmoduleTopicRulePermission": {
			"Type": "AWS::Lambda::Permission",
			"Properties": {
				"Action": "lambda:InvokeFunction",
				"FunctionName": {
					"Fn::Sub": "${FeatureStatusUpdateAction.Arn}"
				},
				"Principal": "iot.amazonaws.com",
				"SourceAccount": {
					"Ref": "AWS::AccountId"
				},
				"SourceArn": {
					"Fn::Sub": "${FeatureStatusUpdateTopicRule.Arn}"
				}
			}
		},
		"SubmoduleAPIGateway": {
			"Type": "AWS::Serverless::Api",
			"Properties": {
				"StageName": {
					"Ref": "Environment"
				},
				"Cors": {
					"AllowMethods": "'POST,GET,DELETE,PUT,OPTIONS'",
					"AllowHeaders": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
					"AllowOrigin": "'*'"
				},
				"GatewayResponses": {
					"BAD_REQUEST_BODY": {
						"StatusCode": "400",
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"WAF_FILTERED": {
						"StatusCode": "403",
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"EXPIRED_TOKEN": {
						"StatusCode": "503",
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"AUTHORIZER_CONFIGURATION_ERROR": {
						"StatusCode": "500",
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"UNAUTHORIZED": {
						"StatusCode": "401",
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"DEFAULT_5XX": {
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					},
					"DEFAULT_4XX": {
						"ResponseParameters": {
							"Headers": {
								"Access-Control-Allow-Methods": "'POST,GET,DELETE,PUT,OPTIONS'",
								"X-Requested-With": "'*'",
								"Access-Control-Allow-Headers": "'Content-Type,X-Amz-Date,Authorization,X-Api-Key,x-requested-with,AccessToken'",
								"Access-Control-Allow-Origin": "'*'"
							}
						}
					}
				},
				"Tags": {
					"Application": {
						"Ref": "Application"
					},
					"Environment": {
						"Ref": "Environment"
					},
					"Owner": {
						"Ref": "Owner"
					},
					"SubDivision": {
						"Ref": "SubDivision"
					}
				},
				"Domain" : {
					"DomainName": { "Ref" : "ModuleURL" },
					"CertificateArn": { "Ref" : "AWSCertificates" },
					"EndpointConfiguration": "EDGE",
					"Route53": {
						"HostedZoneId": { "Ref" : "Route53HostedZoneId" }
					},
					"BasePath": ["/"]
				}
			}
		}
	}
}
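The template declares CodeBucketName and CodeObjectName, which the buildspec overrides at build time. A mismatch between the override file and the declared parameters fails the CloudFormation deploy action, so a quick consistency check can save a pipeline run. This is a hypothetical helper, not part of the pipeline itself:

```python
def unknown_overrides(template: dict, overrides: dict) -> set:
    """Return override keys the template does not declare under Parameters."""
    declared = set(template.get("Parameters", {}))
    return set(overrides.get("Parameters", {})) - declared

# Minimal stand-ins for the files above:
template = {"Parameters": {"CodeBucketName": {}, "CodeObjectName": {}}}
overrides = {"Parameters": {"CodeBucketName": "module-mvp-artifacts",
                            "CodeObjectName": "submodule-1.0.0-SNAPSHOT-x.jar"}}
```

In practice you would `json.load` the real dev-submodule-cf.template and device-parameter_config.json and assert `unknown_overrides(...)` is empty.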

Step 1 : Login to AWS Console

Select the CodePipeline service from Developer Tools.

Click on Create pipeline.

Add a pipeline name and click Next.

Step 2 : Bitbucket Connectivity

Select the source code provider (Bitbucket).

Click on Connect to Bitbucket Cloud.

  • A popup window appears to connect to Bitbucket; provide any connection name, say ModuleAlphaConnection.
  • For a first-time connection, click on Install a new App; this takes you to the Bitbucket login page. Otherwise, click on the search bar to view previous connections.
  • On the app installation page, a message shows that the AWS CodeStar app is trying to connect to your Bitbucket account. Choose Grant access.
  • The connection ID for your new installation is displayed. Choose Complete connection.
  • In Repository name, choose the name of your third-party repository. In Branch name, choose the branch where you want your pipeline to detect source changes.
  • In Output artifact format, choose the format for your artifacts. To store output artifacts from the Bitbucket action using the default method, choose CodePipeline default. The action accesses the files from the Bitbucket repository and stores the artifacts in a ZIP file in the pipeline artifact store.
  • Once logged in, select the repository and branch whose source code should be built, and click Next.

Step 3 : Add Build Stage

The next step is to add the Build stage: select CodeBuild and click Next.

If a CodeBuild project is already present, click on the search field to select it; otherwise click on Create project.

A popup will appear; enter a project name to create the build project.

Step 4 : Build Stage Configuration

Select the operating system Ubuntu.

Once you select Ubuntu, follow the screenshot below to select the image and image version.

If a buildspec.yml is present in the repository, select Use a buildspec file and optionally give the path to the buildspec file; otherwise select Insert build commands.

Next, click on Continue to CodePipeline.

Once the CodeBuild project is created, a success message is shown; then click Next.

Click on Skip deploy stage (we will add the deploy stage later).

Step 5 : Configuration Overview

Review all the configuration, then click on Create pipeline.

Once you create it, the pipeline will trigger automatically and should look like the screenshot below.

Add Artifacts

  • Before adding an S3 bucket for artifacts, create the bucket in S3.
  • In the pipeline, click on Add stage after the Build stage.

Enter the stage name as Artifacts and click on Add stage.

Click on Add action group to add the S3 bucket for artifacts.

Select the input artifacts as BuildArtifacts.

Create IAM Role for Deployment

Attach the following policies to the role:

AlphaSQSandIot
{
    "Version": "2012-10-17",
    "Statement": [ {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "iot:ReplaceTopicRule",
                "iot:DeleteTopicRule",
                "iot:DisableTopicRule",
                "iot:GetTopicRule",
                "iot:EnableTopicRule",
                "sqs:*",
                "iot:CreateTopicRule"
            ],
            "Resource": "*"
        } ]
}
AlphaRoute53Access
{
    "Version": "2012-10-17",
    "Statement": [ {
            "Sid": "AllowPublicHostedZonePermissions",
            "Effect": "Allow",
            "Action": [
                "route53:GetHostedZone",
                "route53:ListHostedZones",
                "route53:ChangeResourceRecordSets",
                "route53:ListResourceRecordSets",
                "route53:GetChange"
            ],
            "Resource": "*"
        }  ]
}
AlphaApiGatewayAndOthersAccess
{
    "Version": "2012-10-17",
    "Statement": [ {
            "Effect": "Allow",
            "Action": [
                "acm:ListCertificates",
                "cloudfront:*"
            ],
            "Resource": "*"
        }  ]
}
Lambda@EdgePolicy
{
    "Version": "2012-10-17",
    "Statement": [ {
            "Effect": "Allow",
            "Action": "iam:CreateServiceLinkedRole",
            "Resource": "arn:aws:iam::*:role/aws-service-role/*"
        } ]
}
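The policies above grant the permissions the deployment role needs, but the role must also trust CloudFormation so the deploy action can assume it. A sketch of the standard trust (assume-role) policy; this is the conventional shape, not taken from the original post:

```python
import json

def trust_policy(service: str = "cloudformation.amazonaws.com") -> str:
    """Return an assume-role policy document trusting the given service."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": service},
            "Action": "sts:AssumeRole",
        }],
    }, indent=4)

# With boto3 and credentials configured, this document would be passed as
# AssumeRolePolicyDocument to iam_client.create_role(...).
```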

Step 6 : Add a Deploy Section for Each Submodule

Deploy Stage

Click on Action.

  • Save the pipeline, then click on Edit pipeline and add the deployment stage.
  • Click on Add action group.

Note: Repeat the above deploy configuration for all submodules.

Stacks Created

Step 7 : Pipeline in Action

Now go ahead, make source code changes in Bitbucket, and check them into the branch that has been configured in the pipeline:

git add mysource.java
git commit -m "file modified to trigger pipeline"
git push origin branch_name

As soon as the code is checked in, the pipeline is triggered, and its progress can be monitored through the AWS console.

The pipeline can also be triggered manually by clicking on the Release change button.

You are now all set to use Postman or another REST client to test the REST API affected by the code change above.
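Progress can also be checked programmatically. The shape below mirrors the response of boto3's `codepipeline.get_pipeline_state()`; the sample dict stands in for a live response, since no AWS call is made here:

```python
def stage_summary(state: dict) -> dict:
    """Map each stage name to its latest execution status."""
    return {s["stageName"]: s.get("latestExecution", {}).get("status", "Unknown")
            for s in state.get("stageStates", [])}

# Stand-in for boto3.client("codepipeline").get_pipeline_state(name="...")
sample = {"stageStates": [
    {"stageName": "Source", "latestExecution": {"status": "Succeeded"}},
    {"stageName": "Build",  "latestExecution": {"status": "InProgress"}},
]}
```

Polling this in a loop gives a quick command-line view of the same progress the console shows.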
