Introduction

If you have a static site hosted on S3, you can easily set up automatic deployment using AWS CodePipeline.

For this article, we will use Hugo as our static site generator, GitHub for source control, and S3 to host the site, with CloudFront in front to speed up distribution.

AWS has an example article on how to use all of the above to set up a static site, so we are not going to cover setting up the static site itself in this blog post.

Setting up your source

We will follow steps similar to this AWS article, but with GitHub as our source control instead.

Let’s keep the deployment process simple: we will deploy the latest commit on the main branch.

  1. Open the CodePipeline console.
  2. Click on “Create pipeline”.
  3. Enter a name for your pipeline and click Next.
  4. In the source panel, select GitHub (Version 2).
  5. AWS needs authorization to access your GitHub account; click on Connect to GitHub and the console will guide you through creating the connection.
  6. Select the repository you created for your static site.
  7. Select the branch you’d like to designate as the deployment target. The pipeline will run every time you push changes to this branch.

Now the pipeline is set up to run automatically every time you push to your GitHub repository.
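If you’d like to double-check the connection from the command line, the AWS CLI can list the connections created by the console flow. This is just a sanity check and assumes you have the AWS CLI installed and configured:

aws codestar-connections list-connections

A connection that reports a status of AVAILABLE is ready to be used by the pipeline.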

Setting up your build and deploy stages

Let’s set up the build stage to build your static site. Again, AWS has an excellent article to guide you through this.

That article walks you through setting up a Hugo build and deploying the output to S3. Once you are done with this step, save your pipeline and run an end-to-end test. Go to your S3 bucket and validate that the right files have been updated.
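Whatever your exact buildspec looks like, the build phase for a Hugo site typically boils down to two commands like the ones below. This is a minimal sketch, assuming Hugo’s default public/ output directory; the bucket name is a placeholder for your own bucket.

# build the site into public/
hugo

# sync the generated files to the bucket that backs your site
aws s3 sync public/ s3://<your-bucket-name> --delete

The --delete flag removes objects from the bucket that the build no longer produces, keeping the bucket in sync with the site.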

Invalidating CloudFront

If you are using CloudFront in front of your static site, chances are your changes are not picked up right away when you run your build. This is because, depending on how you set it up, your static site is cached by CloudFront for up to 24 hours. Check out Invalidating Files for more details on CloudFront invalidation.
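To see what an invalidation does, you can issue one by hand with the AWS CLI; the distribution ID below is a placeholder for your own:

# invalidate every cached path on the distribution
aws cloudfront create-invalidation --distribution-id <your-distribution-id> --paths "/*"

The rest of this section automates exactly this call so that it runs as part of the pipeline.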

In this step, we will add another stage to our pipeline that invalidates CloudFront. Before we wire it into CodePipeline, we will create a Lambda function that performs the invalidation.

Go to the AWS Lambda console and create a new Lambda function. You may choose any language you are comfortable with, but the following is a code example in Go.

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-lambda-go/lambdacontext"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/cloudfront"
	"github.com/aws/aws-sdk-go/service/codepipeline"
)

func main() {
	lambda.Start(invalidateCloudFront)
}

// invalidateCloudFront invalidates every path on the configured CloudFront
// distribution and reports the result back to CodePipeline.
func invalidateCloudFront(ctx context.Context, event events.CodePipelineEvent) {
	dist := os.Getenv("CLOUD_FRONT_DISTRIBUTION_ID")
	region := os.Getenv("CLOUD_FRONT_DISTRIBUTION_REGION")
	jobID := event.CodePipelineJob.ID

	if len(jobID) == 0 {
		panic("Job id is not set")
	}

	fmt.Println("Processing job id: ", jobID)

	if len(dist) == 0 {
		panic("No distribution has been found")
	}

	if len(region) == 0 {
		panic("Distribution region has not been found")
	}

	lc, ok := lambdacontext.FromContext(ctx)
	if !ok || len(lc.AwsRequestID) == 0 {
		panic("AwsRequestID is not set")
	}

	sess := session.New(&aws.Config{
		Region: aws.String(region),
	})

	svc := cloudfront.New(sess)
	cpl := codepipeline.New(sess)

	cfInvalidation := &cloudfront.CreateInvalidationInput{
		DistributionId: aws.String(dist), // Required
		InvalidationBatch: &cloudfront.InvalidationBatch{ // Required
			// used to dedup requests.
			CallerReference: aws.String(lc.AwsRequestID), // Required
			Paths: &cloudfront.Paths{ // Required
				Quantity: aws.Int64(1), // Required
				Items: []*string{
					// invalidate everything
					aws.String("/*"),
				},
			},
		},
	}

	_, err := svc.CreateInvalidation(cfInvalidation)
	if err != nil {
		fmt.Println(err.Error())

		// mark the job as failed.
		cplFailure := &codepipeline.PutJobFailureResultInput{
			JobId: aws.String(jobID),
			FailureDetails: &codepipeline.FailureDetails{
				Message:             aws.String(err.Error()),
				Type:                aws.String("JobFailed"),
				ExternalExecutionId: aws.String(lc.AwsRequestID),
			},
		}
		cpl.PutJobFailureResult(cplFailure)
		return
	}

	// mark the job as success.
	cplSuccess := &codepipeline.PutJobSuccessResultInput{
		JobId: aws.String(jobID),
	}

	_, err = cpl.PutJobSuccessResult(cplSuccess)
	if err != nil {
		fmt.Println("Error when updating code pipeline job status.")
		fmt.Println(err.Error())
	}
}

Notice that the handler above expects two inputs: ctx context.Context and event events.CodePipelineEvent. The first is the invocation context set by Lambda; the second is the CodePipeline-specific event that is passed in when the function is invoked from a pipeline.

The above Lambda function does three things:

  1. Invalidates all files in the CloudFront distribution.
  2. Marks the CodePipeline job as failed if the invalidation fails.
  3. Marks the CodePipeline job as succeeded if the invalidation succeeds.

If you use the above code, select Go 1.x as the runtime when creating your Lambda function. As of this writing, Go runtimes do not support editing code in the console, so you have to build your code locally and upload a zip.

If you don’t already have Go installed, follow the steps at Golang home to install it. Save the above code as main.go and use the following command to build it.

GOARCH=amd64 GOOS=linux go build main.go

Compress the output binary into main.zip and upload it to Lambda. Set both the CLOUD_FRONT_DISTRIBUTION_ID and CLOUD_FRONT_DISTRIBUTION_REGION environment variables. Test your Lambda function before you plug it into CodePipeline.
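If you prefer the CLI to the console for packaging and uploading, the steps might look roughly like this; the function name, distribution ID, and region are placeholders:

# package the compiled binary
zip main.zip main

# upload the new code to the function
aws lambda update-function-code --function-name <your-function-name> --zip-file fileb://main.zip

# set the environment variables the handler reads
aws lambda update-function-configuration --function-name <your-function-name> --environment "Variables={CLOUD_FRONT_DISTRIBUTION_ID=<your-distribution-id>,CLOUD_FRONT_DISTRIBUTION_REGION=<your-region>}"

Note that with the Go 1.x runtime the function’s Handler setting must match the binary name (main here). Also make sure the function’s execution role allows cloudfront:CreateInvalidation as well as codepipeline:PutJobSuccessResult and codepipeline:PutJobFailureResult; otherwise the invalidation or the job status update will fail.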

Go to your pipeline in the CodePipeline console and click on Edit.

Add an additional stage after your deploy stage, then add an action to it. Choose an action name, set the action provider to AWS Lambda, choose the function you created above, and save the stage.
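Once the stage is saved, you can kick off a full run from the CLI (or simply push a commit) to verify the new invalidation action end to end; the pipeline name is a placeholder:

aws codepipeline start-pipeline-execution --name <your-pipeline-name>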

There is also an AWS article on invoking Lambda from CodePipeline for further reading.