Auto Deploying My Blog

I wanted to write on my blog without the process being intrusive. I really wanted things to be as easy as possible and require essentially no maintenance: all I had to do was write about things I was interested in or working on, then click a button or commit, and it would deploy.

I’ve been spending a fair amount of time doing DevOps things for the last year and have really been enjoying containerized architecture. Docker is a fun piece of tech that has made my personal and professional life MUCH easier in regards to deployment, uptime, scaling, etc.

I decided to put some of the tools I use on a daily basis to work making my blogging easier. One of those things was getting a straightforward blog setup created with automated deployment. I don’t need or want to maintain servers for something like WordPress, and I don’t want to have to secure anything.

I had been reading Werner Vogels’ blog All Things Distributed and stumbled across his article about moving his blog to S3 with Jekyll. I really liked the setup. Jekyll uses Markdown, templating, and a local web server to let you create, edit, and preview posts without actually running a server.

Here are the tools I used to achieve that.

Jekyll

Jekyll is a Ruby-based static site generator: you write Markdown, mix it with templates, and it generates a static HTML site. This is great for blogs, photo sites, and other things that don’t require a backend.
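To give a feel for it, a post is just a Markdown file with a YAML front matter block at the top; the layout name below is illustrative, not necessarily from my actual setup:

```markdown
---
layout: post
title: "Auto Deploying My Blog"
---

Regular **Markdown** from here down. Jekyll runs it through the
`post` layout template and writes plain HTML into `_site/`.
```

Running `bundle exec jekyll serve` builds the site and previews it locally at http://localhost:4000.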

AWS: S3

Free: 5 GB of standard storage, 20,000 Get Requests, and 2,000 Put Requests.

So with all those static HTML files generated, I needed a place to store them so they could be served as a website. I chose S3 because it’s incredibly cheap to host a website, deployment is as straightforward as can be, and it’s easy to put a CDN in front of it or something along those lines depending on your needs.
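Setting up the bucket itself is only a couple of CLI calls. This is a sketch: the bucket name is a placeholder (for S3 website hosting it has to match your domain), and the index/error document names are assumptions.

```shell
# Create the bucket (placeholder name).
aws s3 mb s3://blog.example.com

# Turn on static website hosting for the bucket.
aws s3 website s3://blog.example.com \
  --index-document index.html --error-document 404.html
```

You’d also need a bucket policy that allows public reads (or an origin access identity if only CloudFront should be able to reach it).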

AWS: CloudFront


Free: 50 GB Data Transfer Out, 2,000,000 HTTP and HTTPS Requests.

Since I’m serving a website whose content doesn’t change all that often, it’s nice to have a caching layer in front of it. A CDN (Content Delivery Network) lets files be retrieved from various locations as if they were hosted locally. For example, my blog lives in Oregon, but someone in Florida wants to read it. When that user loads my site, a replicated copy is served from a location much closer to Florida than Oregon is.

A bonus is that it has direct integration with AWS Certificate Manager.

Once this was completed, I just pointed the backing data source (the origin) at the S3 bucket containing my site files and added the CNAMEs I wanted the distribution to respond to.

AWS: Certificate Manager


Since we like HTTPS here, it makes sense that I would want to provide a certificate so that my site could be served securely. You can easily and quickly create a cert that is signed by AWS and link it directly to your CloudFront distribution to automatically enable HTTPS.
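Requesting the cert is one command. Note that certificates used with CloudFront must live in us-east-1 regardless of where anything else runs. The domain below is a placeholder, and DNS validation is an assumption (email validation also exists):

```shell
# Request a cert for the blog domain; you prove ownership by adding a DNS record.
aws acm request-certificate \
  --region us-east-1 \
  --domain-name blog.example.com \
  --validation-method DNS
```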

AWS: Route53


Nothing fancy here. Just DNS records. I pointed my DNS at the CloudFront distribution, and within a few minutes my site was being served from a bunch of edge locations with HTTPS enabled.
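Under the hood this is an alias A record pointing at the distribution. The domain and CloudFront hostname below are placeholders; Z2FDTNDATAQYW2 is the fixed hosted-zone ID AWS uses for all CloudFront aliases:

```json
{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "blog.example.com.",
      "Type": "A",
      "AliasTarget": {
        "HostedZoneId": "Z2FDTNDATAQYW2",
        "DNSName": "d1234abcdef8.cloudfront.net.",
        "EvaluateTargetHealth": false
      }
    }
  }]
}
```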

Auto deployment

Alright. Now that my blog is up with HTTPS and a CDN, it’s kind of annoying to generate the files and manually copy them to S3. Thankfully there is the AWS CLI, which makes life sooooo much easier.

If I were to manually script generating the static content of my site and deploying it to S3, it would only be a few commands, but I don’t want to have to run them. So how can I go about getting this working automatically?
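For reference, the manual version is roughly these two commands. The bucket name is a placeholder, and --delete is my own addition so that removed posts disappear from the bucket too:

```shell
# Generate the static site into _site/.
bundle exec jekyll build

# Mirror _site/ to the bucket, deleting remote files that no longer exist locally.
aws s3 sync _site/ s3://blog.example.com --delete
```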

Well, I am using GitLab for managing my repositories, and my blog is one of those repositories.

GitLab has the concept of “runners”: workers that execute scripts whenever a commit is received. I created a runner to handle my blog, and then added a simple YAML file describing the build step. Here’s the whole file. Since Jekyll uses Ruby, we tell the runner to pull down the ruby:2.3-alpine Docker image and install a few packages. We also define which stages are available and the order to run them. This pipeline only has one stage, though: deploy.

Here are the packages and why we need them:

  • alpine-sdk - build tools, so native Ruby gems can compile
  • python-dev - Python headers, so native Python packages can compile
  • py-pip - so we can install the AWS CLI with pip
  • python - so we can run the AWS CLI

Once those are installed, we tell the runner to only build the master branch. When a commit comes into master, it automatically runs everything in the “script” section.

What the “script” section does (in order):

  • Install all the ruby dependencies that Jekyll requires
  • Run Jekyll so that it generates the static site files
  • Use the aws cli to sync our files up to s3
  • Create a cache invalidation so that once the deploy finishes, CloudFront refreshes its cache from the backing store (S3)

And that’s it. The blog automatically deploys when I commit. In fact, that’s how I deployed this post :)

Hope you enjoyed!


```yaml
image: ruby:2.3-alpine

stages:
  - deploy

cache:
  key: "$CI_BUILD_REPO"
  paths:
    - vendor

before_script:
  - gem install bundler
  - apk -v --update add alpine-sdk python-dev python py-pip && pip install awscli

deploy:
  stage: deploy
  only:
    - master
  script:
    - bundle install --path=vendor/
    - bundle exec jekyll build
    - aws s3 sync _site/ s3://
    - aws cloudfront create-invalidation --distribution-id {redacted} --paths /
```
