CI/CD Using Gitlab CI And Amazon AWS S3

Mohammad Ranjbar Z
3 min read · Dec 28, 2019


After building a pipeline for GitHub CI/CD and Amazon AWS S3 deployment, I made a similar pipeline for GitLab CI.

The Sample Project

For this project, we will use create-react-app to generate a static site that will be hosted in a public AWS S3 bucket. CI/CD will be executed with GitLab CI.

Getting Started

At the end of this article we should have a project like this.

Using the documentation on the create-react-app GitHub page, run the following commands (the project name my-app here is just an example):

  • npx create-react-app my-app
  • cd my-app
  • npm start

and navigate to http://localhost:3000 to see the sample project.

The production bundle is built with npm run build.

Gitlab CI Setup

First of all, you should create a (public or private) repo to host your code and run GitLab CI.

  • git remote add origin yourRepositoryUrl

Create a .gitlab-ci.yml file

  • touch .gitlab-ci.yml

and fill it with the following:

image: docker:stable

services:
  - docker:dind

stages:
  - build
  - deploy_aws

variables:
  AWS_ACCESS_KEY_ID: "myAWSAccessKeyId"
  AWS_SECRET_ACCESS_KEY: "myAWSSecretAccessKey"

build:
  image: node:10-alpine
  stage: build
  script:
    - npm i
    - npm run build
  # To pass data between stages we should add the build folder as an artifact
  # and set expire_in to 1 week, so for a week afterwards you can download
  # the build folder from the GitLab panel.
  artifacts:
    paths:
      - build
    expire_in: 1 week

deploy_aws:
  image: node:10-alpine
  stage: deploy_aws
  only:
    - master
  script:
    - npm i mime
    - npm i aws-sdk
    - BUILD_DIRECTORY=build BUCKET_NAME=renjer-test AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID} AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY} npm run deploy-aws

You should create an Amazon S3 bucket to host your website; if you don’t have one, you can create it like this, then replace the bucket name in the code above (renjer-test) with your own.

Notice: we need to set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables in the file above (for a real project, prefer defining them as protected CI/CD variables in your GitLab project settings rather than committing real credentials).
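As a sketch of how the deploy script consumes these variables, a small helper can fail fast when one is missing; requireEnv and the fake environment below are my own illustrative names, not part of the original script:

```javascript
// Fail fast if a required environment variable is missing, so the deploy
// job stops with a clear message instead of an obscure AWS error later.
function requireEnv(names, env = process.env) {
  const missing = names.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  // Return only the requested variables.
  return Object.fromEntries(names.map((name) => [name, env[name]]));
}

// Example usage with a fake environment (placeholder values):
const fakeEnv = {
  AWS_ACCESS_KEY_ID: 'myAWSAccessKeyId',
  AWS_SECRET_ACCESS_KEY: 'myAWSSecretAccessKey',
  BUCKET_NAME: 'yourBucketName'
};
const vars = requireEnv(
  ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'BUCKET_NAME'],
  fakeEnv
);
console.log(vars.BUCKET_NAME); // prints "yourBucketName"
```

Calling such a helper at the top of aws-deploy.js would make a misconfigured pipeline fail with a readable error instead of an AWS signature error.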

DANGER!!!!!: if you want to host a static website on Amazon AWS S3 (in this case you do), then in the bucket’s static website hosting properties you should set the index document to index.html, and for the error document you SHOULD ALSO SET index.html. If you set error.html, you will get 404 errors on your website’s routes and be left confused, saying OMG why is this happening? To learn how to deploy a static website on Amazon S3 you can read this: https://medium.com/@channaly/how-to-host-static-website-with-https-using-amazon-s3-251434490c59
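To see what that console setting corresponds to in the API, this is roughly the website configuration the bucket needs; the params object below is only a sketch of the input to the aws-sdk s3.putBucketWebsite call, with yourBucketName as a placeholder:

```javascript
// Static website hosting configuration: both the index document and the
// error document point at index.html, so client-side routes don't 404.
const websiteParams = {
  Bucket: 'yourBucketName',
  WebsiteConfiguration: {
    IndexDocument: { Suffix: 'index.html' },
    ErrorDocument: { Key: 'index.html' }
  }
};
console.log(JSON.stringify(websiteParams, null, 2));

// With an aws-sdk S3 client (like the one in the deploy script below),
// this would be applied as:
//   s3.putBucketWebsite(websiteParams, (err) => { /* handle error */ });
```

Routing every error back to index.html is what lets a single-page React app handle unknown paths itself instead of surfacing S3’s 404 page.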

This will trigger GitLab CI, but for deploying our static website to Amazon S3, we should create a JS script file in the project.

Run the following commands:

  • npm i aws-sdk mime
  • touch aws-deploy.js

then fill aws-deploy.js with the following code:

const AWS = require("aws-sdk");
const fs = require("fs");
const path = require("path");
const mime = require('mime');

const rootFolderName = process.env.BUILD_DIRECTORY || 'dist';

// configuration
const config = {
  s3BucketName: process.env.BUCKET_NAME,
  // path relative to script's location
  folderPath: `./${rootFolderName}`
};

// initialize S3 client (never log this object: it contains the secret key)
const s3Config = {
  signatureVersion: 'v4',
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
};
const s3 = new AWS.S3(s3Config);

// resolve full folder path
const distFolderPath = path.join(__dirname, config.folderPath);
uploadDirectoryFiles(distFolderPath);

function uploadDirectoryFiles(folderPath) {
  const files = fs.readdirSync(folderPath);
  if (!files || files.length === 0) {
    console.log(`provided folder '${folderPath}' is empty or does not exist.`);
    return;
  }
  for (const fileName of files) {
    // get the full path of the file
    const filePath = path.join(folderPath, fileName);
    // if it is a directory, recursively call this function again
    if (fs.lstatSync(filePath).isDirectory()) {
      uploadDirectoryFiles(filePath);
      continue;
    }
    uploadFile(filePath, fileName);
  }
}

function uploadFile(filePath, fileName) {
  // strip the build folder prefix so the S3 key mirrors the folder structure
  const relativeFilePath = `${__dirname}/${rootFolderName}/`;
  const fileKey = filePath.replace(relativeFilePath, '');
  console.log({ fileName, filePath, fileKey });
  const fileContent = fs.readFileSync(filePath);
  const ContentType = mime.getType(filePath);
  // upload file to S3
  s3.putObject({
    Bucket: config.s3BucketName,
    Key: fileKey,
    Body: fileContent,
    ContentType
  }, (err, res) => {
    if (err) {
      return console.log("Error uploading file ", err);
    }
    console.log(`Successfully uploaded '${fileKey}'!`, { res });
  });
}

and then add "deploy-aws": "node aws-deploy.js" to the scripts section of your package.json.

The complete package.json should be like this :

https://gitlab.com/mranjbar.z2993/gitlab-asw-s3-ci-cd/blob/master/package.json

Now you can commit and push your code; GitLab CI should build your website and deploy it to Amazon S3.

  • git add .
  • git commit -am "Initial commit, CI/CD integrated"
  • git push origin master

You can check this public repository for this tutorial:

gitlab-ci-cd-aws-s3
