CI/CD Using GitHub Actions and Amazon S3
Interested in GitHub integration with Amazon S3, I wanted a CI/CD pipeline that, after every push, builds, tests, and deploys my website (React, …) to Amazon S3.
I found some articles about this, but they didn't fit my problem and added extra pieces (like Serverless) that complicated the project. So I combined what I learned from those articles with my own experience to create a simple, flexible Node project for this problem.
The Sample Project
For this project, we will use create-react-app to generate a static site that will be hosted in a public AWS S3 bucket. CI/CD will be handled by GitHub Actions.
Getting Started
By the end of this article, we should have a project like this.
Following the documentation on the create-react-app GitHub page, run the following commands:
npx create-react-app github-ci-cd-aws-s3
cd github-ci-cd-aws-s3
npm start
and navigate to http://localhost:3000 to see the sample project.
The production bundle can be built with npm run build
Github Action Setup
First of all, you should create a (public or private) repo to host your code and run actions.
git remote add origin yourRepositoryUrl
mkdir -p .github/workflows
Create a push.yml file
touch .github/workflows/push.yml
and fill it with the following content:
name: CI/CD
on:
  # Trigger the workflow on push or pull request,
  # but only for the master branch
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
jobs:
  primary:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Use Node.js 12.x
        uses: actions/setup-node@v1
        with:
          node-version: 12.x
      - name: install dependencies
        run: npm install
      - name: build
        run: npm run build
      - name: test
        run: npm run test
      # You should save these keys in GitHub, in the project's
      # Settings → Secrets: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
      - name: aws deploy
        run: BUILD_DIRECTORY=build BUCKET_NAME=yourBucketName AWS_ACCESS_KEY_ID=${{ secrets.AWS_ACCESS_KEY_ID }} AWS_SECRET_ACCESS_KEY=${{ secrets.AWS_SECRET_ACCESS_KEY }} npm run deploy-aws
You should create an Amazon S3 bucket to host your website; if you don't have one yet, you can create one like this. Then replace yourBucketName in the workflow above with your bucket's name.
Notice the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY values. We need to configure these in our GitHub repository's secrets section with the credential data that you can find here. Navigate to your repository in GitHub → Settings → Secrets and create secrets for those two keys. Those new secrets are then available to any workflow we define going forward.
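If you prefer the command line, the same secrets can be created with the GitHub CLI instead of the web UI (this sketch assumes gh is installed and authenticated, and is run from inside the repository folder; the placeholder values are yours to fill in):

```shell
# Store the AWS credentials as repository secrets so the
# workflow can read them via ${{ secrets.* }}
gh secret set AWS_ACCESS_KEY_ID --body "yourAccessKeyId"
gh secret set AWS_SECRET_ACCESS_KEY --body "yourSecretAccessKey"
```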
DANGER: if you want to host a static website on Amazon S3 (which is the case here), then in the bucket's static website hosting properties you should set the index document to index.html, and for the error document you SHOULD ALSO SET index.html. If you set it to error.html instead, the routes of your website will return 404 errors and you will be left asking "OMG, why is this happening?". To learn how to deploy a static website on Amazon S3, you can read this: https://medium.com/@channaly/how-to-host-static-website-with-https-using-amazon-s3-251434490c59
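As a sketch, the same static-hosting setup can be done from the AWS CLI (assuming the CLI is installed and configured with your credentials, and the bucket already exists; note that both documents point to index.html):

```shell
# Enable static website hosting on the bucket, routing both the
# index document and the error document to index.html so that
# client-side routes don't return 404
aws s3 website s3://yourBucketName/ \
  --index-document index.html \
  --error-document index.html
```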
This will trigger the GitHub Action, but to deploy our static website to Amazon S3, we need to create a JS script in the project.
Run the following commands:
npm i aws-sdk
touch aws-deploy.js
then fill aws-deploy.js
with the following code:
const AWS = require("aws-sdk");
const fs = require("fs");
const path = require("path");

// configuration
const rootFolderName = process.env.BUILD_DIRECTORY || "dist";
const config = {
  s3BucketName: process.env.BUCKET_NAME,
  folderPath: `./${rootFolderName}` // path relative to the script's location
};

// initialize S3 client
const s3Config = {
  signatureVersion: "v4",
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
};
const s3 = new AWS.S3(s3Config);

// Remove this log in production.
// GitHub shows **** instead of your keys (secrets are masked in logs),
// so don't panic if you see your credentials printed like that.
console.log("s3 config ", s3Config);

// minimal content-type map so S3 serves each file with the right MIME type
// (without ContentType, S3 defaults to application/octet-stream and
// browsers would download index.html instead of rendering it)
const contentTypes = {
  ".html": "text/html",
  ".css": "text/css",
  ".js": "application/javascript",
  ".json": "application/json",
  ".map": "application/json",
  ".png": "image/png",
  ".jpg": "image/jpeg",
  ".svg": "image/svg+xml",
  ".ico": "image/x-icon",
  ".txt": "text/plain"
};

// resolve full folder path
const distFolderPath = path.join(__dirname, config.folderPath);
uploadDirectoryFiles(distFolderPath);

function uploadDirectoryFiles(folderPath) {
  const files = fs.readdirSync(folderPath);
  if (!files || files.length === 0) {
    console.log(`provided folder '${folderPath}' is empty or does not exist.`);
    return;
  }
  for (const fileName of files) {
    // get the full path of the file
    const filePath = path.join(folderPath, fileName);
    // if it is a directory, recursively call this function again
    if (fs.lstatSync(filePath).isDirectory()) {
      uploadDirectoryFiles(filePath);
      continue;
    }
    uploadFile(filePath, fileName);
  }
}

function uploadFile(filePath, fileName) {
  // build the S3 object key relative to the build folder, using
  // forward slashes so nested paths work on every OS
  const fileKey = path.relative(distFolderPath, filePath).split(path.sep).join("/");
  console.log({ fileName, filePath, fileKey });
  const fileContent = fs.readFileSync(filePath);
  // upload file to S3
  s3.putObject({
    Bucket: config.s3BucketName,
    Key: fileKey,
    Body: fileContent,
    ContentType: contentTypes[path.extname(fileName).toLowerCase()] || "application/octet-stream"
  }, (err, res) => {
    if (err) {
      return console.log("Error uploading file ", err);
    }
    console.log(`Successfully uploaded '${fileKey}'!`, { res });
  });
}
and then add "deploy-aws": "node aws-deploy.js"
to the scripts section of your package.json.
The complete package.json should look like this: https://github.com/mohammadranjbar/github-ci-cd-aws-s3/blob/master/package.json
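For reference, the scripts section of package.json should end up looking roughly like this (the default create-react-app scripts plus the deploy entry created above; your versions may differ):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test",
    "eject": "react-scripts eject",
    "deploy-aws": "node aws-deploy.js"
  }
}
```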
Now you can commit and push your code; the GitHub Action should build your website and deploy it to Amazon S3.
git add .
git commit -am "Initial commit, CI/CD integrated"
git push origin master
You can check the public repository for this tutorial: https://github.com/mohammadranjbar/github-ci-cd-aws-s3