Azure DevOps: Deploy your SPA to an AWS S3 bucket

Introduction


According to Amazon's definition, "Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance."

These features make AWS S3 a really interesting candidate for storing static files. Moreover, since AWS S3 supports static website hosting over HTTP, single page applications (SPAs) can also benefit from it.

 

Additionally, if you want to support HTTPS, you can put AWS CloudFront in front of the bucket. You can find more info in this article: https://aws.amazon.com/blogs/networking-and-content-delivery/amazon-s3-amazon-cloudfront-a-match-made-in-the-cloud/

In a project where I'm working, I configured CI/CD to deploy a React application to AWS S3. Due to the poor quality of AWS documentation, even the simplest task can be a challenge, so I will try to explain how to configure Azure DevOps to deploy to AWS S3, as this may be of help to somebody.

Configuring the Release Pipeline on Azure DevOps

In order to work with AWS, the first thing you will need is the AWS Tools for Microsoft Visual Studio Team Services extension, which you can install from here: https://marketplace.visualstudio.com/items?itemName=AmazonWebServices.aws-vsts-tools&refid=gs_card.

Once you install the extension, you will see new tasks related to AWS S3, in particular Download and Upload.

So let's upload our React artifact files to AWS S3.

Configuring AWS Credentials

To use AWS services from Azure DevOps, you need to configure your AWS credentials. For example, in the S3 Upload task, you will see the AWS Credentials section, where you need to add your credentials.


If you don't know how to create your AWS access key ID and secret access key, here you have a tutorial (quite a bit more straightforward than the official AWS documentation): https://www.cloudberrylab.com/resources/blog/how-to-find-your-aws-access-key-id-and-secret-access-key/
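Alternatively, if you already have the AWS CLI configured locally with an administrator profile, you can create an access key from the command line (the IAM user name below is just a hypothetical example):

aws iam create-access-key --user-name azure-devops-deployer

The command prints the access key ID and the secret access key; store the secret somewhere safe, because AWS won't show it again.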

Configuring the S3 Upload Task

Now you have to configure the S3 Upload task. To do so, specify your S3 bucket name and the folder from the build artifact that you want to upload.


If you haven't created your bucket yet, you can allow Azure DevOps to create it for you. I also enable Overwrite, in case I want to upload a file that already exists.
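For reference, this is roughly how the same step looks in a YAML pipeline. It's a minimal sketch, not the exact classic-editor configuration: the service connection name aws-dev-connection, the region and the artifact path are assumptions on my side.

- task: S3Upload@1
  inputs:
    awsCredentials: 'aws-dev-connection'   # AWS service connection created earlier (assumed name)
    regionName: 'eu-west-1'                # assumed region
    bucketName: 'react-app-dev'
    sourceFolder: '$(System.DefaultWorkingDirectory)/drop/build'   # assumed artifact folder
    globExpressions: '**'                  # upload everything under the source folder
    createBucket: true                     # let the task create the bucket if it doesn't exist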
 

Cleaning Previous Deployments

Although I enabled the overwrite option to update existing files, React (actually webpack) creates a bundle with a different hashed name (e.g. main.<hash>.js) on every build. For this reason, I need to clean the previously deployed files before starting a new deployment.

There is no specific Azure DevOps task to remove existing files, so we will need to use the AWS CLI task. I also want to keep the sandbox and logos folders, where I have data that I don't deploy with the React application. The AWS command to do this is:

aws s3 rm s3://react-app-dev --recursive --exclude "sandbox/*" --exclude "logos/*"

So you need to configure the command in the AWS CLI task in this way:
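In YAML form, the same AWS CLI task configuration looks roughly like this (again a sketch; the credentials and region placeholders are the assumed names from before):

- task: AWSCLI@1
  inputs:
    awsCredentials: 'aws-dev-connection'
    regionName: 'eu-west-1'
    awsCommand: 's3'
    awsSubCommand: 'rm'
    awsArguments: 's3://react-app-dev --recursive --exclude "sandbox/*" --exclude "logos/*"'

The task splits the call into command, subcommand and arguments, which map one-to-one to the aws s3 rm line above.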


However, when you execute your pipeline, you will see this error: AWS CLI is not installed on this machine.



This error appears because you need to install the AWS CLI first. To install it, add a command line task with this script:
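A minimal version of that script, assuming pip is available on the agent:

pip install awscli

This installs the AWS CLI (v1) through pip, which is why the Python task mentioned below is required.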


As this script relies on Python (pip), you will need to add a "Use Python 3.x" task before it.

And now your deployment pipeline is ready to deploy your SPA to AWS S3.

To sum up, this is the order of the tasks you need to follow in the pipeline to deploy your React app to an AWS S3 bucket.
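For completeness, here is the whole sequence sketched as a YAML pipeline. The classic release editor shows the same tasks in the same order; the connection name, region and paths are assumptions, as before.

steps:
# 1. Make Python (and pip) available on the agent
- task: UsePythonVersion@0
  inputs:
    versionSpec: '3.x'

# 2. Install the AWS CLI
- script: pip install awscli
  displayName: 'Install AWS CLI'

# 3. Clean the previous deployment, keeping sandbox/ and logos/
- task: AWSCLI@1
  inputs:
    awsCredentials: 'aws-dev-connection'
    regionName: 'eu-west-1'
    awsCommand: 's3'
    awsSubCommand: 'rm'
    awsArguments: 's3://react-app-dev --recursive --exclude "sandbox/*" --exclude "logos/*"'

# 4. Upload the new build output
- task: S3Upload@1
  inputs:
    awsCredentials: 'aws-dev-connection'
    regionName: 'eu-west-1'
    bucketName: 'react-app-dev'
    sourceFolder: '$(System.DefaultWorkingDirectory)/drop/build'
    globExpressions: '**'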



Notice that tasks such as S3 Upload or the AWS CLI configuration in Azure DevOps can be useful in many scenarios, so this guide may also help you if, for example, you need the AWS CLI for other purposes.

I hope you find this post useful.

Happy coding!! :)





