During my countless hours of research into the cloud computing space, I came across the Cloud Resume Challenge. This was a few months ago before I obtained my AWS Certified Cloud Practitioner cert, and I remember reading over it and thinking "Hmm, this doesn't seem too difficult." Suffice it to say that I was wrong and this certainly is a challenge for someone at a beginner knowledge level! In this blog post, I intend to give a brief overview of how I accomplished each part of the challenge and point out any difficulties I had and how I overcame them.
1. Certification
There isn't much to say about this one. I studied for about 3 weeks and felt pretty confident going into the exam. I used the below course on Udemy (free with my WGU tuition!) and was sufficiently prepared for any questions I saw. Click here for my verifiable certification badge via Credly!
2. HTML/CSS
This was one that took me a bit of time. I didn't have much (recent) experience with HTML, so I had to mostly relearn the basics. I cannot provide the resources I used for general learning as they were included in a course I am enrolled in at WGU, but this video was a great help for me in getting started. The especially helpful bits of the video for me were on utilizing custom fonts, icons, and CSS. I enjoyed this part; after finishing, I feel much more competent working with HTML/CSS, and I may end up going back and revamping my site later.
3. Static Website
With my index.html and style.css in tow, I went about researching deploying a static website via S3. There is tons of great documentation from AWS, and I used this document to assist in this part. I didn't run into any issues with this one as it's all pretty clear-cut. I will say this part did open my eyes to everything S3 can do. Poking around the options shows you there is much more than just storing objects, such as the ability to redirect requests for an object to another bucket or custom domain!
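For anyone who prefers scripting the setup over clicking through the console, enabling static hosting on an existing bucket can be sketched with boto3 along these lines (the function names are my own, and this assumes the bucket and its access policy already exist):

```python
def website_configuration(index_doc="index.html", error_doc="error.html"):
    # This is the shape boto3's put_bucket_website expects for WebsiteConfiguration
    return {
        "IndexDocument": {"Suffix": index_doc},
        "ErrorDocument": {"Key": error_doc},
    }

def enable_static_hosting(bucket_name):
    import boto3  # deferred import so the config builder above runs without AWS installed
    s3 = boto3.client("s3")
    s3.put_bucket_website(
        Bucket=bucket_name,
        WebsiteConfiguration=website_configuration(),
    )
```

After this, S3 serves the site from the bucket's website endpoint (shown under the bucket's properties).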
4. HTTPS
This is where we start to get to the meat of things. First off, I went and requested a certificate with Certificate Manager. For this project I'll be working with my custom domain brandon-daley.com so I requested my cert cover the apex domain and the www subdomain.
For the CloudFront portion of this part I utilized this document to guide me. I definitely feel like this was the hardest part up to this point for me personally. While going through the CloudFront distribution creation process I left most options default except for the following:
Enabled OAI, created new OAI, selected automatically update Bucket Policy for OAI
Viewer protocol policy = Redirect HTTP to HTTPS
Added brandon-daley.com to the list of Alternate domain names
Selected my custom SSL cert
The areas I ran into problems with this one were with DNS, the origin points, bucket policies/permissions, caching, and alternate domain names.
Origin Points: There are two methods of adding an origin when it comes to S3 (to my knowledge). You can use a custom origin, which involves copying and pasting the website endpoint from S3, or select the bucket from the origin dropdown. Since I'm using OAI instead of opening my S3 bucket to public access, I had to select my S3 bucket from the dropdown in order to have the OAI option available. If you're not using OAI, you'd want to just copy and paste the website endpoint from the bucket properties.
DNS: This one was a simple mistake. I was testing the DNS in Route 53 straight to the S3 objects before involving CloudFront. When I went to implement CloudFront, I tried to simply edit the previous records I had and for some reason they wouldn't stick. I had to delete the records and recreate them as an A record alias pointing to my CloudFront distribution.
Bucket policies/permissions: Prior to creating the CloudFront distribution I had public/anonymous access enabled for my buckets. When reading, I saw that OAI was a way around this so I chose to implement it for increased security. It seems that when creating one of my origin points it didn't automatically edit the bucket policy as it was supposed to or disable public access. So, I just had to go do these manually.
Caching: After all this, I was testing and still wasn't getting the page to load over HTTPS with the certificate showing. Stumped, I checked a few other things such as DNS and ensured I was getting CloudFront IPs in my A records. I randomly decided to do a ctrl+F5 refresh (this does a complete refresh of the webpage, ignoring any info cached for the site on your local machine) and voila! The site was now loading over HTTPS and I verified the certificate was there and accurate. You could also resolve this by creating an invalidation, which evicts the cached copies from CloudFront's edge locations so the next request pulls fresh content.
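If you'd rather script that invalidation than click through the console, it can be sketched with boto3 roughly like this (function names are mine; the deferred import is just so the batch builder can run anywhere):

```python
import time

def invalidation_batch(paths):
    # CallerReference must be unique per request; a timestamp is fine for one-off manual runs
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        "CallerReference": str(int(time.time())),
    }

def invalidate_distribution(distribution_id, paths=("/*",)):
    import boto3  # deferred import so the batch builder above is testable without AWS
    cloudfront = boto3.client("cloudfront")
    return cloudfront.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=invalidation_batch(paths),
    )
```

The "/*" path wipes everything cached for the distribution, which is the simplest option for a small site like this.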
Alternate domain names (CNAMEs): I had initially only added my apex domain in the "Alternate domain names" field in the distribution settings. The issue I was experiencing was that navigating to brandon-daley.com worked fine, but the www subdomain was getting a 403 error. I had to add www.brandon-daley.com to the aforementioned setting and that resolved the issue.
5. DNS
This part was more difficult than it needed to be since I was working with a domain registered with a different company. Normally you could just manage the DNS with the third party, but I wanted Route 53's ability to alias straight to AWS resources, and also just the hands-on experience, so I had to migrate my DNS to Route 53. This also had the fun side effect of me having to sign up for Amazon WorkMail since I couldn't use the other company's webmail anymore, but that's a story for another post. If you're just registering a brand-new domain with Route 53, this one is cake. My process was as follows:
Create a hosted zone for my domain in Route 53
Enable custom DNS in my Namecheap domain portal
Point my domain at Route 53's name servers
Wait for propagation
Add two alias A records pointing to my CloudFront distribution's domain name
Now, a visit to brandon-daley.com or www.brandon-daley.com will hit CloudFront edge locations, which apply the SSL cert enabling HTTPS and route to my S3 bucket via its origin, which serves my resume webpage securely.
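As a sketch of what those two alias records amount to, here's the ChangeBatch shape that boto3's route53 change_resource_record_sets call expects. The zone ID below is the fixed value AWS publishes for all CloudFront alias targets; the function name is my own:

```python
# Z2FDTNDATAQYW2 is the fixed hosted zone ID AWS documents for every CloudFront alias target
CLOUDFRONT_ZONE_ID = "Z2FDTNDATAQYW2"

def alias_change_batch(record_names, distribution_domain):
    # Builds the ChangeBatch for route53's change_resource_record_sets call:
    # one A-record alias per domain name, all pointing at the same distribution
    return {
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": name,
                    "Type": "A",
                    "AliasTarget": {
                        "HostedZoneId": CLOUDFRONT_ZONE_ID,
                        "DNSName": distribution_domain,
                        "EvaluateTargetHealth": False,
                    },
                },
            }
            for name in record_names
        ]
    }
```

Passing both the apex and www names produces the two records from step five in one call.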
6. JavaScript
I didn't have any prior experience with JavaScript, so this was a whole new world for me. I find the syntax especially different and confusing compared to Python and PowerShell, which I'm most familiar with. Granted, this was such a brief exposure that I'm sure it just takes more time to acclimate.
For learning, I used these two videos from The Coding Train on YouTube. I'd highly recommend his videos as he explains things very simply and is clearly an enthusiastic instructor!
At first I used the free-to-use CountAPI and got this up and running very quickly and easily. However, I realized I had inadvertently skipped the need for creating my own API and database within AWS. So, I scrapped this.
The main hang-up I had with this was getting the script to actually increment the value in the HTML. My problem ended up being that I had the <script></script> code in the header when it needs to be in the body of the HTML instead. A helpful tip I found was using the browser developer console (F12) to view the errors as the page loads. I also had a somewhat hard time wrapping my head around arrow functions, which the above videos helped with.
These next three areas I struggled quite a bit with. Prior to this I was making great progress, but here I got stuck for a couple days. My biggest problem was that I was trying to tackle the API/Lambda/Database at the same time. So, I took a step back and started from a fresh slate with the area I had most experience with as a foundation.
7. Lambda
I have a good bit of experience with Python, so this one was the least painful of these three areas of the challenge. The easiest way I could imagine to get this done would be to have the function get the current count from the DB, increment it by 1, return this value to be placed in the website, and record the new value to the DB. I initially started with watching some YouTube videos, but I found that many of them were using old boto3 syntax. So, I read through the boto3 documentation instead. I would suggest that anyone doing this part of the challenge read up on Python dictionaries, as they are required and somewhat confusing. My initial plan for the Lambda code worked out fine though, and I was able to move onto the next section.
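As a rough sketch of that flow (not my exact code: the table name and key are placeholders, and this version uses DynamoDB's atomic ADD update instead of a separate get and put, which also sidesteps a race between concurrent visitors):

```python
import json

TABLE_NAME = "visitor-count"  # placeholder table name

def lambda_handler(event, context, table=None):
    # "table" is injectable so the handler can be exercised without AWS
    if table is None:
        import boto3  # deferred import: only needed when actually running in Lambda
        table = boto3.resource("dynamodb").Table(TABLE_NAME)
    # ADD increments the stored value atomically and returns the new number
    resp = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD #c :one",
        ExpressionAttributeNames={"#c": "count"},  # "count" is a DynamoDB reserved word
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    new_count = int(resp["Attributes"]["count"])
    return {"statusCode": 200, "body": json.dumps(new_count)}
```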
8. DynamoDB
Admittedly, this part was somewhat in tandem with the previous. Regardless, I had to do a good amount of reading up on this as I had no experience. A difficulty for me here was the formatting of the item within the Lambda code so it matched up with the data in the DB. What helped me with this was entering an example value into the table and reviewing it in JSON format so I could better see what values were expected within the boto3 syntax. The moment I was able to test my Lambda function and see the value increment in the response was awesome!
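To illustrate the formatting difference that tripped me up: the low-level boto3 client speaks DynamoDB's wire format, where every value is wrapped in a type descriptor, while the higher-level Table resource takes plain Python types. A tiny converter (hypothetical, strings and ints only) makes the mapping obvious:

```python
# Low-level client format: every value carries a type descriptor
# ("S" = string, "N" = number, stored as a string)
client_item = {"id": {"S": "resume"}, "count": {"N": "27"}}

# Table resource format: plain Python types, no wrapping
resource_item = {"id": "resume", "count": 27}

def to_client_format(item):
    # Wrap plain values in type descriptors (handles only str and int for this sketch)
    wrapped = {}
    for key, value in item.items():
        if isinstance(value, int):
            wrapped[key] = {"N": str(value)}
        else:
            wrapped[key] = {"S": str(value)}
    return wrapped
```

Viewing a sample item in the console's JSON view, as I did, shows you exactly which of these two shapes your code needs to produce.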
9. API
I won't lie, this part was very tough for me. I was completely stumped several times and had difficulty reaching out for help as I didn't even know what questions to ask. I have read a lot of accounts saying this area of the challenge is quite difficult, though, which made me feel better. Despite all this, my stubborn nature carried me through to the end. I did so many things wrong in this area that it would be impossible to note them all, let alone remember them. The main points were:
Was using a REST API instead of HTTP API
Wasn't sure if I should be using CORS or not
In line with the last one, I didn't know which headers needed to be declared in the backend and which were implied by CORS.
In the end, I scrapped the REST API I had been working with for the majority of the time and was able to get the HTTP API up and running within a few hours, compared to the day-plus I spent on the other. I'm sure there was a way to get it working, but I read that an HTTP API made more sense for this scenario. The only real difficulty I had with the HTTP API was that I had to modify my prior JS code a bit to properly output the response so it could be displayed on the webpage. I used console.log() in each step of my JS so I could narrow down where the data was not coming out in a format displayable by my webpage and resolve it. I fixed my CORS issues by matching the Access-Control-Allow-Origin headers in my HTTP API and in my Lambda response headers. A way I could improve the security of this API would be to attach an authorizer and use it to prevent requests from anywhere other than my website. I also implemented some throttling for the API to help mitigate the risks of it being open to the public.
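To make that matching concrete, the Lambda side of it can be sketched as a small response helper (the helper name is mine; the key point is that the origin value must agree exactly with what the HTTP API's CORS configuration allows):

```python
import json

# Must match the allowed origin configured on the HTTP API exactly,
# or browsers will reject the response
ALLOWED_ORIGIN = "https://brandon-daley.com"

def build_response(status_code, payload):
    # Every Lambda response goes through here so the CORS header is never forgotten
    return {
        "statusCode": status_code,
        "headers": {
            "Access-Control-Allow-Origin": ALLOWED_ORIGIN,
            "Content-Type": "application/json",
        },
        "body": json.dumps(payload),
    }
```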
10. Tests
Because of the way I set up my API, there was little need for testing of formats and such. My API will either return an integer value or it won't. So I was thankfully able to implement all my testing within the respective scripts. For my Lambda Python code I wrote tests to ensure the following:
Value retrieved from DB is correct format and is incremented correctly.
Correct new value is added to the database successfully.
Based on the outcome of the tests, the API responds with either a 200 or a 500 status code for pass and fail respectively.
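Those checks can be sketched roughly like this, with a fake table standing in for DynamoDB so the logic runs without AWS (all names here are illustrative, not my actual test code):

```python
class FakeTable:
    # Stands in for the DynamoDB Table resource during tests
    def __init__(self, count):
        self.item = {"id": "resume", "count": count}

    def get_item(self, Key):
        return {"Item": dict(self.item)}

    def put_item(self, Item):
        self.item = dict(Item)

def increment_count(table):
    # Get the count, validate its format, increment, and write it back
    item = table.get_item(Key={"id": "resume"})["Item"]
    if not isinstance(item["count"], int):
        return {"statusCode": 500}  # wrong format in the DB: fail loudly
    item["count"] += 1
    table.put_item(Item=item)
    return {"statusCode": 200, "count": item["count"]}

# Format is valid, the increment is correct, and the new value lands in the table
table = FakeTable(41)
resp = increment_count(table)
assert resp["statusCode"] == 200
assert resp["count"] == 42
assert table.item["count"] == 42

# A malformed value in the DB yields a 500 instead of a bogus count
assert increment_count(FakeTable("not-a-number"))["statusCode"] == 500
```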
For my webpage Javascript code I wrote tests to ensure the following:
Value returned from API is an integer.
The returned value is successfully inserted into the proper element in the webpage.
Based on the outcome of the tests, the webpage console logs information to help me debug where the issue is.
When combined, this provides end-to-end testing for my entire web application.
Additionally, I plan on improving my testing by using Cypress to test my API when I get to the CI/CD section of the challenge. I'll also be using Terraform to deploy my resources which I'm excited about digging into!