Static Website

Recently, I decided to renew my silentYak blog and recreate it as a static website built using Zola and deployed to S3 with CloudFront. These are the ultra-brief cliff notes on what to do in case you want to set up something similar for yourself.

  • Create the source code Q (example) of the blog using Zola.
  • Create the GitHub repository S hosting the source code Q of the blog.
  • Create a private S3 bucket B to which artifacts will be published.
  • Create a hosted zone H in Route 53 for your domain M.
  • Update your domain registrar to use Route 53’s nameservers for domain M.
  • Wait for the hosted zone H in Route 53 to become authoritative.
  • Request a new certificate X for your domain M from AWS Certificate Manager (ACM) in us-east-1 (CloudFront has this constraint on AWS region).
  • Create domain verification (TXT) records in hosted zone H as prompted by ACM.
  • Wait for the requested certificate X to be issued.
  • Create a CloudFront function F to rewrite request URIs to the correct index.html objects (example).
  • Create a CloudFront distribution D with certificate X, “Viewer Request” function F, and alternate domain name M (not optional).
  • Update hosted zone H to create A and AAAA aliases for distribution D.
  • Update S3 bucket B’s access permissions to authorize CloudFront distribution D.
  • Create an OpenID Connect identity provider C to represent GitHub Actions.
  • Create an IAM policy P (see below) for allowing access to S3 and CloudFront.
  • Create an IAM role R for publishing artifacts.
  • Associate the IAM policy P with the IAM role R.
  • Associate a trust policy T (see below) with role R for GitHub Actions.
  • Generate a GitHub Actions workflow file W (example) to build and publish artifacts.
  • Commit the GitHub Actions workflow file W to the GitHub repository S.
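
For reference, the “Viewer Request” function F might look something like the following sketch. It follows the commonly used index.html-rewrite pattern for CloudFront Functions; adjust the rules to match how Zola lays out your URLs.

```javascript
// Rewrites directory-style URIs so that CloudFront fetches the
// corresponding index.html object from the S3 origin.
function handler(event) {
    var request = event.request;
    var uri = request.uri;
    if (uri.endsWith('/')) {
        // "/posts/" -> "/posts/index.html"
        request.uri = uri + 'index.html';
    } else if (!uri.includes('.')) {
        // "/posts" -> "/posts/index.html"
        request.uri = uri + '/index.html';
    }
    // URIs that contain a "." (e.g. "/style.css") pass through unchanged.
    return request;
}
```

Attach this to distribution D as a “Viewer Request” function association on the default cache behavior.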
Example IAM Policy P

The actions and resource ARNs below are representative; tailor them to your bucket B and distribution D.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Statement0",
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:PutObject",
                    "s3:DeleteObject",
                    "cloudfront:CreateInvalidation"
                ],
                "Resource": [
                    "arn:aws:s3:::<bucket-name>",
                    "arn:aws:s3:::<bucket-name>/*",
                    "arn:aws:cloudfront::<account-id>:distribution/<distribution-id>"
                ]
            }
        ]
    }

Example Trust Policy T

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "Federated": "arn:aws:iam::965356658022:oidc-provider/token.actions.githubusercontent.com"
                },
                "Action": "sts:AssumeRoleWithWebIdentity",
                "Condition": {
                    "StringEquals": {
                        "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
                        "token.actions.githubusercontent.com:sub": "repo:rri/silentYak:ref:refs/heads/main"
                    }
                }
            }
        ]
    }

That’s all for today, folks! 🖖

Installing Gentoo

I got my first computer when I finished tenth grade. It was an Intel Pentium III clocked at 1.1GHz, with 128MB of memory and Microsoft Windows 98 installed. I say “my” computer though it was really “our” home computer that I shared with my parents and siblings, because after all, I was the only one that tried to understand the psychology of our 56K dial-up modem, or would sit through the night to download Internet Explorer 6 while hoping that my Internet connection wouldn’t get interrupted or throttled. When I went to DAIICT (Dhirubhai Ambani Institute of Information & Communication Technology) for my undergraduate studies, I carried the computer with me to my hostel, and it lasted about a year before succumbing to dust and moisture.

My next computer had an AMD core and this time, I had the IT vendor give me a dual boot system with Linux. Unfortunately, I didn’t have much of a choice in distribution, and I ended up with a computer that had Red Hat Enterprise Linux 3 installed on it. As soon as I got the computer, I realized it had a problem that the IT vendor didn’t know how to fix – a BIOS bug resulted in my RHEL3 computer clock running at multiples of normal speed. I could observe my system clock (and everything else) tick at a frantic pace, and thus, my foray into GNU/Linux began with a sense of great urgency. Fortunately, with some sleuthing and help from the Internet, I eventually managed to figure out a workaround for the bug.

Several of us from the class of ’07 had dual boot configurations (with GNU/Linux). Only some of us actually used our GNU/Linux operating systems, and I wasn’t one of them. One of the things I learned then was that I could never really learn to work with GNU/Linux until I let go of Windows as a crutch, and so I finally wiped my disk, installed an early version of Fedora, and switched to a pure GNU/Linux system. Later on, I discovered Gentoo Linux, a so-called “meta” distribution (so called because every Gentoo user’s operating system can be fairly distinctive, based on their own choice of packages), and never looked back.

Gentoo has a certain elegance and beauty to it that is undiminished to this day. Gentoo’s original claim to fame was its extensive documentation and dogged insistence on user choice. By “user choice” I do mean that the user gets to decide practically every facet of their system setup, starting with the bootloader all the way through the Linux kernel configuration and installed drivers and software packages. Every feature of Gentoo stems from this philosophy of user choice. Gentoo gives you a brilliantly designed repository and package management system called Portage that lets you pick and choose the features you want via “USE” flags, then compile the packages yourself. You even get to compile the Linux kernel yourself, if you’re so inclined, packaging exactly the drivers and firmware you actually need. If it matters to you, you can choose to create a system that only uses free software. Over the past decade, many GNU/Linux systems have begun to use systemd, a complex, controversial and arguably bloated software management system. Gentoo is one of the few distributions that offers users viable and practical alternatives (starting with an init system called OpenRC) that adhere to the Unix philosophy of each program doing one thing really well.

In recent years, I had drifted away from using GNU/Linux on a day-to-day basis. Other priorities made it easy for me to yield to the convenience of relying on a MacBook Pro. But recently, work has led me back to embedded software, forcing me to think about compilers, linkers, firmware, and build systems. I decided I would take the plunge and dive back into the world of GNU/Linux by setting up a home computer once again.

And so last week, I bought an AMD Ryzen 7 Beelink SER5 Mini PC and installed Gentoo on it. Surprisingly little has changed with Gentoo’s installation process over the past 15 years…and this is a good thing. The installation process is as simple, straightforward and pleasant as it was the first time I worked through it. The Gentoo Handbook provides step-by-step instructions for installation, instructions that have been kept working and up-to-date, thanks to the maintainers’ hard work.

Gentoo Handbook for the x86_64 architecture (2007 [L] vs 2023 [R])

So what’s changed over all these years?

The Unified Extensible Firmware Interface (UEFI) standard seems to have become widely accepted, displacing the proprietary BIOS de facto standard. UEFI avoids relying on “boot sectors” to determine the system initialization code to be executed, instead using a “boot manager” application that reads from the “EFI System Partition” or ESP. Overall, this change seems to be for the better, although it has implications for the partitioning scheme (requiring a FAT32-formatted ESP) and the bootloader (software such as GRUB, which works very differently today than it did in the early 2000s).

Partition management has evolved dramatically, moving us away from the Master Boot Record (MBR) and having us embrace the GUID Partition Table (GPT), a part of the UEFI standard. This is definitely an improvement; the MBR was rather clunky and severely limited the number of partitions one could create, whereas GPT is a straightforward replacement that lifts these limitations.

I will not say much about Logical Volume Management (LVM), which adds a software layer on top of any partitioning scheme. LVM is useful primarily for dynamic partition resizing and full disk encryption, and it remains as cryptic and inscrutable as ever. Since neither of these features really mattered to me, I carefully steered clear of LVM.

The core steps behind installing Gentoo have remained largely the same, starting with downloading a minimal version of a “live” operating system on a removable disk, booting into it, downloading a “Stage 3” archive of the Gentoo system, and chroot-ing into the installation environment. Optical disks are no longer in fashion, but bootable USB sticks are even easier to flash, and Internet download speeds are significantly faster than they used to be.

Compiling the Linux kernel was as simple as ever (how challenging this is has always been a function of the degree of support provided by the hardware vendor). I didn’t have any trouble with my wireless device, Ethernet card, or graphics card (the usual suspects). Networking was straightforward, though I had to look up a couple of configurations to use WPA3 (the latest wireless authentication standard, which the world is starting to adopt). Portage works the same as before, just a little easier – for instance, I no longer need to run revdep-rebuild to fix broken library dependencies after cleaning up unneeded packages.
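
Incidentally, the WPA3 tweaks amounted to a couple of lines in the wpa_supplicant network block. A sketch with placeholder values: SAE replaces the old pre-shared key exchange, and WPA3 mandates management frame protection.

```ini
# /etc/wpa_supplicant/wpa_supplicant.conf (illustrative values)
network={
    ssid="home-network"      # placeholder SSID
    key_mgmt=SAE             # WPA3-Personal authentication (SAE) instead of WPA-PSK
    sae_password="secret"    # placeholder passphrase
    ieee80211w=2             # management frame protection, mandatory under WPA3
}
```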

This time around, I added a touch of newfound minimalism to my system by installing i3, an extremely lightweight tiling window manager, in lieu of a heavier desktop environment. So far I’m loving it, and I wish I had used it sooner.

That’s all for today, folks! 🖖

UX Design

…for software engineers.

Numerous books have been written about user experience design, and at least some of them are well worth reading. But this presupposes that UX design requires a certain kind of unique expertise, and that its application must be left to the experts. My goal for today is to convince you that, like any other field, the basics are fairly easy to grasp, and as a software engineer, it would be remiss of you not to learn and apply them in the products that you build.

The term “User Experience” (UX) is broader than, and in a sense distinct from the term “user interface”, and isn’t limited to graphical screens or visual elements. User experience is truly about the experience of the user, and captures the qualitative aspect of their interactions with their ecosystem in the context of your product. Notice how I said “interactions with their ecosystem” rather than “interactions with your product” – making the right UX decisions requires a good understanding of the user’s ecosystem, and the role your product plays within that ecosystem.

The user experience captures the qualitative aspect of the user’s interactions with their ecosystem in the context of your product.

Without further ado, here are some critical points to remember:

User-centric. First and foremost, remember that any good UX begins by viewing the world from the perspective of the user, not the perspective of the product or its technical implementation. If I am the user, what goal am I trying to accomplish? How is the product making my life easier? Taking this user-centric approach is surprisingly hard, especially for someone who’s deeply steeped in building their own product. You might need to unlearn and forget what your product does for a while, and re-examine the problem afresh.

Goal-oriented. In user-centric design, any activity taken up by the user can be attributed to a goal that they have in mind. Users don’t mindlessly click buttons; they do so because they believe it helps them achieve their goal. What is that goal? How can it be expressed clearly in the language of the user? Notice that any goal expressed in this manner is effectively agnostic to your product. For instance, the user does not want to “create an order for pencils, enter their payment information, check out, and confirm the order”; rather, she just wants to buy pencils.

The role of your product in this context is two-fold. First, it is to convey to the user that taking a specific action clearly helps them make progress towards achieving their goal. Second, when simple actions don’t suffice, it is to help break down the goal into smaller sub-goals, and guide the user towards understanding how fulfilling these sub-goals would, in fact, help them achieve their overarching goal.

In most cases, helping the user achieve their goal in the quickest and simplest possible way means fewer buttons, prompts, steps, or visual elements. However, there are notable exceptions, driven by concurrent goals that subtly conflict with or constrain each other. For instance, although Amazon’s “1-Click” experience eliminates all steps other than a single click, users actually prefer the “Buy Now” experience that requires two clicks, first to click the “Buy Now” button, and second to confirm the order after checking their payment and delivery details. The reason for this is quite simple: users have the additional goals of wanting to use the right credit card for their order, and wanting to have the item delivered to the right address. In the “Buy Now” experience, they can view these details prior to confirming their order, whereas the “1-Click” experience, though arguably faster, leaves these auxiliary goals unmet.

Reward-driven. As should be evident by now, the user is not a mindless automaton, but rather a reward-driven agent. Users like to be rewarded for the actions they take, and shy away from actions that are repetitive or ambiguous with regard to whether they lead towards their goal. By being thoughtful about each action requested and ensuring that any action results in the completion of a sub-goal, your product can make every step rewarding and valuable to the user, leading to a delightful experience.

Late binding to product concepts. A common pitfall, especially amongst software engineers, is that they identify closely with the product that they are creating, and consequently (and unknowingly) force their users to learn about and understand concepts that are bespoke to the product. These concepts may include product names, feature branding, and implementation-specific entities. Sometimes, the introduction of new concepts is unavoidable (e.g., “to buy pencils, you must <place> an <order> for a <quantity> of <items> listed as ‘pencils’”). Applying the idea of being reward-driven (from the previous section), it is best not to introduce new concepts unnecessarily and require users to learn what they mean; if you absolutely must, look for the “last responsible moment” to do so.

Integrated with the ecosystem. All products need some degree of integration with the user’s ecosystem. Remember that the user’s goals are agnostic to your product and span their ecosystem (including products you don’t own and constraints you don’t control). For instance, if your product offers data entry features that eliminate the need for standalone Excel spreadsheets in that context, is there a different reason (unrelated to your product) that the user still needs those Excel spreadsheets? If the answer is yes, you might be doubling the amount of data entry from the perspective of the user! And if so, the ability to export your data to Excel may become a life-saving feature in that ecosystem. In general, consider how your product seeks to change the user’s ecosystem, and whether that change is an overall improvement or a worsening of the user experience.

In conclusion, I would say that a good UX is like candles on a birthday cake: of course, you absolutely need the cake (your product innovations and its unique value proposition to the user), but you don’t want to proceed to cut the cake without lighting candles on it first. Also, you don’t need to be afraid of improvising if you’ve forgotten to buy candles from the store.

All you have to do is put yourself in the shoes of your users.

That’s all for today, folks! 🖖