<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[adistrim]]></title><description><![CDATA[adistrim]]></description><link>https://blog.adistrim.in</link><generator>RSS for Node</generator><lastBuildDate>Wed, 15 Apr 2026 16:13:39 GMT</lastBuildDate><atom:link href="https://blog.adistrim.in/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Why I built a web search library]]></title><description><![CDATA[I was working on a side project where I needed to add basic web search capability to an AI agent. When I looked around, I noticed most projects rely on existing search APIs like Bing, Google, Perplexity, Brave, etc.
For my use case, that felt like ov...]]></description><link>https://blog.adistrim.in/why-i-built-a-web-search-library</link><guid isPermaLink="true">https://blog.adistrim.in/why-i-built-a-web-search-library</guid><category><![CDATA[web search]]></category><category><![CDATA[Developer Tools]]></category><category><![CDATA[Go Language]]></category><category><![CDATA[JavaScript]]></category><category><![CDATA[ai agents]]></category><dc:creator><![CDATA[Aditya Raj]]></dc:creator><pubDate>Mon, 29 Dec 2025 07:49:29 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766990804757/ce6630be-d8d9-439e-9ccd-26a6ce987d2c.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I was working on a side project where I needed to add basic web search capability to an AI agent. When I looked around, I noticed most projects rely on existing search APIs like Bing, Google, Perplexity, Brave, etc.</p>
<p>For my use case, that felt like overkill. I did not need ranking dashboards or large-scale indexing, and honestly I did not want to pay for it. A friend jokingly said, “Why don’t you just send a curl request to Google and parse the HTML?” It sounded dumb at first, but the idea itself was not completely wrong.</p>
<p>The real issue is that Google has very strong anti-bot protections, and its markup changes often. That makes scraping it unreliable.</p>
<p>So I chose DuckDuckGo. It has an <code>html.duckduckgo.com</code> endpoint that returns stable, JS-free HTML. DuckDuckGo provides this for text-only browsers, accessibility tools, and low-bandwidth use cases. It is predictable and easy to parse.</p>
<p>I built the core functionality in Go. It does two things:</p>
<ul>
<li><p>Sends requests to <code>html.duckduckgo.com</code> and parses the HTML to extract search results.</p>
</li>
<li><p>Sends direct HTTP requests to public web pages and extracts readable text from them, as long as the page is openly accessible and not behind authentication.</p>
</li>
</ul>
<p>This is important to say clearly: this is not a web-scraping or crawling tool. It does not bypass protections or execute JavaScript. It is meant for simple, repeatable searches and lightweight text extraction for agents and backend systems.</p>
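<p>To make the idea concrete, here is a rough shell sketch of what the search half does. This is illustrative only: the real core is written in Go with proper HTML parsing, and the <code>result__a</code> class name is my assumption about the markup, not a guarantee.</p>
<pre><code class="lang-bash"># Sketch: query the HTML endpoint, then pull result URLs out of the response.
extract_results() {
  # keep only result links, then cut the href value out of each one
  grep 'result__a' | grep -o 'href="[^"]*"' | cut -d '"' -f 2
}

# Real usage would be roughly:
#   curl -s "https://html.duckduckgo.com/html/?q=golang+http" | extract_results
# Offline demo with one canned result line:
echo '&lt;a rel="nofollow" class="result__a" href="https://go.dev/"&gt;The Go Programming Language&lt;/a&gt;' | extract_results
</code></pre>
<p>The real implementation extracts more than URLs, but the mechanics are the same: one HTTP request, then straightforward parsing of stable markup.</p>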
<p>I chose Go for the core mainly because of networking. Go’s standard library gives solid HTTP, TLS, timeouts, cancellation, and connection reuse out of the box. The workload here is network I/O, not heavy computation, and Go handles that very reliably.</p>
<p>Once the core was working, I needed to expose it as a JavaScript library. I already knew this pattern from tools like esbuild and Prisma, where the core is written in a different language and wrapped by a JS layer. I wrote a JS wrapper that spawns the Go binary. Platform-specific binaries (darwin-arm64, darwin-x64, linux, windows) are listed as dependencies, so the correct one gets installed based on the user’s platform. At this point the library was around version 0.0.4, and everything worked fine locally. I posted about it on LinkedIn.</p>
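<p>Picking the right platform package is simple detection logic. Sketched in shell for clarity (the actual wrapper does this in JavaScript via <code>process.platform</code> and <code>process.arch</code>, and the package names below are hypothetical):</p>
<pre><code class="lang-bash"># Map OS + CPU architecture to a platform-specific binary package name.
platform_package() {
  os="$1"; arch="$2"   # e.g. "Darwin" and "arm64", from: uname -s, uname -m
  case "$os-$arch" in
    Darwin-arm64)   echo "quack-darwin-arm64" ;;
    Darwin-x86_64)  echo "quack-darwin-x64" ;;
    Linux-x86_64)   echo "quack-linux-x64" ;;
    Linux-aarch64)  echo "quack-linux-arm64" ;;
    *)              echo "unsupported platform: $os-$arch"; return 1 ;;
  esac
}

platform_package "$(uname -s)" "$(uname -m)" || true
</code></pre>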
<p>The real problem showed up later. I bundled my backend (built using Bun and Hono) into a single binary. In that setup, the JS wrapper could no longer reliably find the Go binary. The issue was not Go itself, but assumptions about filesystem layout once everything is bundled.</p>
<p>My first thought was to port the Go core to JavaScript, but that would mean losing Go’s networking advantages. Then I looked at how native bcrypt works. Its core is written in C/C++ and exposed as a <code>.node</code> extension, which is a completely different architecture. I also considered rewriting the core in C or Rust and shipping it as a native addon, but that felt like serious over-engineering for a library where performance is dominated by network latency.</p>
<p>So I went back and researched how Prisma handles binaries. That led to the actual fix.</p>
<p>Now the library still ships with platform binaries, but if the binary is not found at runtime, it automatically downloads it from the GitHub releases page and caches it locally. There is also an option for users to manage the binary themselves if they want full control. This removed fragile assumptions and made the setup much more reliable.</p>
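<p>The lookup order, sketched in shell (again, the real wrapper is JavaScript; the cache location and release URL pattern here are my guesses, not quack's actual layout):</p>
<pre><code class="lang-bash"># Resolve the Go binary: shipped package, then local cache, then download.
resolve_binary() {
  name="$1"
  cache_dir="${QUACK_CACHE:-$HOME/.cache/quack}"
  # 1) binary already on PATH (installed via the platform package)
  found=$(command -v "$name" || true)
  if [ -n "$found" ]; then echo "$found"; return 0; fi
  # 2) a copy downloaded earlier and cached
  if [ -x "$cache_dir/$name" ]; then echo "$cache_dir/$name"; return 0; fi
  # 3) last resort: download from GitHub releases, cache it, use that
  mkdir -p "$cache_dir"
  curl -fsSL -o "$cache_dir/$name" "https://github.com/adistrim/quack/releases/latest/download/$name"
  chmod +x "$cache_dir/$name"
  echo "$cache_dir/$name"
}

resolve_binary sh   # "sh" is always on PATH, so this hits case 1 and never downloads
</code></pre>
<p>The important property is that cases 1 and 2 involve no network at all; the download happens at most once per machine.</p>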
<p>The current version at the time of writing is 0.1.0.</p>
<p>I initially thought this would be a very small project. Feature-wise, it is. But it pushed me into areas I had not dealt with before: how JS runtime libraries really work, how native binaries are distributed, and how easy it is to make the wrong architectural choice early.</p>
<p>I named the library quack-search because it uses DuckDuckGo, and ducks make a quack sound.</p>
<p>GitHub: <a target="_blank" href="https://github.com/adistrim/quack">https://github.com/adistrim/quack</a><br />npm: <a target="_blank" href="https://www.npmjs.com/package/quack-search">https://www.npmjs.com/package/quack-search</a></p>
<p><em>Article Banner Image is AI-generated using Gemini.</em></p>
<p>That’s it. Thanks for reading.</p>
]]></content:encoded></item><item><title><![CDATA[I self hosted a VPN on AWS, should you?]]></title><description><![CDATA[Setting up a personal VPN provides a reliable method for securing internet connections, protecting privacy, and maintaining control over sensitive data. This guide explains the process of deploying a self-hosted VPN on AWS, utilizing the OpenVPN Acce...]]></description><link>https://blog.adistrim.in/self-hosting-a-vpn</link><guid isPermaLink="true">https://blog.adistrim.in/self-hosting-a-vpn</guid><category><![CDATA[vpn]]></category><category><![CDATA[AWS]]></category><category><![CDATA[SelfHosting]]></category><category><![CDATA[ec2]]></category><category><![CDATA[Linux]]></category><category><![CDATA[beginner]]></category><dc:creator><![CDATA[Aditya Raj]]></dc:creator><pubDate>Tue, 29 Oct 2024 12:03:22 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1745935010867/e0df4461-e46a-4013-9735-d4a27fe57a68.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Setting up a personal VPN provides a reliable method for securing internet connections, protecting privacy, and maintaining control over sensitive data. This guide explains the process of deploying a self-hosted VPN on AWS, utilizing the OpenVPN Access Server. The solution offers a straightforward and dependable way to establish secure connections for individual use.</p>
<p><em>Setting up your own VPN sounds cool (and it is), but honestly, I wouldn't recommend it unless you really know what you're doing. If you mess up the setup, you could end up making things less secure. But if you're serious about handling your own privacy and ready to manage it properly, stick around.</em></p>
<h3 id="heading-prerequisites">Prerequisites</h3>
<ul>
<li><p><strong>AWS Free Tier account</strong> (optional but beneficial for beginners)</p>
</li>
<li><p><strong>Basic familiarity with SSH</strong> and EC2 instances</p>
</li>
</ul>
<h3 id="heading-step-1-launch-an-ec2-instance-with-openvpn">Step 1: Launch an EC2 Instance with OpenVPN</h3>
<ol>
<li><p><strong>Launch a new instance</strong> and search for the <strong>OpenVPN Access Server AMI</strong> in the AWS Marketplace.</p>
</li>
<li><p><strong>Select an instance type</strong>:</p>
<ul>
<li><p><strong>t2.micro</strong>: Free under the AWS Free Tier, suitable for running OpenVPN Access Server.</p>
</li>
<li><p><strong>t2.nano</strong>: An economical choice if not using the Free Tier, costing around $0.006 per hour, capable of supporting OpenVPN’s needs for up to two devices.</p>
</li>
</ul>
</li>
<li><p>Configure <strong>network settings</strong> to allow essential traffic only: open <strong>port 1194 (UDP)</strong> for the VPN connection, <strong>port 943</strong> for the admin interface, and <strong>port 22 (TCP)</strong> so you can SSH in for the next step.</p>
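<p> If you prefer the AWS CLI over the console, the same inbound rules look like this (the security-group ID is a placeholder for your instance's group):</p>
<pre><code class="lang-bash"> aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol udp --port 1194 --cidr 0.0.0.0/0
 aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 943 --cidr 0.0.0.0/0
</code></pre>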
</li>
<li><p>Launch the instance with an <strong>SSH key pair</strong> for secure access.</p>
</li>
</ol>
<blockquote>
<p><strong>VPN Location Matters</strong>: The location of your VPN server is based on the <strong>AWS region</strong> where the EC2 instance is hosted. For example, if your instance is in <strong>us-east-1 (North Virginia)</strong>, your internet traffic will appear to originate from the United States. Similarly, hosting in <strong>ap-south-1 (Mumbai)</strong> routes traffic through the Indian internet. Choose a region based on the geographic benefits or restrictions you prefer.</p>
</blockquote>
<h3 id="heading-step-2-ssh-into-the-instance-and-set-up-the-admin-user">Step 2: SSH into the Instance and Set Up the Admin User</h3>
<ol>
<li><p>Once your instance is running, SSH into it using the connection command provided in the AWS console.</p>
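<p> The exact command is shown under the instance's <strong>Connect</strong> tab; it generally looks like the following. (The default SSH user for the OpenVPN Access Server AMI may be <code>openvpnas</code> or <code>root</code> depending on the image version, so use whatever the console shows.)</p>
<pre><code class="lang-bash"> ssh -i your-key.pem openvpnas@your-ec2-instance-public-ip
</code></pre>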
</li>
<li><p>Accept the licenses and agreements &amp; keep the default settings.</p>
</li>
<li><p>You’ll be prompted to log in again, this time as the <code>openvpnas</code> user.</p>
</li>
<li><p><strong>Configure the admin password</strong> for the OpenVPN Access Server by running:</p>
<pre><code class="lang-bash"> sudo passwd openvpn
</code></pre>
<p> Enter a strong password for logging into the OpenVPN admin portal. The same credentials can be used to access the user portal.</p>
</li>
</ol>
<h3 id="heading-step-3-access-the-openvpn-admin-portal">Step 3: Access the OpenVPN Admin Portal</h3>
<ol>
<li><p>Open a browser and navigate to the admin portal at:</p>
<pre><code class="lang-http"> https://your-ec2-instance-public-ip:943/admin
</code></pre>
</li>
<li><p><strong>Log in with the credentials</strong> you set in the previous step.</p>
</li>
<li><p><strong>Configure your VPN settings</strong> as needed, such as adding users and (optionally) enabling multi-factor authentication.</p>
</li>
<li><p>Under <strong>Configuration/VPN Settings</strong>, make sure both options in the routing section are set to <code>Yes</code>. Save the settings and update the running server.</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1730201299844/dae7d0db-12bb-4f00-bc76-49cc5e24dca1.png" alt class="image--center mx-auto" /></p>
<blockquote>
<p><strong>Note</strong>: OpenVPN Access Server's free license allows up to <strong>2 simultaneous connections</strong>, suitable for personal use across two devices.</p>
</blockquote>
</li>
</ol>
<h3 id="heading-step-4-download-and-install-the-openvpn-client">Step 4: Download and Install the OpenVPN Client</h3>
<ol>
<li><p>Now that your OpenVPN server is set up, open this URL:</p>
<pre><code class="lang-http"> https://your-ec2-instance-public-ip:943
</code></pre>
<p> This page provides a download link for the <strong>OpenVPN client</strong>, pre-configured to connect to your server.</p>
</li>
<li><p><strong>Download and install the client</strong> on your device, then enter your VPN credentials when prompted. You’ll now be securely connected to your OpenVPN server on AWS!</p>
</li>
</ol>
<h3 id="heading-conclusion">Conclusion</h3>
<p>In just a few steps, it’s possible to set up a fully working, self-hosted VPN on AWS that keeps internet connections secure and private from anywhere. This setup is a good enough option for anyone who wants more control over their online security without paying for a huge monthly subscription. Even though it’s self-hosted, the VPN runs fast (actually fast) and reliably thanks to AWS’s strong infrastructure, making it perfect for streaming and browsing.</p>
<p><strong>However,</strong> I <strong>do not recommend</strong> setting this up unless you <strong>exactly know what you're doing</strong>. Self-hosting a VPN means you are responsible for securing the server, managing the EC2, handling firewall rules, and monitoring for potential vulnerabilities. A badly configured VPN can actually expose you to <strong>more</strong> risks than using a trusted paid service.</p>
<h3 id="heading-benefits-only-if-managed-well-according-to-me">Benefits (only if managed well) according to me:</h3>
<ul>
<li><p><strong>Privacy and Control</strong>: Since the VPN is self-hosted, all data stays private and is never handled by a third-party service.</p>
</li>
<li><p><strong>Geographic Flexibility (not as flexible as traditional VPNs)</strong>: AWS lets users pick server locations from different regions, like <code>us-east-1</code> for the U.S. or <code>ap-south-1</code> for India, offering more choices based on where the connection needs to be routed.</p>
</li>
<li><p><strong>Cost Efficiency</strong>: Using AWS’s Free Tier and affordable instances like <code>t2.nano</code> helps keep costs low while still getting good performance.</p>
</li>
</ul>
<p>After using this setup for a long time now, it has been extremely reliable on both laptop and mobile. The VPN connection feels smooth, even when streaming 4K videos, with almost no noticeable difference compared to a regular connection. CPU usage on the EC2 instance usually stays under 10%, except during heavier activities like video calls, or being active on Discord (I haven’t tested it for gaming).</p>
]]></content:encoded></item><item><title><![CDATA[What Are Git Orphan Branches and How to Use Them]]></title><description><![CDATA[What is Git?
Git is a free and open-source distributed version control system (DVCS) that is widely used in software development. It helps track changes, collaborate with others, and manage branches effectively.
This brief introduction only touches o...]]></description><link>https://blog.adistrim.in/what-are-git-orphan-branches-and-how-to-use-them</link><guid isPermaLink="true">https://blog.adistrim.in/what-are-git-orphan-branches-and-how-to-use-them</guid><category><![CDATA[orphan branches]]></category><category><![CDATA[GitHub]]></category><category><![CDATA[Git]]></category><category><![CDATA[version control]]></category><category><![CDATA[Git branches]]></category><dc:creator><![CDATA[Aditya Raj]]></dc:creator><pubDate>Sun, 02 Jun 2024 19:06:39 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1717350738252/9dd8ff00-a110-4ec6-aae8-9b8fd55de60f.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h4 id="heading-what-is-git">What is Git?</h4>
<p>Git is a free and open-source distributed version control system (DVCS) that is widely used in software development. It helps track changes, collaborate with others, and manage branches effectively.</p>
<p>This brief introduction only touches on the basics of what Git can do. For a deeper dive, check out this <a target="_blank" href="https://docs.github.com/en/get-started/using-git/about-git">link</a>.</p>
<h4 id="heading-what-are-branches">What are Branches?</h4>
<p>Imagine the codebase as a linear highway representing its history. A regular Git branch is like an exit ramp off that highway, creating a new path for development that diverges from the main road (often called master or main). On this new branch, you can work independently, making changes without affecting the main codebase.</p>
<p>To create a regular branch named <code>feature-branch</code>, use the following command:</p>
<pre><code class="lang-bash">git checkout -b feature-branch
</code></pre>
<p>This command creates a new branch and switches your working directory to it.</p>
<p>Regular branches offer the advantage of isolating new features for testing while maintaining a connection to the project’s history. For a detailed understanding of branches in Git, check out this <a target="_blank" href="https://www.git-scm.com/docs/git-branch/2.7.6#:~:text=The%20command's%20second%20form%20creates,switch%20to%20the%20new%20branch.">link</a>.</p>
<h4 id="heading-what-are-orphan-branches">What are Orphan Branches?</h4>
<p>Unlike regular branches that extend from existing history, orphan branches are like entirely new roads built from scratch. They have no connection to the main codebase or any other branch's history.</p>
<p>To create an orphan branch named <code>experimental-code</code>, use the following command:</p>
<pre><code class="lang-bash">git checkout --orphan experimental-code
</code></pre>
<p>This command creates a new branch named <code>experimental-code</code> with no parent commit, so the first commit you make becomes the root of a brand-new history. Your current files remain in the working tree and are staged on the new branch, so clear them out first if you want a truly empty starting point.</p>
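<p>A minimal end-to-end flow in a throwaway repository looks like this. Note the <code>git rm</code> step: <code>--orphan</code> keeps your current files staged, so you clear them to get a clean slate.</p>
<pre><code class="lang-bash"># throwaway repo with one commit of "main" history
cd "$(mktemp -d)"
git init -q .
git -c user.email=demo@example.com -c user.name=Demo commit -q --allow-empty -m "main history"

# start a history-free branch
git checkout -q --orphan experimental-code
git rm -r -f -q --ignore-unmatch .    # clear anything carried over from the old branch
echo "a fresh start" | tee README.md  # create the branch's first file
git add README.md
git -c user.email=demo@example.com -c user.name=Demo commit -q -m "root commit of a brand-new history"

git rev-list --count HEAD             # the new branch's entire history: 1 commit
</code></pre>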
<h4 id="heading-orphan-branch-use-cases">Orphan Branch Use Cases</h4>
<ul>
<li><p><strong>Standalone Experiments</strong>: When exploring a radical new feature or refactoring approach, you might not want to clutter the main branch's history. An orphan branch allows you to experiment freely.</p>
</li>
<li><p><strong>External Code Integration</strong>: If you're incorporating code from an external source that doesn't align with your project's history, creating an orphan branch helps maintain a clean separation.</p>
</li>
<li><p><strong>Brand New Repositories</strong>: When requesting a code review for a brand new repository, an orphan branch provides a clean slate for reviewers.</p>
</li>
</ul>
<h4 id="heading-key-differences-to-remember">Key Differences to Remember</h4>
<ul>
<li><p><strong>History</strong>: Regular branches maintain a connection to the main codebase's history, while orphan branches have no such connection.</p>
</li>
<li><p><strong>Merging</strong>: Merging orphan branches back into the main branch can be trickier than standard merges due to the lack of shared history.</p>
</li>
</ul>
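<p>Git actually refuses a plain merge of unrelated histories, so bringing an orphan branch's work back into the main branch requires an explicit opt-in:</p>
<pre><code class="lang-bash"># set up a repo with a normal branch and an orphan branch
cd "$(mktemp -d)"
git init -q .
git -c user.email=demo@example.com -c user.name=Demo commit -q --allow-empty -m "main history"
main_branch=$(git rev-parse --abbrev-ref HEAD)

git checkout -q --orphan experimental-code
git rm -r -f -q --ignore-unmatch .
echo "experiment notes" | tee notes.txt
git add notes.txt
git -c user.email=demo@example.com -c user.name=Demo commit -q -m "orphan root"

git checkout -q "$main_branch"
git merge experimental-code || echo "refused: no common ancestor"   # fails by default
git -c user.email=demo@example.com -c user.name=Demo merge --allow-unrelated-histories -m "bring in the experiment" experimental-code
</code></pre>
<p>After the opt-in merge, the merge commit has two parents that share no ancestor, which is worth remembering when you later bisect or follow history.</p>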
<h4 id="heading-conclusion">Conclusion</h4>
<p>Regular branches and orphan branches each serve unique purposes. Regular branches are ideal for most development workflows, offering isolation while preserving historical context. Orphan branches, on the other hand, are useful for scenarios where a clean break from existing history is needed. Understanding the strengths and considerations of both types of branches allows you to effectively leverage them to enhance your Git development experience.</p>
]]></content:encoded></item><item><title><![CDATA[Localhost to Production]]></title><description><![CDATA[It's been 2 years since I started to learn how to build web applications, and there is a special kind of comfort in building things locally, The code works, and the frontend layout is perfect on my screen. That was my first pre-internship phase; my c...]]></description><link>https://blog.adistrim.in/localhost-to-production</link><guid isPermaLink="true">https://blog.adistrim.in/localhost-to-production</guid><category><![CDATA[Docker]]></category><category><![CDATA[ci-cd]]></category><category><![CDATA[CI/CD]]></category><category><![CDATA[Python]]></category><category><![CDATA[JavaScript]]></category><category><![CDATA[GitHub]]></category><category><![CDATA[github-actions]]></category><category><![CDATA[production]]></category><category><![CDATA[localhost]]></category><dc:creator><![CDATA[Aditya Raj]]></dc:creator><pubDate>Tue, 28 May 2024 06:29:07 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1716868330686/92ea81db-6cd5-4b7a-ae1b-e011565315a5.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It's been 2 years since I started to learn how to build web applications, and there is a special kind of comfort in building things locally, The code works, and the frontend layout is perfect on my screen. That was my first pre-internship phase; my code always stayed peacefully on GitHub, a silent validation of how things should work.</p>
<p>During my first internship, I had to write code that was supposed to be deployed on the servers. The code that ran perfectly on my machine (thanks, 127.0.0.1) was pretty bad on the servers. It was a humbling experience, but it opened my eyes as a beginner to the crucial fact: <strong>development isn't just about what works on my machine – it's about making things work consistently in different environments.</strong></p>
<p>After my internship was over I carried those lessons forward with me and embraced them.</p>
<h3 id="heading-the-cicd-continuous-integration-ci-continuous-deploymentdelivery-cd"><strong>CI/CD: Continuous Integration (CI) and Continuous Deployment/Delivery (CD)</strong></h3>
<p>What is it? In simpler words: an environment that automatically tests every new merge (new code addition); if that code passes the tests without any red flags, it gets staged for deployment. The whole process is automated, and the best example for beginners is GitHub Actions.</p>
<h4 id="heading-benefits-of-cicd"><strong>Benefits of CI/CD:</strong></h4>
<p><strong>Early Detection of Bugs:</strong> Automated tests run on every code commit, catching issues early.</p>
<p><strong>Consistent Releases:</strong> Frequent, smaller updates reduce the risk and complexity of big releases.</p>
<p><strong>Faster Feedback:</strong> Devs receive immediate feedback, allowing them to address issues quickly.</p>
<p><strong>Reduced Manual Effort:</strong> Automation minimizes the manual steps involved in integration and deployment.</p>
<h4 id="heading-setting-up-a-cicd-pipeline"><strong>Setting Up a CI/CD Pipeline:</strong></h4>
<p><strong>Version Control:</strong> Start by using a version control system like Git. All your code changes should be committed to a shared repository (e.g., GitHub, GitLab).</p>
<p><strong>CI Configuration:</strong> Set up a CI server to automate the build and test process. Popular tools include Jenkins, GitHub Actions, and GitLab CI/CD.</p>
<h4 id="heading-example-github-actions-for-ci-for-anodeproject"><strong>Example: GitHub Actions for CI for a</strong><code>node</code><strong>project</strong></h4>
<p>Create a <code>.github/workflows/ci.yml</code> file in your repository:</p>
<pre><code class="lang-yaml"><span class="hljs-attr">name:</span> <span class="hljs-string">CI</span> <span class="hljs-string">Pipeline</span>

<span class="hljs-attr">on:</span> [<span class="hljs-string">push</span>]

<span class="hljs-attr">jobs:</span>
  <span class="hljs-attr">build:</span>
    <span class="hljs-attr">runs-on:</span> <span class="hljs-string">ubuntu-latest</span>

    <span class="hljs-attr">steps:</span>
    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Checkout</span> <span class="hljs-string">code</span>
      <span class="hljs-attr">uses:</span> <span class="hljs-string">actions/checkout@v2</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Set</span> <span class="hljs-string">up</span> <span class="hljs-string">Node.js</span>
      <span class="hljs-attr">uses:</span> <span class="hljs-string">actions/setup-node@v2</span>
      <span class="hljs-attr">with:</span>
        <span class="hljs-attr">node-version:</span> <span class="hljs-string">'21'</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Install</span> <span class="hljs-string">dependencies</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">npm</span> <span class="hljs-string">install</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Run</span> <span class="hljs-string">tests</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">npm</span> <span class="hljs-string">test</span>
</code></pre>
<p><em>This configuration runs the CI pipeline on every push, checking out the code, setting up Node.js, installing dependencies, and running tests.</em></p>
<p>This <code>ci.yml</code> can be extended depending on the deployment environment we are working with.</p>
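<p>For example, a deployment job that runs only on <code>main</code>, and only after the build job passes, could be appended like this (the deploy step itself is a placeholder; the real command depends on where you host):</p>
<pre><code class="lang-yaml">  deploy:
    needs: build
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'

    steps:
    - name: Checkout code
      uses: actions/checkout@v2

    - name: Deploy
      run: ./scripts/deploy.sh   # placeholder for your actual deploy command
</code></pre>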
<h3 id="heading-dockerizing">Dockerizing</h3>
<p>Basically, packing up code in a carton (container) box with all the dependencies and the environment it requires, which makes it easy to ship/share.</p>
<p>Containers bundle the application code with all its dependencies, libraries, and configuration files, ensuring that the application runs consistently across different environments.</p>
<h4 id="heading-benefits-of-using-docker"><strong>Benefits of Using Docker</strong></h4>
<p><strong>Consistency:</strong> Docker containers ensure that your application behaves the same way in the development, testing, and production environments.</p>
<p><strong>Isolation:</strong> Each container is isolated from others, preventing conflicts between applications.</p>
<p><strong>Portability:</strong> Docker containers can run on any system that supports Docker (Raspberry Pi too), making it easy to move applications between environments.</p>
<p><strong>Scalability:</strong> Docker makes it straightforward to scale applications horizontally by running multiple container instances.</p>
<h4 id="heading-getting-started-with-docker"><strong>Getting Started with Docker</strong></h4>
<p><strong>Installing Docker:</strong> Before you can start using Docker, you need to install it on your local machine. Docker provides detailed installation instructions for different operating systems on its official website.</p>
<p><strong>Creating a Dockerfile:</strong> A Dockerfile is a text file that contains instructions on how to build a Docker image. Here’s a simple example for a Next.js application:</p>
<pre><code class="lang-dockerfile"><span class="hljs-keyword">FROM</span> node:<span class="hljs-number">21</span>-alpine

<span class="hljs-keyword">WORKDIR</span><span class="bash"> /usr/src/app</span>

<span class="hljs-keyword">COPY</span><span class="bash"> package*.json ./</span>

<span class="hljs-keyword">RUN</span><span class="bash"> npm install</span>

<span class="hljs-keyword">COPY</span><span class="bash"> . .</span>

<span class="hljs-keyword">RUN</span><span class="bash"> npm run build</span>

<span class="hljs-keyword">EXPOSE</span> <span class="hljs-number">3000</span>

<span class="hljs-keyword">CMD</span><span class="bash"> [<span class="hljs-string">"npm"</span>, <span class="hljs-string">"start"</span>]</span>
</code></pre>
<p><strong>Building the Docker Image:</strong> With your Dockerfile ready, you can build the Docker image using the following command:</p>
<pre><code class="lang-bash">docker build -t my-next-app .
</code></pre>
<p><strong>Running the Container:</strong> Once the image is built, you can run it as a container:</p>
<pre><code class="lang-bash">docker run -d -p 3000:3000 my-next-app
</code></pre>
<p><em>If everything is alright then the web application will be running at localhost:3000</em></p>
<h3 id="heading-code-for-collaboration-writing-clear-and-understandable-code"><strong>Code for Collaboration: Writing Clear and Understandable Code</strong></h3>
<p>I've come to realize that my code isn't solely for my understanding (although sometimes it feels nice that nobody else can understand what's happening here). But good code is not just for others; it makes my own life easier in the long run. Using clear variable names and comments makes my code easier to navigate, for both myself and my teammates. This makes projects more collaborative and maintainable in the long run.</p>
<h4 id="heading-principles-of-writing-understandable-code"><strong>Principles of Writing Understandable Code</strong></h4>
<p><strong>Descriptive Variable Names:</strong> Instead of using cryptic abbreviations or single-letter variable names, opt for descriptive names that convey the purpose of the variable. For example:</p>
<pre><code class="lang-python"><span class="hljs-comment"># Bad</span>
x = <span class="hljs-number">10</span>
y = calculate(x)

<span class="hljs-comment"># Good</span>
number_of_items = <span class="hljs-number">10</span>
total_price = calculate_total_price(number_of_items)
</code></pre>
<p><strong>Meaningful Comments:</strong> Comments should explain the intention behind complex code segments or provide context where necessary. However, strive to write self-explanatory code whenever possible.</p>
<pre><code class="lang-javascript"><span class="hljs-comment">// Bad</span>
<span class="hljs-comment">// Increment counter</span>
counter += <span class="hljs-number">1</span>;

<span class="hljs-comment">// Good</span>
<span class="hljs-comment">// Count this retry so we can stop after MAX_RETRIES attempts</span>
counter += <span class="hljs-number">1</span>;
</code></pre>
<p><strong>Consistent Formatting:</strong> Consistent formatting improves code readability and makes it easier to understand at a glance. Follow established coding conventions or style guides for the programming language you're using.</p>
<pre><code class="lang-javascript"><span class="hljs-comment">// Inconsistent formatting</span>
<span class="hljs-keyword">if</span>(condition){
doSomething();
} <span class="hljs-keyword">else</span>{
doSomethingElse();
}

<span class="hljs-comment">// Consistent formatting</span>
<span class="hljs-keyword">if</span> (condition) {
    doSomething();
} <span class="hljs-keyword">else</span> {
    doSomethingElse();
}
</code></pre>
<p><strong>Modularization:</strong> Break down complex tasks into smaller, modular functions or classes. Each function or class should have a single responsibility, making it easier to understand and maintain.</p>
<pre><code class="lang-python"><span class="hljs-comment"># Monolithic function</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">process_data</span>(<span class="hljs-params">data</span>):</span>
    ...  <span class="hljs-comment"># cleaning, analysis, and visualization all crammed together</span>

<span class="hljs-comment"># Modularized functions</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">clean_data</span>(<span class="hljs-params">data</span>):</span>
    ...  <span class="hljs-comment"># clean data logic</span>

<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">analyze_data</span>(<span class="hljs-params">data</span>):</span>
    ...  <span class="hljs-comment"># analyze data logic</span>

<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">visualize_data</span>(<span class="hljs-params">data</span>):</span>
    ...  <span class="hljs-comment"># visualize data logic</span>
</code></pre>
<h3 id="heading-conclusion">Conclusion</h3>
<p>Deploying my code these past few months has opened my eyes to a whole new world. It's pushed me to think beyond the familiar local environment and see the bigger software delivery picture. This journey has also emphasised the importance of writing code that's not just functional, but also collaborative and maintainable.</p>
<p>So, have I graduated from my GitHub comfort zone? Maybe yes.</p>
<p>But, there's always A LOT left to LEARN.</p>
]]></content:encoded></item></channel></rss>