<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="4.4.1">Jekyll</generator><link href="https://ricardo.dev/feed.xml" rel="self" type="application/atom+xml" /><link href="https://ricardo.dev/" rel="alternate" type="text/html" /><updated>2026-04-11T00:57:40+00:00</updated><id>https://ricardo.dev/feed.xml</id><title type="html">ricardo.dev</title><subtitle>Programmer working with Ruby and Elixir. Always building, always learning. Full-time dad, part-time chaos wrangler.</subtitle><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><entry><title type="html">Reconsidering Phoenix Moving Forward</title><link href="https://ricardo.dev/reconsidering-phoenix-moving-forward" rel="alternate" type="text/html" title="Reconsidering Phoenix Moving Forward" /><published>2025-08-15T17:00:00+00:00</published><updated>2025-08-15T17:00:00+00:00</updated><id>https://ricardo.dev/reconsidering-phoenix-moving-forward</id><content type="html" xml:base="https://ricardo.dev/reconsidering-phoenix-moving-forward"><![CDATA[<p>It has been a while since I wrote any blog posts. This was primarily due to me doing some internal introspection in my hopes, wants, dreams and desires regarding my career and the way forward in these unknown times of AI.</p>

<p>Personally, I really enjoy programming, and for the foreseeable future I don’t see myself caving in to the pressure of using AI to write my code. I will most certainly use it to review some pieces of code I wrote, and ask it for implementation suggestions, but I will remain in control of the code and the quality of the applications that I produce.</p>

<h2 id="the-downwards-spiral">The downwards spiral</h2>

<p>I read this article the day it was published: <a href="https://fly.io/blog/phoenix-new-the-remote-ai-runtime/">Phoenix.new - The Remote AI Runtime for Phoenix</a>. The TLDR is that Phoenix now has an AI agent, supplied by Fly.io, that apparently understands Elixir and Phoenix better than other AI tools, making it a much better choice for vibe coding Phoenix apps.</p>

<p>The part from the article that concerned me the most is this:</p>

<blockquote>
  <p>I’m already using Phoenix.new to triage phoenix-core Github issues and pick problems to solve. I close my laptop, grab a cup of coffee, and wait for a PR to arrive.</p>
</blockquote>

<p>Now I don’t want to dunk on Chris at all. This is not that kind of post. I appreciate everything he has built and he is one of the OSS maintainers that I absolutely respect the most. But, knowing that the core maintainers are experimenting with AI agents to actively contribute to the framework itself is very concerning for me.</p>

<p>A framework is part of an application’s core architecture. Frameworks should aim to be as stable and reliable as possible, because they’re foundational. If this were happening in some optional generator for a feature that might or might not be useful, I would have been totally okay with the use of AI. But in <code class="language-plaintext highlighter-rouge">phoenix-core</code>?</p>

<p>With the latest 1.8.0 release the framework now ships with <code class="language-plaintext highlighter-rouge">AGENTS.md</code>, and sure we can manually delete it, but it shows how opinionated the framework is in shoving features you never asked for down your throat.</p>

<p>Another sign of how opinionated Phoenix has become is the change to <code class="language-plaintext highlighter-rouge">phx.gen.auth</code> in 1.8.0. IMHO, it was great as it was. Now it comes with magic links by default, with no way to opt out unless the user goes to their settings and provides a password. What???</p>

<h2 id="the-alternative">The alternative</h2>

<p>For me the alternative is simple: just stick with good ol’ Rails for everything. I respect the heck out of DHH for standing his ground in this uncertain time and doubling down on his statement that Ruby is for humans. We don’t need machines writing or reading Ruby. Those of us who have worked with Ruby know how readable and understandable it is.</p>

<p>I totally get why you might need to copy and paste some wild functional mess from JavaScript into ChatGPT just to understand what is happening, and maybe to make a couple of simple changes. We don’t have that concern in Ruby. Code executes in a predictable way, it is concise, and its purpose is clear.</p>
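<p>To illustrate (with made-up invoice data, purely hypothetical), even out of context it’s obvious what this snippet does:</p>

```ruby
# Contrived data for illustration only.
invoices = [
  { customer: "Acme", total: 120, paid: false },
  { customer: "Globex", total: 80, paid: true }
]

# Reads close to English: drop the paid invoices, sum what's left.
unpaid_total = invoices.reject { |i| i[:paid] }.sum { |i| i[:total] }
# => 120
```

<p>No AI translation layer needed - the intent is right there in the method names.</p>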

<p>Rails is also still maintained by hand, DHH is still in control of everything that goes in, and he sure as hell won’t be doing it with his laptop closed and just waiting for machine trained regurgitated slop to be fed into the framework.</p>

<p>I have always been on the side of the Rails users shouting out about the lack of a built-in auth (to name one missing feature). Although we did get a very basic auth generator in Rails 8, what I love about it is that it doesn’t assume much. It doesn’t force features you don’t need (or want) on you.</p>

<p>I totally get why Rails has (and lacks) the features it does now. The Rails maintainers are extremely considerate about not wasting your time with things you will need to painstakingly delete manually after scaffolding a new application.</p>

<h2 id="conclusion">Conclusion</h2>

<p>In conclusion, I am still very troubled by the direction Phoenix is heading. That being said, this is all my personal opinion. Take it with the lightest grain of salt. There is no perfect language or framework. Some devs might love all this AI agent crap being stuffed into the frameworks, while it is a disappointment for others.</p>

<p>I really wanted to use Phoenix more in my personal projects. But from here on out I will be dedicating more time to Ruby and Rails instead. I still work for a company using Elixir and Phoenix heavily, so while this isn’t goodbye, it is more of a rebalancing of where I spend my personal coding energy.</p>]]></content><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><category term="elixir" /><category term="opinion" /><category term="phoenix" /><summary type="html"><![CDATA[It has been a while since I wrote any blog posts. This was primarily due to me doing some internal introspection in my hopes, wants, dreams and desires regarding my career and the way forward in these unknown times of AI.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://ricardo.dev/assets/images/og/posts/reconsidering-phoenix-moving-forward.png" /><media:content medium="image" url="https://ricardo.dev/assets/images/og/posts/reconsidering-phoenix-moving-forward.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">RSS is awesome</title><link href="https://ricardo.dev/rss-is-awesome" rel="alternate" type="text/html" title="RSS is awesome" /><published>2025-05-23T17:00:00+00:00</published><updated>2025-05-23T17:00:00+00:00</updated><id>https://ricardo.dev/rss-is-awesome</id><content type="html" xml:base="https://ricardo.dev/rss-is-awesome"><![CDATA[<p><strong>RSS is awesome.</strong> There, I said it.</p>

<p>It truly is one of the most resilient syndication methods that the web has ever seen. I hope that it not only continues to exist but goes through a full-blown revival.</p>

<h2 id="anti-social-media">Anti-social media</h2>

<p>Social media is everywhere and it is utterly exhausting. Platforms constantly fight for your attention by sending useless notifications to your phone.</p>

<blockquote>
  <p>No I don’t know who Karen is and I don’t care, leave me alone Facebook!</p>
</blockquote>

<p>Algorithms are constantly tuned for one thing: <strong>engagement</strong>, so they can feed you more ads. There is no other way to state it, that is pretty much the goal of every social network. Even the supposedly “pure” platforms like Bluesky are working on subscriptions, which means they’ll soon need your participation too.</p>

<p>There’s no sincere desire to connect people. On social media, <strong>you and your content are the product.</strong></p>

<h2 id="stay-in-control">Stay in control</h2>

<p>With RSS, <strong>you’re in control.</strong></p>

<p>That blog you followed for crochet patterns suddenly turns political? Just remove their feed URL - gone. No algorithm will keep throwing their “enlightened” takes into your feed.</p>

<p>Unlike social media, RSS doesn’t try to “optimize” or curate your feed. There’s no engagement bait or outrage amplification. Just pure, unfiltered content. The way the internet was meant to be.</p>

<h2 id="no-ads-or-trackers-usually">No ads or trackers (usually)</h2>

<p>Most RSS readers show you just the content:</p>
<ul>
  <li>No annoying ads</li>
  <li>No pop-ups</li>
  <li>No tracking scripts</li>
</ul>

<p>It’s <strong>faster, cleaner, and respects your privacy.</strong></p>

<p>Many RSS readers even let you control notifications and enable offline access - perfect for catching up anywhere and anytime.</p>

<h2 id="centralized-reading-experience">Centralized reading experience</h2>

<p>This was the killer feature for me.</p>

<p>No more bouncing between 20 open tabs or manually checking bookmarks.</p>

<p>With RSS:</p>
<ul>
  <li>All your sources are in one place.</li>
  <li>You can group by topic or priority.</li>
  <li>You can mark articles as read, unread or save them for later.</li>
</ul>

<p>It’s reading on <em>your</em> terms.</p>

<h2 id="great-for-following-brilliant-minds">Great for following brilliant minds</h2>

<p>Everyone’s cross-posting. People jump platforms, change usernames, go silent. It’s hard to keep up.</p>

<p>RSS fixes that. As long as someone maintains a feed, you’ll know where their content goes (even if they abandon a platform).</p>

<p><strong>You never miss an update</strong>, no matter how rarely they post.</p>

<h2 id="completely-open-and-decentralized">Completely open and decentralized</h2>

<p>RSS is not controlled by any company. Anyone can publish a feed. Anyone can subscribe.</p>

<ul>
  <li>No shadowbanning</li>
  <li>No moderation</li>
  <li>No algorithmic interference</li>
</ul>

<p>It’s one of the last open web standards still widely in use.</p>

<blockquote>
  <p>You don’t need Web 3 for this. RSS has been here all along.</p>
</blockquote>

<h2 id="more-focus-less-doomscrolling">More focus, less doomscrolling</h2>

<p>Stop being fed viral outrage bait.</p>

<p>With RSS, there is no trending tab and you consume only what <em>you</em> want. It’s a calmer, more enriching way to read online content. <strong>No dopamine loops required.</strong></p>

<h2 id="no-comments-yes-thats-a-feature">No comments (Yes, that’s a feature)</h2>

<p>Even if the blogs have comment sections, RSS shows you <strong>just the content</strong>. No distractions, no off-topic rants, no comment wars.</p>

<p>For those of us who want to hear thoughtful voices (not arguments from random strangers) this is a blessing, not a bug.</p>

<h2 id="signal--noise">Signal &gt; Noise</h2>

<p>Social media platforms thrive on noise: memes, hot takes, political rants and attention-seeking.</p>

<p>RSS thrives on <strong>signal</strong>. You get content from creators, thinkers, and writers you <strong>chose</strong> to follow. No likes, no trending drama, just the good stuff.</p>

<h3 id="rss-isnt-dead-its-just-quietly-better">RSS isn’t dead. It’s just quietly better.</h3>

<p>If you value your time, attention, and sanity - ditch the feed and take the control back. Go RSS.</p>]]></content><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><category term="publishing" /><category term="rss" /><category term="opinion" /><summary type="html"><![CDATA[RSS is awesome. There, I said it.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://ricardo.dev/assets/images/og/posts/rss-is-awesome.png" /><media:content medium="image" url="https://ricardo.dev/assets/images/og/posts/rss-is-awesome.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">QA in the AI era</title><link href="https://ricardo.dev/qa-in-the-ai-era" rel="alternate" type="text/html" title="QA in the AI era" /><published>2025-05-17T17:00:00+00:00</published><updated>2025-05-17T17:00:00+00:00</updated><id>https://ricardo.dev/qa-in-the-ai-era</id><content type="html" xml:base="https://ricardo.dev/qa-in-the-ai-era"><![CDATA[<p>As AI tools become part of our daily dev workflows, there’s a growing concern: will AI replace us? I don’t think so.</p>

<p>If you read my previous post, you will know that I am cautiously optimistic about this new era of AI. I do believe that the way we as developers write apps will change forever. Does that mean we won’t write ANY code ourselves? Definitely not. Does it mean our careers and the profession in general are at risk? No chance.</p>

<p>In fact, skilled developers will become even more sought after as the knowledge gap gets bigger. When using generative AI, non-technical users will not be able to fine-tune, defend or scale their applications to meet all the requirements of modern web apps. That’s where we come in.</p>

<h2 id="the-importance-of-qa">The importance of QA</h2>

<p>Right now, if you open two different tabs with the same AI model and give each the same instruction, you will get two wildly different results. Even if you give it an existing piece of code and ask it to change something, it might restructure or modify parts that were completely irrelevant to your request.</p>

<p>This immediately raises a few red flags. If you are a non-technical user, blindly deploying those changes gives zero certainty that your application code still adheres to all the previous business requirements you gave the model. Sure, you could ask AI to generate tests (if you even know what they are or what they should test), but this is generally when you will receive the most pointless, flaky or slow-running tests. Why? Because AI simply doesn’t know any better.</p>

<h2 id="the-problem-with-ai-generated-tests">The problem with AI-generated tests</h2>

<p>The main problem with AI-generated tests is that AI learns by observing what it saw somewhere on the world wide web and, using those inputs, tries to find some pattern that works most of the time. It cannot ever truly “understand” your program logic or runtime behavior, despite what the AI companies are trying to sell you.</p>

<p>It will mimic basic test structures (e.g. <code class="language-plaintext highlighter-rouge">assert something == something_else</code>) without knowing if the output is meaningful in your specific context. It is also limited to the training data, and if that data contained shallow or boilerplate tests, it will learn to reproduce those.</p>
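<p>To make this concrete, here is a sketch in plain Ruby (the <code class="language-plaintext highlighter-rouge">Report</code> class and its validation rule are hypothetical, not from any real codebase) of the difference between a shallow, tautological test and one that encodes an actual business rule:</p>

```ruby
# Hypothetical Report class, purely for illustration.
class Report
  attr_reader :name

  def initialize(name)
    @name = name
  end

  # Business rule: a report must have a non-blank name.
  def valid?
    !@name.to_s.strip.empty?
  end
end

# Shallow test: it merely restates the constructor, so it can
# never catch a bug in the validation logic.
report = Report.new("Daily report")
raise "shallow test failed" unless report.name == "Daily report"

# Meaningful tests: they pin down the business rule itself.
raise "blank names must be invalid" if Report.new("   ").valid?
raise "real names must be valid" unless Report.new("Daily report").valid?
```

<p>The first assertion only fails if the constructor stops storing the name; the last two will actually catch a regression in the rule that matters.</p>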

<p>AI also suffers from hallucinations, often inventing support libraries or helper methods that don’t exist. Unfortunately, there’s no real way around this.</p>

<h2 id="specs-can-provide-context">Specs can provide context</h2>

<p>If you look at some testing frameworks like RSpec, you see that they are extremely readable and as such, LLMs can understand your requirements way better just by looking at those. TDD anyone?</p>

<p>Take a look at this example:</p>

<div class="language-ruby highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="no">RSpec</span><span class="p">.</span><span class="nf">describe</span> <span class="s2">"Report management"</span><span class="p">,</span> <span class="ss">type: :request</span> <span class="k">do</span>
  <span class="n">it</span> <span class="s2">"creates a Report and redirects to the Report's page"</span> <span class="k">do</span>
    <span class="n">get</span> <span class="s2">"/reports/new"</span>
    <span class="n">expect</span><span class="p">(</span><span class="n">response</span><span class="p">).</span><span class="nf">to</span> <span class="n">render_template</span><span class="p">(</span><span class="ss">:new</span><span class="p">)</span>

    <span class="n">post</span> <span class="s2">"/reports"</span><span class="p">,</span> <span class="ss">params: </span><span class="p">{</span> <span class="ss">report: </span><span class="p">{</span> <span class="ss">name: </span><span class="s1">'Daily report'</span> <span class="p">}</span> <span class="p">}</span>

    <span class="n">expect</span><span class="p">(</span><span class="n">response</span><span class="p">).</span><span class="nf">to</span> <span class="n">redirect_to</span><span class="p">(</span><span class="n">assigns</span><span class="p">(</span><span class="ss">:report</span><span class="p">))</span>
    <span class="n">follow_redirect!</span>

    <span class="n">expect</span><span class="p">(</span><span class="n">response</span><span class="p">).</span><span class="nf">to</span> <span class="n">render_template</span><span class="p">(</span><span class="ss">:show</span><span class="p">)</span>
    <span class="n">expect</span><span class="p">(</span><span class="n">response</span><span class="p">.</span><span class="nf">body</span><span class="p">).</span><span class="nf">to</span> <span class="kp">include</span><span class="p">(</span><span class="s2">"Report was successfully created."</span><span class="p">)</span>
  <span class="k">end</span>
<span class="k">end</span>
</code></pre></div></div>

<p>This is a highly readable piece of code, and it will outlast whatever AI-generated code we use today. If your AI agents decide to rewrite everything, you have a very clear set of rules that must pass in order for your app to hit production.</p>

<p>This kind of consistency will be essential for any serious company navigating the AI era.</p>

<h2 id="co-existing-with-ai">Co-existing with AI</h2>

<p>The truth is that AI is here to stay. It will become more and more integrated into our workflows.</p>

<p>We need to adapt and integrate it in a way that makes sense and let us extract the good parts while also filtering out the bad use-cases.</p>

<p>We must also be the gatekeepers of the AI code that hits production and consciously work on ensuring that it adheres to a certain level of quality that users of a paid product deserve.</p>

<p>This is why I am of the opinion that we will write more test specs and less app code in the future. This is similar to what we have seen in previous technological revolutions where humans were replaced with machines, but humans remained in charge of QA.</p>

<p>There has always been a balance between work and quality assurance. You <em>should</em> only give AI one of the two, while remaining in control of the other.</p>

<p>If AI is writing your code (including code completions), you need to be the one writing your tests. If you don’t want to write your tests, you shouldn’t be letting AI also write your application code. Simple as that.</p>

<p>Don’t offload both to the machines, unless you really don’t care about your product (or your users).</p>

<p>You can automate code, but you can’t automate care. Care is what turns code into products that users trust.</p>]]></content><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><category term="ai" /><category term="opinion" /><summary type="html"><![CDATA[As AI tools become part of our daily dev workflows, there’s a growing concern: will AI replace us? I don’t think so.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://ricardo.dev/assets/images/og/posts/qa-in-the-ai-era.png" /><media:content medium="image" url="https://ricardo.dev/assets/images/og/posts/qa-in-the-ai-era.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Programming 2.0</title><link href="https://ricardo.dev/programming-2-0" rel="alternate" type="text/html" title="Programming 2.0" /><published>2025-04-26T17:00:00+00:00</published><updated>2025-04-26T17:00:00+00:00</updated><id>https://ricardo.dev/programming-2-0</id><content type="html" xml:base="https://ricardo.dev/programming-2-0"><![CDATA[<p>The time has come for us, as an industry, to adapt. The goalposts have been completely moved to the other end of the field. Thousands of hours of hard-earned knowledge have become significantly less valuable.</p>

<p>If we want to stay relevant, now is the time to plan ahead and predict what the future holds.</p>

<h2 id="a-look-back">A look back</h2>

<p>This is not the first time that humanity has experienced a shift this large.</p>

<h3 id="the-industrial-revolution-late-1700s-to-1800s">The Industrial Revolution (late 1700s to 1800s)</h3>

<p>During the Industrial Revolution, machines replaced manual labor in factories. It massively increased productivity but also caused social upheaval. Workers were terrified that the machines would cost them their livelihoods and transform industries beyond recognition. Sound familiar?</p>

<h3 id="the-electricity-revolution-late-1800s-to-early-1900s">The Electricity Revolution (late 1800s to early 1900s)</h3>

<p>Widespread electrification reshaped how businesses operated and gave rise to things like 24/7 factories and modern cities. It also birthed entire new professions: electricians and electrical engineers (to name a few). Electricity became a general purpose technology that society simply cannot live without. Candle-lit family gatherings, anyone?</p>

<h3 id="the-computing-revolution-1950s-to-1980s">The Computing Revolution (1950s to 1980s)</h3>

<p>The invention of computers (and later personal computers) revolutionized everything from finance to publishing to communication. Electricity aside, one could argue that it was the most important invention of the millennium. It shifted us from manual processes to automation and data-driven decision-making. Without this revolution, we wouldn’t even be here discussing AI today.</p>

<h3 id="the-internet-revolution-1990s-to-2000s">The Internet Revolution (1990s to 2000s)</h3>

<p>The spread of the internet didn’t just transform businesses; it reshaped culture and society itself. It led to the dotcom bubble, sure - but optimism endured. Entire new industries were born, and many old ones were wiped out. Think about it: without this revolution, we wouldn’t have a single FAANG company.</p>

<h3 id="the-mobile-revolution-2007-to-2010s">The Mobile Revolution (2007 to 2010s)</h3>

<p>Smartphones took everything - books, websites, notes, memories - and fit it all into the palm of our hands. Ubiquitous access to powerful tools changed our habits on a massive scale, for better and for worse. It also brought us mobile applications, a very recent convenience that solves our problems in ways we couldn’t even comprehend just a couple of decades ago.</p>

<h3 id="the-ai-revolution-2020s-to-">The AI Revolution (2020s to ??)</h3>

<p><strong>TBD.</strong></p>

<p>We find ourselves at the start of a new era. There’s a lot of hype. A lot of noise. Like every revolution before, there will be a lot of trial and error. History books (or websites) will only remember the wins - and the losses.</p>

<h2 id="what-is-this-all-about">What is this all about?</h2>

<p>Why the history lesson? Is this Britannica? No - it’s to highlight that all of the five previous revolutions listed above had <em>something</em> in common: Even though some industries (and jobs) collapsed, each revolution created new opportunities. New industries. New professions. New ways to thrive.</p>

<p>There’s no reason to believe the AI revolution will be any different.</p>

<p>So that’s it then, right? Post done. We should just sit around and wait for the next wave of jobs to emerge, then go back to college to re-skill and become junior engineers again?</p>

<p>You <em>could</em> do that.</p>

<p>Or, you could stay ahead of the curve - keep up with the new developments (both good and bad) and find ways to profit from it. No one knows exactly how this will unfold. Maybe all the AI companies will crash in their next seed rounds, and we will call this “The AI Bubble”.</p>

<p>Or maybe, like every time before, new industries will rise.</p>

<p>It’s too early to confidently announce what the future holds, but I remain <em>cautiously</em> optimistic. I am already exploring the ways this might all unfold.</p>

<p>I love programming. It is no secret. If I could code every day until the day I die, I’d be a happy man. But the reality is clear: AI is not coming, it’s already here. And it’s getting better every day.</p>

<p>We can mourn the way things were (like the folks in the Industrial Revolution did) and go burn down SkyNet. Or we can <em>build</em> what comes next.</p>

<p>Programming is about to get a major version bump. If we used semantic versioning, we’d call it “Programming 2.0”. Programming 2.0 won’t just be about writing code anymore. It will be about understanding (and applying) new tools.</p>

<p>The next chapter is being written - Programming 2.0 is here. Let’s get to work.</p>]]></content><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><category term="future" /><category term="ai" /><category term="opinion" /><summary type="html"><![CDATA[The time has come for us, as an industry, to adapt. The goalposts have been completely moved to the other end of the field. Thousands of hours of hard-earned knowledge have become significantly less valuable.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://ricardo.dev/assets/images/og/posts/programming-2-0.png" /><media:content medium="image" url="https://ricardo.dev/assets/images/og/posts/programming-2-0.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Hosting my own Git server</title><link href="https://ricardo.dev/hosting-my-own-git-server" rel="alternate" type="text/html" title="Hosting my own Git server" /><published>2025-04-11T17:00:00+00:00</published><updated>2025-04-11T17:00:00+00:00</updated><id>https://ricardo.dev/hosting-my-own-git-server</id><content type="html" xml:base="https://ricardo.dev/hosting-my-own-git-server"><![CDATA[<p>The time has come. After months of watching AI companies quietly tweak their terms and AI agents scraping the internet
like hungry beasts, I’ve had enough. With a weekend and 3 bonus days off, the motivation and the need are both here. It’s
time to spin up a VPS, slap on Nginx, Docker and a Git server - like a true weekend warrior.</p>

<p>You might wonder what brought me to this point in the first place. The answer is pretty simple: AI. You see, for a long
time GitHub was an okay place to host my own repos. I never really had any problems except for the occasional outage. When AI
came along, though, things changed. Suddenly they updated their terms to allow big daddy Microsoft to train models on
private (at first) and public repos.</p>

<p>At first I thought nothing of it. Things are changing now, though: the rate at which the models are growing and pulling in
all sorts of different material, without any kind of reward or even a simple attribution for the original copyright owner’s
work, led me to realize that getting “off the grid” is the logical next step.</p>

<p>If I can’t control how my code is used, then at least I should control where it lives.</p>

<p>I don’t personally have any repositories (yet) that are worth learning from. When I do have any in the future though,
trust me that I want to be ready!</p>

<h2 id="a-look-at-the-playing-field">A look at the playing field</h2>

<p>There are lots of great providers to choose from. Some I have used before, some are new. So I had to compare each of them
against my individual use-case. I just wanted something lightweight that lets me push my repos up and work in isolation.
Privacy and security are the number one consideration, otherwise I wouldn’t even be writing this post.</p>

<ol>
  <li><strong>GitLab</strong>: Tried and trusted.</li>
  <li><strong>Gitea</strong>: The familiar kid on the block.</li>
  <li><strong>Forgejo</strong>: Never heard of this one until now.</li>
  <li><strong>Git</strong>: Yeah, apparently you can just serve your repos with <code class="language-plaintext highlighter-rouge">git daemon</code>, who knew?</li>
</ol>

<p>After careful consideration of the pros and cons of each, I decided that running plain Git would be too much for me to manage.
I would still have to figure out the rest of the story for my CI/CD needs, auth, etc. It is way too barebones. GitLab is getting
a bit too “enterprisey” for me personally. If I am about to get off GitHub, why replace it with another platform exactly
like it? Forgejo is apparently a fork of Gitea, run with democratic governance. I personally don’t like politics
in my Git provider, and since Gitea and Forgejo are pretty much the same, I decided to go with Gitea.</p>

<h2 id="setting-up-the-server">Setting up the server</h2>

<p>Nothing too fancy here. I already paid $4 per month for GitHub “Pro” just to host private repos on GitHub Pages.
So dropping the same $4 (or even just a bit more) on a Digital Ocean VPS felt like a no-brainer - I can do way more with the VPS than I ever could with a GitHub subscription.</p>

<p>I don’t live in the US, so being able to choose a server location closer to me also brought in the bonus of lower latency.
It ain’t much, but as engineers, we have that innate tendency to squeeze out every last bit of juice we can.</p>

<h2 id="installing-gitea-no-docker-no-fuss">Installing Gitea (no Docker, no fuss)</h2>

<p>I decided to skip Docker for this one. I have nothing against containers, but I wanted something a bit simpler and more
transparent. If anything goes sideways, I can easily debug it with plain old system tools.</p>

<p>First, I grabbed the latest Gitea binary for Ubuntu from the <a href="https://dl.gitea.com/gitea/">official site’s releases</a>. I went with the stable release (for obvious reasons) and dropped it in <code class="language-plaintext highlighter-rouge">/usr/local/bin</code>:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="c"># change the version to whatever is the latest</span>
wget <span class="nt">-O</span> gitea https://dl.gitea.io/gitea/1.23.7/gitea-1.23.7-linux-amd64
<span class="nb">chmod</span> +x gitea
<span class="nb">mv </span>gitea /usr/local/bin/</code></pre></figure>

<p>Next I created a dedicated <code class="language-plaintext highlighter-rouge">git</code> user to keep things clean and secure (always do this when you run things on remote servers):</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">adduser <span class="se">\</span>
  <span class="nt">--system</span> <span class="se">\</span>
  <span class="nt">--shell</span> /bin/bash <span class="se">\</span>
  <span class="nt">--gecos</span> <span class="s1">'Git Version Control'</span> <span class="se">\</span>
  <span class="nt">--group</span> <span class="se">\</span>
  <span class="nt">--disabled-password</span> <span class="se">\</span>
  <span class="nt">--home</span> /home/git <span class="se">\</span>
  git</code></pre></figure>

<p>Then I set up the directory structure that Gitea expects (this is all straight from their docs, btw):</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">mkdir</span> <span class="nt">-p</span> /var/lib/gitea/<span class="o">{</span>custom,data,log<span class="o">}</span>
<span class="nb">chown</span> <span class="nt">-R</span> git:git /var/lib/gitea/
<span class="nb">chmod</span> <span class="nt">-R</span> 750 /var/lib/gitea/

<span class="nb">mkdir</span> /etc/gitea
<span class="nb">chown </span>root:git /etc/gitea
<span class="nb">chmod </span>770 /etc/gitea</code></pre></figure>

<p>Nearly done. Let’s create a <code class="language-plaintext highlighter-rouge">systemd</code> service so Gitea runs as a daemon:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">tee</span> /etc/systemd/system/gitea.service <span class="o">&gt;</span> /dev/null <span class="o">&lt;&lt;</span><span class="no">EOF</span><span class="sh">
[Unit]
Description=Gitea
After=network.target

[Service]
RestartSec=2s
Type=simple
User=git
Group=git
WorkingDirectory=/var/lib/gitea/
ExecStart=/usr/local/bin/gitea web --config /etc/gitea/app.ini
Restart=always
Environment=USER=git HOME=/home/git GITEA_WORK_DIR=/var/lib/gitea

[Install]
WantedBy=multi-user.target
EOF</span></code></pre></figure>

<p>Then it was just a matter of starting it up:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">systemctl daemon-reload
systemctl <span class="nb">enable</span> <span class="nt">--now</span> gitea</code></pre></figure>
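<p>To confirm everything actually came up, a quick sanity check helps (the <code class="language-plaintext highlighter-rouge">curl</code> probe assumes nothing else is squatting on port 3000):</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"># the unit should report "active"
systemctl is-active gitea
# and the web UI should answer on port 3000 with an HTTP 200
curl -sI http://localhost:3000 | head -n 1</code></pre></figure>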

<p>Once the service is up and running, Gitea listens on port <code class="language-plaintext highlighter-rouge">3000</code> by default. I navigated to my server’s IP in
the browser and walked through the web installer. It sets up SQLite and an admin user, and then you are pretty much done.</p>

<p>P.S. You have to go through the web installer for the Gitea config file to be generated.</p>

<p>Now you can stop here if you want, but there is still more work to be done if we want the experience to be more
GitHub-like.</p>

<h2 id="setting-up-the-reverse-proxy-and-ssl">Setting up the reverse proxy and SSL</h2>

<p>After pointing a custom subdomain to my server’s IP, it was time to set up the reverse proxy. Gitea runs on port <code class="language-plaintext highlighter-rouge">3000</code>,
but no one wants to type <code class="language-plaintext highlighter-rouge">:3000</code> every time they set up a new repo. Plus this is 2025, <code class="language-plaintext highlighter-rouge">https</code> is a MUST!</p>

<p>Time to set up Nginx and a free SSL cert from Let’s Encrypt using <code class="language-plaintext highlighter-rouge">certbot</code>.</p>

<p>First we need to install the dependencies:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">apt update
apt <span class="nb">install </span>nginx certbot python3-certbot-nginx</code></pre></figure>

<p>Then, we configure Nginx. I created a basic server block pointing to Gitea on port 3000:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">vi /etc/nginx/sites-available/gitea</code></pre></figure>

<p>I used this config (just with my new domain instead of <code class="language-plaintext highlighter-rouge">your.domain.com</code>):</p>

<figure class="highlight"><pre><code class="language-nginx" data-lang="nginx"><span class="k">server</span> <span class="p">{</span>
    <span class="kn">listen</span> <span class="mi">80</span><span class="p">;</span>
    <span class="kn">server_name</span> <span class="s">your.domain.com</span><span class="p">;</span>

    <span class="kn">location</span> <span class="n">/</span> <span class="p">{</span>
        <span class="kn">proxy_pass</span> <span class="s">http://localhost:3000</span><span class="p">;</span>
        <span class="kn">proxy_set_header</span> <span class="s">Host</span> <span class="nv">$host</span><span class="p">;</span>
        <span class="kn">proxy_set_header</span> <span class="s">X-Real-IP</span> <span class="nv">$remote_addr</span><span class="p">;</span>
        <span class="kn">proxy_set_header</span> <span class="s">X-Forwarded-For</span> <span class="nv">$proxy_add_x_forwarded_for</span><span class="p">;</span>
        <span class="kn">proxy_set_header</span> <span class="s">X-Forwarded-Proto</span> <span class="nv">$scheme</span><span class="p">;</span>
    <span class="p">}</span>
<span class="p">}</span></code></pre></figure>
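<p>One optional tweak while we’re here: Nginx caps request bodies at 1 MB by default, which can make larger pushes over HTTPS fail at the proxy. If you expect big pushes, raise the limit in the same <code class="language-plaintext highlighter-rouge">server</code> block (512m is just an example value, not something from the Gitea docs):</p>

<figure class="highlight"><pre><code class="language-nginx" data-lang="nginx"># allow larger git pushes through the proxy; the default is 1m
client_max_body_size 512m;</code></pre></figure>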

<p>Going great so far. Next we link the site into <code class="language-plaintext highlighter-rouge">sites-enabled</code>, test the config and reload Nginx:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"><span class="nb">ln</span> <span class="nt">-s</span> /etc/nginx/sites-available/gitea /etc/nginx/sites-enabled/
nginx <span class="nt">-t</span>
systemctl reload nginx</code></pre></figure>

<p>Now comes the more interesting part. We add <code class="language-plaintext highlighter-rouge">https</code> with Let’s Encrypt:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">certbot <span class="nt">--nginx</span> <span class="nt">-d</span> your.domain.com</code></pre></figure>

<p>This obtains the certificate and updates the Nginx config so SSL is correctly set up. It even configures auto-renewal. Certbot takes care of all
the heavy lifting so we don’t have to mess with any of the config ourselves. Sweet!</p>

<p>Nginx is now handling the https requests and forwarding them to Gitea, which is still happily doing its thing on port 3000.</p>

<p>One last thing: we need to tell Gitea it is running behind a proxy by updating <code class="language-plaintext highlighter-rouge">/etc/gitea/app.ini</code> (don’t remove the other values already set under <code class="language-plaintext highlighter-rouge">[server]</code>):</p>

<figure class="highlight"><pre><code class="language-ini" data-lang="ini"><span class="nn">[server]</span>
<span class="py">PROTOCOL</span> <span class="p">=</span> <span class="s">http</span>
<span class="py">DOMAIN</span> <span class="p">=</span> <span class="s">your.domain.com</span>
<span class="py">ROOT_URL</span> <span class="p">=</span> <span class="s">https://your.domain.com/</span>
<span class="py">HTTP_ADDR</span> <span class="p">=</span> <span class="s">127.0.0.1</span>
<span class="py">HTTP_PORT</span> <span class="p">=</span> <span class="s">3000</span>

<span class="c"># Optional, but highly recommended if you don't want anyone to register on your service:
</span><span class="nn">[service]</span>
<span class="py">DISABLE_REGISTRATION</span> <span class="p">=</span> <span class="s">true</span></code></pre></figure>

<p>Restart Gitea for good measure:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash">systemctl restart gitea</code></pre></figure>

<p>Boom! Secure, clean and no <code class="language-plaintext highlighter-rouge">:3000</code> in sight. We have our self-hosted Git server with SSL and privacy. Most importantly,
we no longer have the corporate overlords scraping our commits.</p>
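<p>To put the new server to work, point an existing local repo at it. The repo path below is a placeholder; create the repo through the web UI first:</p>

<figure class="highlight"><pre><code class="language-bash" data-lang="bash"># your.domain.com, username and repo are placeholders for your own values
git remote add origin https://your.domain.com/username/repo.git
git push -u origin main</code></pre></figure>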

<p>Thanks for following along. If you enjoyed this post, add the RSS URL at the bottom of the page to your favorite RSS reader to get notified when I post anything new.</p>

<p>(^ That’s right, there might be a future post on why I prefer RSS for blogging)</p>]]></content><author><name>Ricardo van Hoepen</name><email>hello@ricardo.dev</email></author><category term="self" /><category term="hosting" /><category term="git" /><summary type="html"><![CDATA[The time has come. After months of watching AI companies quietly tweak their terms and AI agents scraping the internet like hungry beasts, I’ve had enough. With a weekend and 3 bonus days off, the motivation and the need are both here. It’s time to spin up a VPS, slap on Nginx, Docker and a Git server - like a true weekend warrior.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://ricardo.dev/assets/images/og/posts/hosting-my-own-git-server.png" /><media:content medium="image" url="https://ricardo.dev/assets/images/og/posts/hosting-my-own-git-server.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry></feed>