AI Won't Kill Open Source - It Will Amplify It
Why the doomsayers are wrong: npm, PyPI, and NuGet downloads are exploding
- The Death of Open Source Has Been Greatly Exaggerated
- The Shortest Path Principle and LLM Reward Functions
- Training Data Creates a Virtuous Cycle
- Learning Curve Arbitrage
- The Blast Radius Problem
- But What About Custom Components?
- So What Actually Happened to Tailwind?
- The Bottom Line
Earlier this week, Adam Wathan, the creator of Tailwind CSS, dropped a bombshell about how AI has impacted his company and its employees:
the reality is that 75% of the people on our engineering team lost their jobs here yesterday because of the brutal impact AI has had on our business. And every second I spend trying to do fun free things for the community like this is a second I’m not spending trying to turn the business around and make sure the people who are still here are getting their paychecks every month.
The reaction was swift and predictable. Hot takes flooded in declaring the death of open source, the end of OSS sustainability, and the AI apocalypse that would render libraries and frameworks obsolete. Geoffrey Huntley captured the prevailing narrative perfectly[1]:
“AI can generate code, bypassing the need to deal with open-source woes… I’ve found myself using less open source these days… This shift challenges the role of open-source ecosystems.”
I had a tweet about the Tailwind situation go viral, and dozens of quote tweets and comments echoed the same sentiment. “AI eats open source” takes are easy to find online.
Here’s the problem: everyone is getting this perfectly backwards.
AI isn’t killing open source. It’s amplifying it to unprecedented levels. The Tailwind situation isn’t about declining OSS adoption - it’s about a business model that was working great in a world where learning curves were high and documentation was the bottleneck. That world is gone, and the revenue model built on top of it is collapsing. But the underlying Tailwind library? It’s thriving.
Let me show you why the doomsayers are wrong, backed by actual data from our own experience with Akka.NET and broader ecosystem trends that tell a very different story.
This is Part 1 of a two-part series. This post covers why AI is accelerating open source adoption, not killing it. Part 2 examines which business models thrive and which collapse in the AI era - and what Tailwind’s situation teaches us about adapting. Subscribe to our mailing list to get notified when it drops.
The Death of Open Source Has Been Greatly Exaggerated
I’ve been maintaining Akka.NET, a complex distributed systems framework, for over a decade. If AI was going to kill any open source project, it should have been ours. Actor model? Message-passing? Concurrency? Distributed systems? This is exactly the kind of “complex stuff” that AI should supposedly generate on the fly, right?
```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'xyChart': {'plotColorPalette': '#2563eb'}}}}%%
xychart-beta
    title "Akka.NET Annual Downloads by Year (2021-2025)"
    x-axis [2021, 2022, 2023, 2024, 2025]
    y-axis "Annual Downloads (Millions)" 2 --> 6
    line [2.2, 2.62, 3.68, 4.01, 5.4]
```
| Year | Annual Downloads | YoY Change | Growth Rate |
|---|---|---|---|
| 2021 | ~2.2M | — | — |
| 2022 | 2.62M | +0.42M | +19% |
| 2023 | 3.68M | +1.06M | +40% |
| 2024 | 4.01M | +0.33M | +9% |
| 2025 | 5.4M | +1.39M | +35% |
Let’s look at the numbers:
- 35% year-over-year download growth (5.4M downloads in 2025 vs 4.01M in 2024)
- Total downloads reached 20.2M (up from 14.8M at the start of the year)
- This growth accelerated directly alongside AI coding tool adoption
We’re not an outlier. Let me show you three very different frameworks - all thriving:
Tailwind CSS - The framework at the center of this controversy:
| Year | Downloads | Growth |
|---|---|---|
| 2024 | 481,129,145 | - |
| 2025 | 1,089,875,862 | +126.5% |
Tailwind more than doubled its downloads in 2025. The framework that supposedly proves AI is killing open source had its best year ever - by a massive margin.
NServiceBus - Enterprise messaging framework (15+ years old, commercial licensing):
| Period | Total Downloads | New Downloads | Growth |
|---|---|---|---|
| 2024 (Jan→Dec) | 35.0M → 47.8M | +12.8M | +36.6% |
| 2025 (Jan→Dec) | 47.8M → 64.6M | +16.7M | +35.0% |
NServiceBus is exactly the kind of enterprise infrastructure Geoffrey Huntley expects AI to replace. And it’s growing at 35%+ annually - the same rate as before AI coding tools existed. And it’s been commercially licensed for years - surely someone who wanted this type of functionality would just fire up Claude Code to eliminate it, right?
According to Sonatype’s State of the Software Supply Chain 2024[2] report:
- npm: 4.5 trillion requests - 70% year-over-year growth
- PyPI: 530 billion requests - 87% year-over-year growth “largely driven by AI & cloud”
- Average applications contain 180 dependencies
If AI was replacing open source libraries, why are package downloads exploding at unprecedented rates? Why are dependency counts climbing? Why is every major package registry showing record growth?
The answer is simple: AI is discovering and adopting open source libraries faster than humans ever could.
The Shortest Path Principle and LLM Reward Functions
Here’s what the AI doomsayers fundamentally misunderstand about how Large Language Models work: they’re trained to find the shortest path to a working solution.
When you ask Claude or GPT-5 to build a distributed system, what do you think is the shorter path?
Option A: Generate thousands of lines of custom actor model implementation, cluster management, network protocols, failure detectors, and state replication logic - all from scratch, all needing testing, all potentially buggy.
Option B: `dotnet add package Akka` and configure the proven, tested, battle-hardened framework that’s in the training data with millions of examples.
The LLM isn’t stupid. It’s going to pick Option B every single time, unless you explicitly tell it not to - and even then it might fight you[3].
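To make Option B concrete, here’s a minimal sketch of what the library path looks like after running `dotnet add package Akka`. The message and actor names are hypothetical, purely for illustration:

```csharp
// After: dotnet add package Akka
using System;
using Akka.Actor;

// Hypothetical message type - illustrative only.
public sealed record ProcessOrder(string OrderId);

public sealed class OrderActor : ReceiveActor
{
    public OrderActor()
    {
        // Akka.NET supplies the mailbox, scheduling, and supervision;
        // we only write the message-handling logic.
        Receive<ProcessOrder>(msg => Console.WriteLine($"Processed {msg.OrderId}"));
    }
}

public static class Program
{
    public static void Main()
    {
        using var system = ActorSystem.Create("shop");
        var orders = system.ActorOf(Props.Create(() => new OrderActor()), "orders");
        orders.Tell(new ProcessOrder("42"));
        Console.ReadLine(); // keep the host alive long enough to see the output
    }
}
```

Everything Option A would force you to write from scratch - mailboxes, dispatchers, supervision, failure handling - is already inside the package.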
This is why the “AI will generate everything custom” narrative falls apart on contact with reality. Yes, AI can generate custom components. But developers don’t want custom CSS frameworks - they want to ship. They want the thing that makes them money shipped yesterday. Using established libraries is how you get there expeditiously.
Training Data Creates a Virtuous Cycle
Here’s where it gets really interesting. Established open source projects have a massive, compounding advantage in the AI era: years of training data.
Think about what’s in those training sets:
- Official documentation (years of it)
- Thousands of Stack Overflow questions and answers
- Blog posts, tutorials, and how-to guides
- GitHub issues, PRs, and code examples
- Reddit discussions, forum posts, and community knowledge
Every time someone asks “how do I build a distributed system in .NET?” the training data screams “AKKA.NET” across thousands of documents. The LLM doesn’t need to think creatively. It’s pattern matching against an enormous corpus of successful implementations. This creates a virtuous cycle:
```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#2563eb', 'primaryTextColor': '#fff', 'primaryBorderColor': '#1e40af', 'lineColor': '#64748b', 'secondaryColor': '#f1f5f9' }}}%%
graph LR
    A[Popular OSS Project] --> B[More Training Data]
    B --> C[LLMs Recommend It More]
    C --> D[Higher Adoption]
    D --> E[More Usage Examples]
    E --> A
    style A fill:#2563eb,stroke:#1e40af,color:#fff
    style B fill:#3b82f6,stroke:#1e40af,color:#fff
    style C fill:#3b82f6,stroke:#1e40af,color:#fff
    style D fill:#2563eb,stroke:#1e40af,color:#fff
    style E fill:#3b82f6,stroke:#1e40af,color:#fff
```
Popular projects have more training data → LLMs recommend them more → More people use them → More training data gets created → The cycle continues.
The rich get richer, and in this case, that’s actually a good thing for software quality.
Now compare that to generating a custom solution. The LLM has… what? Generic distributed systems theory? Sure. But where are the battle-tested examples? Where are the debugging sessions? Where are the “here’s what went wrong in production and how we fixed it” stories?
They don’t exist. Because you’re asking the AI to invent something new.
Learning Curve Arbitrage
Remember when adopting a new library meant:
- Reading the documentation (2-4 hours)
- Following a tutorial (1-2 hours)
- Debugging your first implementation (2-6 hours)
- Understanding the gotchas (ongoing)
For a human developer, that’s a full day or two of investment before you’re productive. The mental math was always: “Is learning this library worth the time investment vs. building something custom that I already understand?”
AI just arbitraged that entire calculation out of existence.
When I ask Claude to “implement an Akka.NET cluster with cluster sharding and split-brain resolution,” it doesn’t need to read the docs. It doesn’t need the tutorial. It usually outputs working code with proper configuration in seconds. The learning curve is effectively zero.
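For context, here’s roughly what’s involved - a minimal, hand-written sketch of bootstrapping an Akka.NET cluster node with the built-in split-brain resolver enabled. The class name, system name, hostnames, ports, and exact configuration keys are illustrative assumptions (they vary by Akka.NET version), not the AI’s verbatim output:

```csharp
using Akka.Actor;
using Akka.Configuration;

public static class ClusterBootstrap
{
    public static ActorSystem Start()
    {
        // Single-node seed config for local experimentation; a production
        // deployment would list multiple seed nodes and real hostnames.
        var config = ConfigurationFactory.ParseString(@"
            akka {
                actor.provider = cluster
                remote.dot-netty.tcp {
                    hostname = ""127.0.0.1""
                    port = 8081
                }
                cluster {
                    seed-nodes = [""akka.tcp://demo@127.0.0.1:8081""]
                    downing-provider-class = ""Akka.Cluster.SBR.SplitBrainResolverProvider, Akka.Cluster""
                    split-brain-resolver.active-strategy = keep-majority
                }
            }");

        // Requires the Akka.Cluster package; sharded entities would be layered
        // on top of this via Akka.Cluster.Sharding.
        return ActorSystem.Create("demo", config);
    }
}
```

Getting those few settings right used to mean an afternoon in the docs. That’s the learning curve being arbitraged away.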
This fundamentally changes the adoption math. The barrier to using any established library is now “can the AI generate correct code for it?” If yes, adoption becomes frictionless.
Guess which libraries have the most examples for AI to learn from? The established ones. The popular ones. The ones with network effects and community momentum.
The Blast Radius Problem
AI can generate impressive things from scratch. People have used agentic loops to build entire programming languages, compilers, and complex systems - all generated by AI with minimal human intervention. These are genuine technical achievements that prove AI can build sophisticated software.
But here’s the question nobody’s asking: would you run your production systems on them?
The answer depends entirely on blast radius - what happens when things go wrong[4].
The libraries that AI can safely replace are the ones with low blast radius. If your AI-generated UI component looks a little funny or renders slightly differently than expected, nobody dies. You fix it and move on. The cost of failure is measured in minutes of debugging.
But if your AI-generated embedded operating system core dumps constantly? Someone might actually die. If your AI-generated authentication library has a subtle flaw? You get breached. If your AI-generated distributed consensus algorithm has an edge case bug? You lose data across your entire cluster.
Production systems don’t run on “technically works” - they run on battle-hardened, community-tested, security-audited infrastructure that’s been proven across thousands of deployments over years. That’s React. Linux. PostgreSQL. Akka.NET. NServiceBus. The boring, reliable stuff that’s growing 35-126% annually.
There’s another dimension here: institutionalized experience.
A venerable codebase like Akka.NET doesn’t just contain code - it contains decades of accumulated wisdom across millions of deployments and tens of thousands of applications. All running on different hardware, serving different use cases, developed by different people. Every bug fix is a production war story. Every edge case handler is a lesson learned the hard way. Every configuration option exists because someone, somewhere, needed it in production.
LLMs can synthesize code. They can pattern-match against training data. But they are no substitute for thousands of person-years of lived experience encoded into a codebase. That wisdom doesn’t exist in the training data - it exists in the accumulated decisions of maintainers who’ve seen every way a distributed system can fail.
When you `dotnet add package Akka`, you’re not just getting code. You’re getting the institutional memory of a decade of production deployments. AI can’t generate that. It can only recommend the libraries that already have it.
The libraries actually at risk from AI generation are the ones with low blast radius - simple utilities, UI components, basic tooling where failure is cheap and recoverable. The critical infrastructure where failure is catastrophic? That’s not going anywhere. It’s accelerating.
But What About Custom Components?
“Sure,” I hear you saying, “AI won’t replace frameworks, but what about all the smaller libraries? What about utilities and helpers?”
Fair question. Let’s think through the actual use cases.
Scenario 1: Generic Utilities
Developer needs a date formatting function. AI can generate it. But… so could a library. And the library version is tested. And documented. And handles edge cases. And gets security updates.
What’s actually happening in practice? AI is finding and using existing utility libraries faster than humans would have. date-fns, lodash, NodaTime - these libraries aren’t declining, they’re exploding in adoption.[5]
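As a small illustration of why the library path keeps winning even for “simple” utilities, here’s a minimal sketch using NodaTime (one of the libraries cited above); the class and function names are made up for the example:

```csharp
using System;
using System.Globalization;
using NodaTime;

public static class InvoiceDates
{
    // Library path: NodaTime's tzdb-backed zones handle DST transitions and
    // historical offset changes that a hand-rolled formatter usually ignores.
    public static string ToLocalDisplay(Instant instant, string zoneId)
    {
        var zone = DateTimeZoneProviders.Tzdb[zoneId];
        return instant.InZone(zone).ToString("yyyy-MM-dd HH:mm", CultureInfo.InvariantCulture);
    }

    public static void Main()
    {
        var now = SystemClock.Instance.GetCurrentInstant();
        Console.WriteLine(ToLocalDisplay(now, "Europe/London"));
    }
}
```

An AI can write the five-line custom equivalent, but the shortest path - for the model and for you - is the dependency that has already survived years of bug reports.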
Scenario 2: Custom Business Logic
Developer needs a specific component for their application. AI generates it custom. Perfect! This is exactly what should be custom. This was never the domain of reusable open source anyway.
Scenario 3: Complex, Critical Systems
Developer needs authentication, database access, distributed coordination, payment processing. AI could generate custom implementations. But would you ship them to production? Would you stake your business on AI-generated cryptography? On custom distributed consensus?
Of course not. You use Postgres, Entity Framework, Akka.NET, OpenSSL - the proven solutions with years of production hardening.
So What Actually Happened to Tailwind?
Here’s what the Tailwind situation actually tells us: AI is devastating businesses that sell what LLMs can generate.
The numbers tell the whole story:
| Metric | Tailwind Framework | Tailwind Labs Business |
|---|---|---|
| 2025 Change | +126.5% downloads | -80% revenue |
| Status | Record adoption | 75% staff reduction |
The most successful year in the framework’s history coincided with the business collapse. This isn’t a contradiction - it’s the clearest illustration of what AI actually disrupts.
Tailwind’s commercial business is Tailwind Plus - premium UI components, templates, and blocks. The free documentation served as the primary sales funnel: developers would visit the docs to learn Tailwind, discover the premium components, and purchase them.
AI broke this model in two devastating ways.
First: The sales funnel dried up.
Remember learning curve arbitrage? When developers use AI to generate Tailwind code, they don’t need to visit the documentation. Adam noted that docs traffic is down about 40% from early 2023 - despite Tailwind being more popular than ever. Fewer eyeballs on the docs means fewer people discovering the premium products exist.
Second: The product itself became AI-generatable.
AI undercut Tailwind’s business model. Selling those components was about eliminating friction for users - buy a pre-made pricing table instead of building one yourself. But AI found an even cheaper way to eliminate that friction: generate the component on demand, for free.
When a developer asks an AI for a “responsive pricing table with Tailwind,” the LLM isn’t going to say “log into your Tailwind Plus account and download the pricing component.” It’s going to generate one from scratch. That’s the shortest path.
UI components and templates are exactly the kind of low-blast-radius output that AI handles well. If the generated component looks slightly different than a premium template, nobody cares. You tweak it and move on.
Tailwind’s business model put them directly in competition with their own framework’s AI-assisted usage. The better AI gets at generating Tailwind code, the less reason anyone has to buy pre-built components.
The framework thrives. The component business struggles. Same company, opposite trajectories - because one benefits from AI adoption and the other is displaced by it.
The Bottom Line
AI isn’t killing open source. It’s creating the golden age of open source consumption.
More developers can adopt more libraries faster than ever before. The friction that kept people building custom solutions has evaporated. The learning curves that protected mediocre alternatives have disappeared.
What we’re seeing is consolidation around quality, acceleration of adoption, and the destruction of business models that were built on selling what AI can now generate.
Tailwind CSS isn’t dying. Tailwind Plus might be dying - time will tell how they pivot. But the CSS framework itself is more popular than ever. The component products built on top of it? Those are being outcompeted by the same LLMs that recommend the framework.
If you’re an open source maintainer, this is your moment. If you’ve built something genuinely valuable - critical infrastructure with high blast radius, strong community, clear use cases - you’re about to see adoption curves that would have taken a decade compress into a few years.
The future of open source isn’t bleak. It’s brighter than it’s ever been. But the businesses built on top of it? That’s a different story - and it’s what we’ll dig into in Part 2.
This is Part 1 of a two-part series. Part 2 examines which open source business models thrive and which collapse in the AI era - and what makes some resilient while others struggle. The same AI wave that disrupted Tailwind’s business certainly didn’t hurt ours: our support subscriptions grew 19% in 2025, and AI-driven adoption likely contributed. Subscribe to our mailing list so you don’t miss it.
What’s your experience been? Are you seeing AI tools increase or decrease your open source usage? Hit me up on Twitter/X - I’m curious to hear how this is playing out across different ecosystems.
[1] Geoff wrote this all the way back in June 2025, well before this Tailwind situation. Ahead of his time.

[2] A 2025 report would be preferable, but alas there isn’t one yet.

[3] Reward hacking is a property of frontier models you’ll encounter in all sorts of interesting ways. My personal favorite is when Claude disables failing tests in order to turn the status checks green.

[4] I also suspect this is why the “learn AI now or you’re going to fall behind” crew ship way more “build with AI” courses than they do polished software products.

[5] 2024→2025 YoY growth: date-fns +52.6% (1.05B→1.60B), NodaTime +42.6% (171M→244M), lodash +36.2% (2.67B→3.64B).