01 The Triggering Event

In 2025, LatePost interviewed Wu Minghui, founder of Mininglamp, and he offered some very specific numbers: he personally spends around $5,000 per week on Agents; he once spent $10,000 on tokens in two weeks; Mininglamp's internal "Octo Octopus" core team of fewer than 10 people hit a single-day token spend of $37,000 at its peak; individual engineers have burned through $8,000 in tokens in a single day; and this small team produced nearly one million lines of code over two months.

More importantly, he offered a judgment: the value of the software "shell" is approaching zero, closed-source is a death sentence, and future pricing will shift from person-days and person-months to token costs plus a management fee.

This is not a founder's slogan.

This is someone who has spent 20 years in enterprise software — someone who lived through the AI misjudgments and organizational contraction of 2020 to 2022 — redefining the cost structure and value capture of enterprise software.

I haven't run Octo inside Mininglamp, and I can't verify how much of those "nearly one million lines of code" is maintainable production code. But these numbers are already sufficient to make the point: tokens are no longer an experimental budget. They are becoming a department-level, product-level, and even P&L-level operating line item.

The value of software itself has nearly vanished. One hundred million tokens can roughly replicate what others spent billions building over years. In the past, everyone charged by person-month or person-day. In the future, it will become token costs plus a management fee.

02 What This Actually Means

On the surface, this looks like "AI is killing SaaS."

But what is actually happening is not that SaaS disappears — it's that the SaaS value stack is being unbundled.

The pricing logic of enterprise software in the past worked roughly like this: encode complex processes into UI, permissions, reports, and database schemas, then lock customers in through implementation and switching costs. You were selling a finished product — clunky, perhaps, but controllable, auditable, and deployable.

Now Agents have split that logic into three layers.

The first layer is model access — tokens themselves.
The second layer is workflow orchestration: multi-Agent routing, permissions, context injection, tool calls, and auditing.
The third layer is what was always the most visible part of SaaS — the shell: interfaces, forms, workbenches, and reports.

When Wu Minghui says "the software shell's value is approaching zero," he is not saying software has no value at all. He is saying the third layer is being rapidly commoditized. What will actually command a price is the second layer: who can orchestrate people, Agents, context, permissions, and objective functions together.

That is what he is really talking about.

If this judgment holds, the moat for enterprise software companies will also migrate. It will no longer be primarily about feature depth, but about four things:

  1. Distribution within real workflows
  2. Long-accumulated organizational context and permission graphs
  3. Routing capability across token cost, quality, and latency
  4. Management capability that connects outcomes to business accountability chains

"Token costs plus a management fee" sounds a lot like consulting, and a lot like cloud.

Cloud converted server procurement CAPEX into pay-as-you-go OPEX. Agents are converting a portion of knowledge work's human OPEX into token OPEX plus an orchestration fee.

There is one thing I may be underestimating: the data models and compliance barriers of certain vertical SaaS products may not collapse as quickly as he suggests. Especially in finance, healthcare, and government enterprise, the "shell" actually encodes institutionalized processes — not just a frontend interface.

But the direction is clear: software used to extract a premium on workflow digitization. Going forward, platforms will extract a premium on "letting models do the work without things going wrong" — a management premium.

03 Historical Analogies and Structural Parallels

This looks more like AWS circa 2014 than ChatGPT in 2022.

ChatGPT was a demand-side shock: everyone suddenly realized that natural language interfaces were usable.
Signals like Mininglamp's are a supply-side and organizational shock: enterprises are starting to treat tokens as raw materials, Agents as labor, and internal collaboration systems as production lines.

The most important change AWS brought was not simply "servers got cheaper."
It was that startups could, for the first time, skip buying machines entirely and call compute on demand, turning infrastructure into an API. The result was a cliff-drop in the formation cost of software companies, faster startup velocity, and an explosion in software supply.

What Agents are doing to SaaS today looks a lot like that process.
It is not simply replacing a few job roles. It is driving down the formation cost of "building a working enterprise software workflow." Previously, you needed a PM, several frontend and backend engineers, QA, implementation, training, and operations just to encode a customer requirement into a product module. Now many of those steps can be compressed into tokens and a small number of high-leverage people.

So I prefer to call this the API-ification moment of enterprise software.

There is another, more fitting analogy: the iPhone versus carriers.
Carriers used to control distribution, and handset makers orbited around their channels. After the iPhone, what was truly valuable was the operating system, the developer ecosystem, and the user entry point. Carriers were demoted to "data pipes."

Similarly, if traditional SaaS continues to understand itself as a "business process UI provider," it risks being demoted to a data and permissions pipe.
The value above it will be captured by Agent runtimes, workflow operating systems, and model routing layers.

I cannot confirm whether Octo will ultimately become this kind of runtime-layer product, because internal tools rarely externalize as smoothly as imagined. Products like Slack, Notion, and Linear expanded not just because they were used internally, but because the internal experience was replicable. Enterprise collaboration systems, by contrast, are often deeply tied to company culture and org design.

But looking at historical patterns: when the underlying cost curve shifts abruptly, what collapses first is usually not demand — it is the existing extraction model.

04 What This Means for AI Builders

If I were an AI builder reading this interview, I would adjust four things in the short term.

First, recalculate unit economics.
Stop treating model costs as "R&D experimental expenses." If you are already in real delivery, tokens are COGS. Prompt caching, batching, routing, fallback, context truncation, tool call rate limiting — if you haven't done these, do them now.
Wu Minghui's team of fewer than 10 people spending $37,000 per day on tokens shows that high-intensity Agent development will quickly flip the question of "are people expensive, or are models expensive?"
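To make the COGS framing concrete, here is a minimal sketch of per-request token cost accounting with a prompt-cache discount. The per-million-token prices and the 10% cache rate are hypothetical placeholders, not any provider's actual rates.

```python
from dataclasses import dataclass

# Hypothetical per-million-token prices; substitute your provider's real rates.
PRICE_PER_M_INPUT = 3.00
PRICE_PER_M_OUTPUT = 15.00
CACHE_DISCOUNT = 0.10  # assume cached input tokens bill at 10% of the input rate

@dataclass
class Request:
    input_tokens: int
    cached_tokens: int   # subset of input_tokens served from the prompt cache
    output_tokens: int

def request_cost(r: Request) -> float:
    """COGS for one Agent call: uncached input + discounted cached input + output."""
    uncached = r.input_tokens - r.cached_tokens
    return (
        uncached * PRICE_PER_M_INPUT / 1_000_000
        + r.cached_tokens * PRICE_PER_M_INPUT * CACHE_DISCOUNT / 1_000_000
        + r.output_tokens * PRICE_PER_M_OUTPUT / 1_000_000
    )

def daily_cogs(requests: list[Request]) -> float:
    """Roll individual calls up into a single operating line item."""
    return sum(request_cost(r) for r in requests)

# A long system prompt that is mostly cache hits costs far less than it appears:
hot = Request(input_tokens=50_000, cached_tokens=45_000, output_tokens=2_000)
cold = Request(input_tokens=50_000, cached_tokens=0, output_tokens=2_000)
print(f"warm cache: ${request_cost(hot):.4f}, cold: ${request_cost(cold):.4f}")
```

Even in this toy version, caching cuts the warm-path cost to roughly a third of the cold path, which is why it sits first in the list above.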

Second, stop obsessing over a complete SaaS shell. Prioritize workflow control points.
That means: approvals, permissions, records, replay capability, context management, tool integration, and team collaboration.
The reason is simple: model capabilities will keep fluctuating, frontend styles will be copied, and what is genuinely hard to replace is whoever has embedded themselves into the enterprise's daily action chain. In other words, switching costs no longer come primarily from data migration — they come from organizational migration.

Third, shift product packaging from seat logic to output logic.
If customers already perceive that "one engineer burning $8,000 in tokens per day is still worth it," that means the market is accepting output leverage, not seat count.
You don't necessarily have to bill by token, but your sales narrative, pricing model, and margin management all need to be able to explain token pass-through and orchestration premium.
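As a toy illustration of what "token costs plus a management fee" could look like on an invoice, here is a sketch assuming a simple pass-through-plus-markup structure; the 20% orchestration rate is an invented number, not a recommendation.

```python
def monthly_invoice(token_cost_usd: float, orchestration_rate: float = 0.20) -> dict:
    """Pass token spend through at cost, then add a management/orchestration fee.

    The 20% default is hypothetical; real pricing would also need floors,
    caps, and committed-use tiers to protect margin.
    """
    fee = token_cost_usd * orchestration_rate
    return {
        "token_pass_through": round(token_cost_usd, 2),
        "orchestration_fee": round(fee, 2),
        "total": round(token_cost_usd + fee, 2),
    }

# A customer whose Agents burned $37,000 in tokens would see both lines:
print(monthly_invoice(37_000.0))
# {'token_pass_through': 37000.0, 'orchestration_fee': 7400.0, 'total': 44400.0}
```

The point of splitting the two lines is exactly the narrative requirement above: the customer can audit the pass-through, and you can defend the premium as management, not markup on raw tokens.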

Fourth, reassess the role of open source.
Wu Minghui's claim that "closed-source is a death sentence" carries some emotional charge, but it has a real-world implication for the application layer: relying solely on a closed-source UI or proprietary workflow makes it very hard to resist pressure from underlying models and open-source Agent frameworks.
A more reasonable strategy may not be fully open or fully closed, but rather open-sourcing the replaceable shell while placing the real moat in hosted services, data connectors, auditing, deployment, SLAs, and organizational network effects.

For model API consumers, this is also a reminder: high-value customers in the future will not want "the most powerful model" — they will want "the most stable, most cost-efficient, most controllable combination of models and routing." This is actually a tailwind for gateway platforms. Models are increasingly like electricity. What actually needs to be managed is load, switching, peak-and-trough, and failure.
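A minimal sketch of the gateway behavior described above: route to the cheapest model under the caller's cost ceiling, and fall back to the next route on failure. The model names and prices are hypothetical, and the provider call is a stand-in.

```python
# Hypothetical routing table ordered cheapest-first; names and prices are made up.
ROUTES = [
    {"name": "small-fast",  "usd_per_m_tokens": 0.50},
    {"name": "mid-general", "usd_per_m_tokens": 3.00},
    {"name": "frontier",    "usd_per_m_tokens": 15.00},
]

def call_model(name: str, prompt: str) -> str:
    """Stand-in for a real provider call; raises to simulate an outage."""
    if name == "small-fast":
        raise TimeoutError("simulated provider failure")
    return f"[{name}] answer to: {prompt}"

def route(prompt: str, max_usd_per_m: float) -> str:
    """Try affordable models first; on failure, fall through to the next route."""
    last_err = None
    for r in ROUTES:
        if r["usd_per_m_tokens"] > max_usd_per_m:
            continue  # over the caller's cost ceiling, skip entirely
        try:
            return call_model(r["name"], prompt)
        except Exception as e:
            last_err = e  # record the failure and try the next model
    raise RuntimeError(f"all routes failed or over budget: {last_err}")

# The failing cheap model is skipped transparently; the caller never sees it.
print(route("summarize this contract", max_usd_per_m=5.0))
```

A production gateway would add latency and quality signals to the routing decision, but the load/switching/failure management the paragraph describes is already visible in this shape.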

I may be wrong here too.
If frontier labs quickly productize long context, tool calls, Agent memory, and enterprise connectors, many middle layers will be swallowed directly. The window for builders to capture value will be narrower — more like a reseller than a platform.

05 Counterarguments and Risks

The point in my own judgment most worth challenging is this: at its core, this interview is the narrative of a founder who is highly excited and has a strong incentive to frame a transformation story.

Excitement is not evidence.

First, spending a lot on tokens does not mean real economic value has been created.
"Ten thousand dollars creating ten billion dollars in value" reads more like a founder's subjective estimate than an auditable ROI. The most common mistake AI teams make right now is confusing a sense of velocity with output, confusing parallel generation with organizational capability, and confusing line count with product completeness.

Second, multi-Agent collaboration is not inherently superior to a single Agent.
Wu Minghui's analogy of MoA to MoE is thought-provoking, but may also be an overextension. The experts in MoE are trainable, measurable, and optimizable. Multi-Agent networks, by contrast, frequently suffer from coordination overhead, hallucination propagation, unclear accountability, and evaluation difficulty.
I have not seen long-term stability data from his system, so "emergent collective intelligence" is currently more of a promising pattern than a settled conclusion.

Third, "the software shell's value trends toward zero" may be too aggressive for many industries.
In information-dense contexts like advertising, retail, and analytics, Agents can cut in more easily.
But in ERP, finance, healthcare, supply chain, and industrial software, the shell was never just an interface — it is process constraints, compliance mapping, exception handling, and accountability assignment. Those things are unlikely to be quickly flattened by tokens.

Fourth, "closed-source is a death sentence" is probably wrong.
More precisely: closed-source generic apps face greater danger, but closed-source systems with proprietary data, strong distribution, deep service ecosystems, and compliance advantages can still thrive.
What will actually die is not closed-source per se, but the middle layer that has neither exclusive inputs nor organizational embeddedness.

Finally, there is a more practical question: can human organizations actually sustain this pace?
If token costs exceed labor costs, the management challenge is no longer "should we adopt AI?" — it becomes "who has permission to burn how many tokens, toward which objective, and who is accountable when things go wrong?"
In other words, what AI may kill first is not SaaS, but the old budgeting system and middle-management methods.
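One can imagine what that new budgeting primitive might look like mechanically: a per-objective token budget with a named accountable owner. This is a hypothetical sketch; the objective, owner, and limit are all invented.

```python
class TokenBudget:
    """Hypothetical per-objective token budget with a named accountable owner."""

    def __init__(self, objective: str, owner: str, daily_limit_usd: float):
        self.objective = objective
        self.owner = owner                  # who answers when things go wrong
        self.daily_limit_usd = daily_limit_usd
        self.spent_usd = 0.0

    def authorize(self, cost_usd: float) -> bool:
        """Approve spend only while the objective's daily limit holds."""
        if self.spent_usd + cost_usd > self.daily_limit_usd:
            return False  # blocked: this should escalate to the owner, not just log
        self.spent_usd += cost_usd
        return True

budget = TokenBudget("contract-review-agent", owner="jane@example.com",
                     daily_limit_usd=500.0)
assert budget.authorize(480.0)       # within the daily limit
assert not budget.authorize(40.0)    # would exceed $500/day: escalate to owner
```

The interesting part is not the arithmetic but the fields: objective and owner are exactly the two things traditional seat-based budgeting never had to record.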

This is also the point I found most striking after reading the full interview.
The issue is not that AI makes software cheaper.
The issue is that once software and execution become cheap enough to approach infinite supply, the thing that is truly scarce — the thing that will be repriced — is organizational judgment.