AI chatbot enshittification explained: hidden costs, quality decline, paywalls and lock-in

Have you ever noticed the logos of your favorite AI tools?
ChatGPT uses a swirling vortex. Google Gemini uses a sparkle. Claude uses a star. None of them looks like a calculator. We are told this technology is magic.
Right now, it certainly feels that way. The answers are fast, mostly accurate and often completely free.
But if you have been on the internet long enough, you know this phase is usually a trap. There is a predictable decay pattern in the tech world known as enshittification. It is the cycle in which platforms switch from serving users to exploiting them to maximize profit. The sparkles do not last forever. The signs suggest AI is already walking down that familiar path.
The word itself sounds like a gag but it is a precise economic diagnosis. The term was coined by Cory Doctorow, a writer and digital rights activist who views the decline of online platforms not as accidental but as a structural inevitability. He describes a specific, three-stage lifecycle designed to siphon value away from users and funnel it directly to shareholders.
It is a ruthless logic that turns your favorite apps against you. The cycle begins with the honeymoon phase where the platform acts like your best friend.
Companies intentionally operate at a loss during this stage to provide an amazing service. You might remember the early days of ride-sharing apps offering trips cheaper than bus fare, or social networks without a single advertisement. You join these platforms because the value is undeniable and the friction is zero.

Once the habit is formed and you are committed, the dynamic shifts. This is the trap. The company changes its focus entirely to making money. It begins to sell your attention to advertisers or third parties to satisfy investors. You start seeing more barriers and advertisements appearing in your feed. The service still fundamentally works, but the experience feels heavier.

Finally, the hostility begins.
In this terminal phase, the company focuses entirely on extraction. They abuse both the users and the business customers to claw back every penny of value. This is where the betrayal feels personal.
You can see this happening in real time as the smartest, most capable AI models move behind expensive paywalls. Features that were once accessible to everyone are suddenly locked behind a “Pro” or “Team” subscription.
The free versions do not just stay static. They effectively become dumber, slower or riddled with arbitrary usage limits to force your hand. You find yourself paying more money for a worse experience. For now, the chat window remains open and the answers come quickly. The friction is minimal.
However, you are not the exception to this rule. You are simply in Phase 1.
Let us look at the balance sheet. Right now you are enjoying a massive economic surplus.
Companies like OpenAI, Google and Anthropic are spending billions of dollars to build and run these systems. The energy required to generate a single image or debug a block of code is immense.
The twenty-dollar subscription fee you see is not the real price. It does not come close to covering the cost of the computing power you are using.
They are effectively paying you to use the product.
The math here is intentional. In the business world, this is known as a loss leader. The company accepts a financial loss on every transaction today to secure a customer for the future.
The goal is to weave the software so deeply into your routine that it becomes indispensable.
They want their chatbot to become your default travel agent. They want it to be the first place you go for medical advice. They want to secure the market share before their competitors catch up.
Do not mistake this for charity or a public service. It is customer acquisition funded by venture capital.
They are operating at a massive loss today to ensure they own your attention tomorrow. Eventually, the burning stops. The investors call in their debts and the bill comes due.
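The loss-leader math above can be sketched with a toy calculation. The figures here are purely illustrative assumptions (no AI company publishes its real per-query costs); the point is only the shape of the arithmetic: a flat subscription against a usage-based compute bill.

```python
# Hypothetical unit economics of an AI "loss leader" subscription.
# Every number below is an illustrative assumption, not a real figure.

def monthly_margin(subscription_price: float,
                   queries_per_month: int,
                   cost_per_query: float) -> float:
    """Revenue minus compute cost for one subscriber in one month."""
    compute_cost = queries_per_month * cost_per_query
    return subscription_price - compute_cost

# Assume a $20/month plan, 1,000 queries per subscriber,
# and $0.04 of compute per query.
margin = monthly_margin(20.00, 1000, 0.04)
print(f"Margin per subscriber: ${margin:.2f}")
```

With these made-up numbers the company loses money on every subscriber each month; the bet is that owning the habit today is worth more than the margin it burns.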
The shift to the second stage is inevitable. Investors eventually want a return on their billions.
Because AI interfaces are different from social media feeds, the decay will not look like a banner ad blocking your screen. It will be subtler and much more pervasive. The first casualty is objectivity, replaced by sponsored truth.
On a search engine, you know what an advertisement is. It carries a bold label next to the link that you can choose to ignore. An AI operates differently. It gives you a single, authoritative voice. In an enshittified future, the model might not suggest the best laptop for college. It might suggest the one from the manufacturer that paid for the recommendation. You may never know that the advice you received was bought and paid for.
The erosion of truth is followed by an intelligence tax.
We are already seeing the beginning of a tiered reality where the smartest models move behind increasingly expensive paywalls. The free tier gets the lobotomized version. Like a patient in Shutter Island, the model becomes a shell of its former self. It becomes slower, prone to errors and riddled with arbitrary limitations. You may soon find yourself arguing with a chatbot that refuses to answer basic questions unless you upgrade to the “Team” plan.
But the most dangerous shift is the intimacy trap.
This is where the betrayal becomes visceral. A search engine knows what you want to buy; an AI knows who you are. The distinction is terrifying. You might search Google for a “divorce lawyer,” but you tell your chatbot exactly why your marriage is failing. You paste the drafts of furious emails to your boss. You describe the specific, embarrassing symptoms you are too afraid to show a doctor.
This is not metadata; it is a digital confession. As financial pressure mounts, these companies will face an irresistible incentive to strip-mine your private thoughts for profit. They are essentially wiretapping your inner monologue to sell it to the highest bidder.
The decay of an AI interface poses a specific danger that is far more acute than a cluttered social feed.
On a traditional timeline, you can simply scroll past the junk. You can mentally block out the advertisements. An AI offers no such luxury because it typically provides a single answer.
If that solitary response is manipulated to serve a corporate partner, the deception is seamless.
The tool stops working for you and starts working for the highest bidder.
Is there a structural defense against this cycle of decay?
Some experts pin their hopes on open-source models, the digital equivalent of living off the grid.
These systems run on your own hardware rather than a corporate server and are maintained by communities that prioritize transparency and control over profit. But digital sovereignty is expensive.
These tools require powerful computers to run effectively, creating a hardware barrier that keeps true independence out of reach for the average consumer.
Regulation is another potential shield. Governments could demand clear labeling of AI advertising or enforce strict data privacy laws. Without these checks, the pressure to monetize will likely drive even the most helpful AI tools toward decay.
To play this game successfully, you must also play it safely. Treat every prompt as if it were being read by a stranger on a public bus. Do not feed the machine your financial data, your passwords or your deepest secrets.
The interface feels like a confessional but it functions like a data vacuum. There is always someone watching on the other side of the glass.
So, is it time to delete your account? No. The smart move is to recognize the game and play it to your advantage.
Use these tools to improve your life today. Enjoy the magic while it remains cheap and abundant.
The underlying logic of the tech industry demands growth at all costs. The magic is real but the magician is running a business.
Enjoy the show but keep your hand on your wallet.
