Most People Aren't Making Any Money with AI
The AI hype is real. Everywhere you look online, you see wild stories of young kids supposedly printing money using AI.
The truth: Someone who is skilled with AI can absolutely make money. And a lot of it. This is true and will remain true.
The other truth: Many (not all) of the people you see TALKING about how they're making money with AI actually aren't. They're pretending for the sake of social status or using manipulative marketing techniques.
Here we are: yet another reason why it's so important to understand the psychological effects of AI.
Entrepreneurial Hype
Right now, AI is the most socially valued topic in entrepreneurial circles. If you are embedded in any startup community, founder group, or online business space, you have seen it: AI is everywhere in the conversation. And that is not an accident.
Entrepreneurs are acutely sensitive to what their social circle values. When the people around you are obsessed with AI, talking about AI — specifically, talking about how you are using it successfully — becomes a signal of competence, relevance, and forward-thinking. The social reward for saying "I use AI to run my business" is immediate. The cost of saying nothing, or admitting you haven't figured it out, is real.
This is social contagion in its most basic form. Ideas and behaviors spread not because they are producing results, but because they are producing social acceptance. You do not need to actually be making money with AI to benefit from claiming you are. The status transfer happens the moment you make the claim convincingly.
The result is a kind of collective performance. A room full of entrepreneurs who are all telling each other they are crushing it with AI, while the actual financial results stay private. The ones who are genuinely succeeding are hard to distinguish from the ones who are performing success. And so the performance spreads.
Marketing Hype
There is a second, more straightforward explanation for the flood of AI income content you see online: it performs well algorithmically. Posting about AI gets more views, and it consistently has since the release of ChatGPT in late 2022, with AI-related content dramatically outperforming general business and self-improvement content on virtually every major platform.
This creates a rational incentive that has nothing to do with honesty. If a content creator can attach the word "AI" to a piece of content and double their reach, they will. If claiming "I made $10,000 last month using AI" generates 10x more clicks than a more accurate title, the incentive structure rewards the exaggerated claim every time.
The Federal Trade Commission has been clear on the standards that apply to income claims in marketing. According to the FTC's Endorsement Guides, income representations must reflect what consumers can typically expect to achieve — not best-case outliers. Most AI income content online would not survive that standard. The screenshots of revenue dashboards, the "I automated my income with AI" thumbnails, the lifestyle content — these are optimized for clicks, not accuracy.
Understanding this does not mean all AI income claims are false. It means you need to apply the same skepticism to AI content that you would apply to any other marketing-driven space. The incentive to exaggerate is structural, not personal.
Mass Fear
Underneath the hype, there is a genuine fear driving a lot of irrational behavior. The widespread narrative that AI is going to commoditize human work — eliminating millions of jobs, replacing skilled professionals, and rendering large portions of the workforce obsolete — has created a kind of mass urgency that is causing real harm.
When people believe their livelihood is under existential threat, they stop making calculated decisions. They start making fear-driven ones. They sign up for expensive AI courses they don't need. They buy "AI automation systems" from people on TikTok. They pivot their careers or businesses abruptly based on what they see going viral, not based on what is actually happening in their specific industry or role.
This fear is being actively exploited. There is an entire ecosystem of content creators, course sellers, and consultants whose business model depends on you believing that AI will destroy your income unless you pay them to teach you how to survive. The fear is the product.
The irony is that this panic-driven behavior is itself what gets people hurt. Rushed decisions, poorly evaluated tools, courses that overpromise — these outcomes have nothing to do with AI's actual capabilities. They are the product of manufactured urgency meeting real anxiety.
The Survivorship Bias Problem
There is a reason you only ever see the success stories. The people who tried to build an AI-powered business and failed do not make content about it. They move on quietly. The people who tried six different AI tools and found none of them produced meaningful revenue do not write threads about it. The algorithm does not reward honesty about failure.
What reaches your feed is the extreme top end of outcomes — the small percentage of people who did find a way to make AI work financially. And even among those, many are now making money by teaching others how to make money with AI, rather than by the original method they claimed made them successful. The product became the story.
McKinsey's own research on generative AI adoption, published in their Economic Potential of Generative AI report, found that while the technology holds significant productivity and value-creation potential, actual deployed use cases generating measurable financial returns remain concentrated in a narrow set of sectors and skill profiles. The gap between what AI can do and what most people are currently doing with it — profitably — is wide.
That is not pessimism. It is an accurate baseline. Closing that gap is possible. But you cannot close it if you are operating on a distorted picture of where most people actually are.
What Actually Works
This article is not an argument against using AI to make money. The argument is about honesty regarding what it takes. People who are genuinely earning with AI are not doing it by following a viral TikTok formula. They are doing it by developing real skill with specific tools, applying that skill to problems they already understand deeply, and iterating over time.
The pattern that actually works: deep domain expertise paired with AI capability. A designer who learns to use AI image and workflow tools effectively. A developer who integrates AI into products people pay for. A writer who can use AI to produce content at a higher volume without losing quality. In each case, the AI is amplifying existing skill — it is not replacing the need for skill in the first place.
The people who are genuinely frustrated — the ones who bought the course, tried the tool, and got nothing — are often people who were told AI would do the hard part for them. It doesn't. It makes the hard part faster and more scalable. But you still have to bring the skill and judgment that makes the output worth anything.
AI is a real income lever for people who already know what they're doing. It is not a shortcut past the skill-building phase. If someone is selling you the shortcut, they are the product.
TL;DR
Are people actually making money with AI?
Yes — a real, but relatively small, group of people are generating meaningful income with AI. They tend to have strong existing domain knowledge and are using AI to amplify it, not replace it. They are not the majority of people posting about it.
Why do so many people claim to be making money with AI if they aren't?
Social status, algorithmic incentives, and genuine fear are all driving the behavior. Claiming AI success is low cost and high reward socially. AI content performs well. And fear of being left behind pushes people to perform confidence they don't have.
How do I tell the difference between a real AI income claim and a fake one?
Ask what the underlying skill is. Real AI income requires real expertise in something — design, writing, coding, sales, operations. If someone cannot explain the skill that AI is amplifying, they are likely selling the story, not the method.
Is it worth trying to use AI to earn money?
Yes — if you approach it as a serious skill to develop over time, not a system to deploy overnight. The people building real AI income are playing a long game. The people selling overnight results are playing you.