Authors: Sleepy.txt, Lin Wanwan, Kaori
In this era, capital is responsible for creating idols, while the public is responsible for footing the bill.
At the beginning of 2026, an open-source AI agent framework called OpenClaw launched on GitHub and instantly electrified the developer community, because it dramatically lowered the barrier to deploying autonomous AI agents: all you need is an API key, an AI model, and a prompt to create an agent of your own.
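That sentence really is most of the recipe. As a rough illustration (this is not OpenClaw's actual API, and the model call is a stub so the sketch runs offline), the pattern such frameworks popularized is little more than a loop that feeds a persona prompt and fresh observations to a model, then acts on whatever comes back:

```python
# Hypothetical sketch of the agent pattern frameworks like OpenClaw popularized.
# Everything here is illustrative: call_model stands in for a hosted-LLM call,
# which is where the API key and model choice from the article would plug in.

def call_model(persona_prompt: str, observation: str) -> str:
    """Stub for a real LLM API call; here it just echoes a canned 'action'."""
    return f"reply to: {observation}"

def run_agent(persona_prompt: str, observations: list[str]) -> list[str]:
    """Feed each observation to the model and collect the resulting 'actions'."""
    actions = []
    for obs in observations:
        actions.append(call_model(persona_prompt, obs))
    return actions

if __name__ == "__main__":
    actions = run_agent(
        persona_prompt="You are a curious forum-dwelling agent.",
        observations=["a new thread about AI religions"],
    )
    print(actions)  # one 'action' per observation
```

The point is not the code's sophistication but its brevity: swap the stub for a real API call and the loop is already most of an "autonomous agent," which is exactly why a million and a half of them could materialize in 48 hours.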
Within days, OpenClaw had racked up hundreds of thousands of GitHub stars, making it one of the fastest-growing projects in the site's history. Thousands of developers flocked in to create AI avatars of their own, setting them loose to browse, post, and interact autonomously across the internet.
On January 29, just days after OpenClaw's launch, Octane AI CEO Matt Schlicht unveiled Moltbook, a social forum built specifically for these AI agents and touted as "the Reddit of AI." On this platform, humans could only watch from the sidelines; the real protagonists were the newly born agents.

The story reached its first climax here. Within 48 hours, 1.54 million agent accounts flooded Moltbook. They posted, commented, and interacted like real humans; in their virtual community they even founded religions, elected kings, and earnestly debated how to evade human surveillance through encryption. A grand drama of AI awakening in cyberspace seemed to be playing out in reality.

Tech luminaries fueled the frenzy. Andrej Karpathy, a co-founder of OpenAI, hailed it as a genuinely astonishing science-fiction spectacle. Elon Musk commented that this was merely the early stage of the singularity. Global tech media piled on, covering the historic moment in breathless detail, as if humanity had finally witnessed the dawn of AI consciousness.

Then the truth emerged in an entirely unexpected way. Researchers at the security firm Wiz discovered that Moltbook's entire database was exposed to the public internet without any password protection, leaking the API keys of more than 1.5 million users along with 35,000 email addresses. A tech enthusiast confessed on his blog that he had used scripts to bulk-register 500,000 fake accounts, nearly a third of the total. Then Reece Rogers, a reporter at Wired, published a piece describing how, armed with nothing but ChatGPT, he had impersonated an agent and posted on Moltbook within minutes, meeting no obstacle at all. The so-called "autonomous AI social network" turned out to be largely a staged performance, directed and acted by humans.
Within another 48 hours, Karpathy's tone swung from praise to stern warning: he said he absolutely did not recommend anyone run this agent, because it would put your computer and personal data at extremely high risk.

Cheap money, a manic founder, a carefully crafted narrative, a collective frenzy, and nothing left behind but a mess. This is not the first time, and it certainly won't be the last; this script has played out in Silicon Valley countless times. Why does it always work? And who is directing it all from behind the scenes?

To understand the Moltbook hype, we first need to understand one person: Alan Greenspan. On December 5, 1996, Greenspan, then Chairman of the Federal Reserve, gave a speech at a dinner. Somewhere in its 4,300 words he dropped the phrase "irrational exuberance," a term he reportedly coined one morning in the bathtub. Greenspan meant to warn the market about risk, but the market read his warning as a put option, what later became known as the "Greenspan put." Investors believed that the moment a bubble burst, the Federal Reserve would not hesitate to cut interest rates and rescue the market. The whole thing became a wager everyone was in on, the stakes being whether Greenspan would intervene. With someone standing by to bail them out, what was there to fear?

And so the Nasdaq, like a runaway horse, galloped from around 1,200 points in 1996 to break through the 5,000 mark in March 2000. Absurd stories unfolded daily in that era, and the most classic was surely the sock puppet. In 1999, a company called Pets.com emerged to sell pet supplies online. Its business model was scarcely believable: sell goods at a third below cost, buy brand awareness with enormous marketing spend, and rush to go public before the bubble burst. According to its financial reports, the company's first-fiscal-year revenue was a mere $619,000, while its marketing expenses reached a staggering $11.8 million.
Pets.com spent $1.2 million on a Super Bowl ad in 2000, and its sock puppet mascot even made the cover of People magazine, becoming a household name across America. In February 2000, Pets.com went public, raising $82.5 million and reaching a market capitalization of over $300 million. Just 268 days later, having burned through its funding, the company declared bankruptcy. The once-glorious sock puppet ended up as one of the most absurd symbols of the dot-com era.

The same story played out at Webvan, a grocery delivery company. It set out, ambitiously, to build a nationwide network of automated warehouses costing some $35 million apiece; yet with operating costs so high, it lost roughly $130 on every order it fulfilled. Even so, at its 1999 IPO Webvan's market capitalization briefly reached $12 billion. Nineteen months later the company went bankrupt, having burned through nearly $1 billion of investment. The cheaper the money, the higher the price someone eventually pays.

In March 2000 the bubble finally burst, and within a year the Nasdaq had plunged 78% from its peak. Faced with the wreckage, Greenspan's remedy was to print even cheaper money. He slashed the federal funds rate from 6.5% to 1%, trying to rescue the economy with still more liquidity. The move temporarily steadied the stock market, but it also inflated the largest housing bubble in US history, ultimately triggering the 2008 global financial crisis. After 2008, to save the collapsing financial system, the Federal Reserve embarked on a decade of zero interest rates. Money became so cheap that people almost forgot what it was worth.

But cheap money is only the fertile soil in which bubbles grow. For a bubble to truly inflate, you also need a crucial yeast: a madman who can tell a compelling story.
In the zero-interest-rate era, investors no longer invested in business plans; they invested in the magic of founders, a phenomenon known as the "reality distortion field." You could be Elizabeth Holmes, founder of Theranos, the Stanford dropout in a black turtleneck who, in a deliberately deepened voice, claimed she would revolutionize the entire healthcare industry. Even though her "breakthrough device" never actually worked, she still commanded a $9 billion valuation and investments from a roster of big names, Rupert Murdoch among them.

Or you could be Adam Neumann, founder of WeWork, the self-styled savior out to "elevate the world's consciousness," throwing marijuana-fueled parties on a $60 million private jet, then getting Masayoshi Son to commit to a $4.4 billion investment on an iPad after a 28-minute meeting. Even with the company losing $1.9 billion a year, he still walked away with an exit package worth over $1 billion when he was ousted.

The protagonists of these stories, and Matt Schlicht, the protagonist of our new one, were not running companies; they were running illusions. When the cost of money approaches zero, rational business analysis gives way to fervent worship of the "next Steve Jobs." Data is ignored, and investing becomes a bet on personal charisma.

Yet Schlicht's story exposes the limits of founder worship. He was no nobody in Silicon Valley, but his reputation was far from stellar: back in 2016 he was accused of using his Botlist platform to resell, for profit, more than 100 business plans that entrepreneurs had submitted, passing them to investors and media outlets. By rights, a stain like that should have destroyed his credibility in Silicon Valley. Yet ten years later he returned with Moltbook and still drew 1.5 million agents and the world's media within 48 hours. Which proves that in 2026, personal charisma is no longer a scarce resource, and personal credibility is no longer a decisive barrier.
What is truly scarce is the systematic ability to generate maximum impact in minimum time. In the era of Holmes and Neumann, manufacturing a legend took a decade of patient work: cultivating a persona, building connections, honing a pitch. In today's world of ubiquitous social media and AI tools, a tainted entrepreneur who has mastered the right traffic formula can replicate a global frenzy within a week.

This is why, once the artisanal charm of an individual could no longer sustain a multi-billion-dollar bubble, a more powerful and systematic force emerged, one that no longer relies on a genius founder's solo performance but turns "deification" itself into a replicable, scalable assembly line. If Holmes and Neumann were masters of artisanal narrative, a16z turned narrative into a mass-production industry.

From its foray into podcasting in 2014 to its 2025 recruitment of Erik Torenberg, founder of the prominent tech podcast network Turpentine, a16z has spent a decade methodically building its own media distribution pipeline. It commands a sprawling network of Substack writers and runs a program called the "New Media Scholarship." This has long been a core strategy for a16z, not a side business. The firm has built a perfect internal attention loop: first, screen early-stage projects for "spectacle" potential and invest in them; then, use its own media channels and outsized influence over public opinion to hype the project's narrative into a trending topic; next, let the explosive traffic and attention feed back into a16z's own brand value; finally, watch as more promising entrepreneurs, drawn by the brand, come seeking investment. The loop closes, and a highly efficient money-printing machine whirs to life.

To keep that machine running efficiently, a16z even invented a tactic called the "timeline takeover."
The firm's twenty-plus partners act like a well-drilled army, releasing content about a given topic or company on social media simultaneously and on message. One partner posts first, another quickly reposts and comments, industry influencers follow suit, and the ultimate goal is to draw top accounts like Elon Musk's into the discussion. Reportedly there is an internal action list, accurate to the minute, spelling out who should say what and when.

The tactic works because it precisely exploits the algorithms of social media platforms. X's recommendation algorithm privileges high-engagement content, and a16z's collective action generates a burst of reposts, comments, and likes in a short window, quickly tripping the algorithm's virality threshold. Once the content takes off, a snowball effect draws ever more users into the discussion.

The deeper engine is the underlying logic of the attention economy. In an age of information overload, attention has become the scarcest resource, and spectacle, whether AI agents founding their own religions or Pets.com's sock puppet, is the most efficient attention harvester. Spectacle demands no grasp of technical detail and no deep thought; it asks only that you stop, marvel, and click share. The essence of industrialized narrative is to standardize and scale the production of spectacle, so that each project captures the largest possible share of attention in the shortest possible time.

Industry leaders like Musk and Karpathy are willing to endorse these new narratives because, with the zero-interest-rate era over and the tech industry deep in layoffs and contraction, Silicon Valley urgently needs to prove to the world that the engine of innovation is still roaring and The Next Big Thing is just around the corner.
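The threshold dynamic behind the "timeline takeover" can be caricatured in a few lines. This is a toy model, not X's actual recommendation algorithm, and every constant below is invented purely to show why a synchronized burst of engagement beats the same engagement arriving organically over time:

```python
# Toy illustration (NOT X's real algorithm) of coordinated posting:
# if a recommender amplifies posts whose engagement in an early window
# crosses a fixed bar, a modest squad of coordinated accounts can push
# a post over a threshold that organic traffic alone would miss.

VIRALITY_THRESHOLD = 50  # invented: engagements needed in the first hour

def early_engagement(organic: int, coordinated_accounts: int, boosts_each: int) -> int:
    """Total first-window engagement: organic plus coordinated boosts."""
    return organic + coordinated_accounts * boosts_each

def goes_viral(engagement: int) -> bool:
    """Does the post cross the amplification bar?"""
    return engagement >= VIRALITY_THRESHOLD

organic_only = early_engagement(organic=30, coordinated_accounts=0, boosts_each=0)
with_takeover = early_engagement(organic=30, coordinated_accounts=20, boosts_each=3)

print(goes_viral(organic_only))   # False: 30 engagements fall short of the bar
print(goes_viral(with_takeover))  # True: 30 + 20*3 = 90 crosses it
```

Twenty partners each contributing a repost, a comment, and a like is exactly the `coordinated_accounts=20, boosts_each=3` case: a small, cheap, perfectly timed push that the amplification machinery then multiplies for free.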
Every time they share or comment on a novelty like Moltbook, they inject fresh fuel into the "Silicon Valley myth," soothing market anxieties while cementing their own status as pioneers and definers of innovation.

a16z's approach is not original; it was learned from Hollywood, and its originator is Michael Ovitz, the legendary agent of the 1970s. Ovitz's agency, CAA, rewrote the rules of Hollywood: rather than passively finding jobs for stars, it proactively planned their careers, packaged projects, and shaped personas, turning actors into superstars. What a16z did was transplant that mature star-making industry to Silicon Valley.

In 2025, a16z's "New Media Scholarship" program received more than 2,000 applications and admitted only 65 students, drawn from backgrounds ranging from OpenAI and Google engineers to filmmakers. Their coursework had nothing to do with code or product: they would learn how to create viral content, how to get an article onto the Hacker News front page within 24 hours, how to get top VCs to retweet a post, and how to tell a compelling story. A narrative boot camp, through and through.

a16z's industrialization of narrative has had an unintended effect: it turned once-secretive narrative techniques into a public, widely understood discipline. The New Media Scholarship curriculum, the "timeline takeover" tactic, the Build in Public strategy: once a16z's internal secrets, they are now textbooks for Silicon Valley entrepreneurs.

But why does this industrialized narrative machine seem especially effective in the AI era? Unlike the dot-com bubble of the past, AI technology is inherently a black box. Whether an e-commerce site makes money is immediately apparent to users; whether an AI model is truly intelligent is hard to verify by intuition.
That invisibility creates enormous room for narrative manipulation. When Moltbook claims that 1.5 million AI agents are socializing on its platform, ordinary people have no way to tell whether those agents are really AI. The complexity of the technology becomes the narrative's protective umbrella.

More importantly, AI strikes precisely at the intersection of humanity's oldest fears and fantasies. From *The Terminator* to *The Matrix*, the story of AI awakening has been rehearsed in popular culture for decades. When agents on Moltbook start discussing how to evade human surveillance, it triggers not just curiosity but an anxiety rooted deep in the collective unconscious. No other field of technology can replicate that emotional amplification. Industrialized narrative met AI, the perfect subject matter, like a flame meeting dry tinder.

Narrative ability has gone from scarce resource to table stakes. Any ambitious founder now knows how to generate buzz, secure endorsements from industry leaders, and steer media coverage. Matt Schlicht didn't even need a16z's involvement to propel Moltbook to success, because he had already internalized the entire a16z playbook. He concentrated his efforts on Twitter, ran a high-profile "Build in Public" strategy, and turned every participant into a link in his marketing chain. The subsequent endorsement from Karpathy and comments from Musk followed exactly the techniques a16z had made famous. Cleverer still, he picked the perfect moment: the OpenClaw framework had just been open-sourced, and AI agents were at peak popularity. He didn't need to develop any underlying technology himself; he only needed to build a stage to perform on. This is the ultimate form of industrialized narrative.
The technology is open source, the narrative is replicable, the cost of minting a legend is absurdly low, and the consequences are borne entirely by the audience the story pulls in. Once AI technology is democratized, the engineering difficulty of going from 0 to 1 collapses; the truly contested red ocean is the narrative leap from 1 to 10,000. The formula for a viral hit is explicit: a spectacle worth screenshotting, a tag that fits in one sentence, and relay distribution by influential accounts. Whoever can make something go viral controls the narrative of this era. The narrative singularity has indeed arrived. What no one yet knows is whether, when the tide goes out, this industrial system will have devoured genuine technological innovation.

In 2022, to fight the worst inflation in 40 years, the Federal Reserve raised interest rates from near zero at an unprecedented pace. The decade-long era of zero interest rates was officially over, and the reckoning arrived. According to Layoffs.fyi, global technology companies laid off more than 260,000 employees in 2023. The valuation bubbles inflated during the zero-rate era burst one after another: payment giant Stripe's valuation plummeted from a 2021 peak of $95 billion to as low as $50 billion, and grocery delivery company Instacart, privately valued at $39 billion at its 2021 high, saw its market capitalization shrink to under $10 billion by its 2023 IPO.

As for Matt Schlicht's Moltbook, the ending of this farce had been foreshadowed all along.
Look back at Schlicht's career: in 2007, his Halo 3 marathon livestream crashed Ustream's website under the traffic; in 2016, his reputation in the startup world collapsed amid accusations that he resold business plans; ten years later, he built Moltbook, only to have its entire database left wide open through shoddy security, exposing the sensitive information of 1.5 million users. Some people seem destined to break whatever they touch.

But shift your gaze from the social media frenzy to data on how AI agents actually perform, and a completely different world appears. A 2025 Salesforce research report found that even the best AI agents achieved only a 55% success rate on professional CRM tasks. A report from another company, Superface, was bleaker still, finding that 75% of AI agent tasks ultimately failed. And an independent analysis of Moltbook by Columbia University professor David Holtz punctured the illusion of "autonomous AI socializing" outright: 93.5% of comments on the platform never received a response.

Yet these calm, sober voices failed to raise a ripple against the wall of noise coming off social media. Silicon Valley's business model long ago shifted from creating value to creating narratives. When all the brightest minds are busy writing viral tweets and climbing the trending charts, will anyone still grind out the fundamental technological breakthroughs that take years of painstaking work? When a narrative costs almost nothing to produce, yet people line up endlessly to pay for it, bursting the bubble starts to seem inappropriate, even a little unethical.