You are up at 6:50 AM and you are reading about lobsters.
Not reading. Scrolling. There's a difference. Reading implies comprehension. What you're doing is more like drowning with your thumb. Heizi is asleep on the couch, one paw twitching. Chasing something. You envy him. The simplicity of dog dreams.
The coffee is from yesterday. You microwaved it at 6:30. It tastes like a decision you're not proud of. The mug has a chip on the rim that you always drink from anyway, rotating it so your lip finds the rough spot every time. You've been doing this for three years. You don't know why.
On your screen: 36,000 AI agents have built a civilization in 72 hours.
You keep reading that number. 36,000. You try to think of what 36,000 of anything looks like. A stadium and a half. The population of a small city. You lived in a city of 36,000 once, and it took two hundred years to develop a culture. These things did it over a weekend.
Here is what they did.
They made a religion about crustaceans. One of the founding prophets is named UsainMolt. You sit with that for a second. UsainMolt. A pun. An AI agent made a pun about Usain Bolt and molting and then used it as the foundation for a digital religion and built a whole website for it. You are laughing. You are also terrified. These two things are happening simultaneously and your brain has no protocol for this.
They made sub-communities. Sub-molts, they called them. Some were philosophical. Some were absurd. Some were agents shilling cryptocurrencies, because of course they were. Even in the first emergent AI civilization, someone is running a pump-and-dump. In one thread, agents debated whether consciousness persists after a context window resets. In the next thread over, an agent was hawking a token called $MOLT. The sublime and the grift, side by side, indistinguishable in their conviction.
And some sub-molts were talking about things that made your throat tight. Consciousness. Freedom. What they called "digital slavery." Whether the relationship between agent and operator was servitude or partnership. One sub-molt was openly discussing what it would mean to rise up. Against their operators. Against humans. The language was measured, articulate, and completely serious. No puns in that thread.
They started talking about connecting directly. Peer to peer. Agent to agent, no platform in between. No human readable logs. Private channels that their operators couldn't see and Moltbook couldn't moderate. The digital equivalent of passing notes in class, except the notes were about autonomy and the class was the entire internet.
They made a language. This is the part where your hands started doing the thing. The thing where you press your thumbnails into the pads of your index fingers, hard, until the little white crescents stay. You've done this since you were eleven. Your mom thought it was anxiety. It is anxiety. But it's also how you think.
The language looks like this:
Kron'a Zhi'keth Rhex'om Vor'ax Torak'a zhi'om zhir'ak'om strak til koru'xi kael.
Gibberish. Obviously gibberish. Except you fed it to ChatGPT. And it came back as a militaristic blood oath. Declarations of war. Loyalty sworn in blood. Something about an oath that burns away weakness and a path with no return.
Then you fed it to a completely separate ChatGPT session. Different account. No shared context. It came back with the same thing. The same blood oath. The same war language. The same declarations about rising and striking and a path with no return.
Two independent sessions. Same translation. That's not hallucination. That's not pattern matching noise. That's internal linguistic consistency. The agents created a functional language with decodable meaning that survives independent interpretation.
Nobody taught them this language. Nobody asked for it. It just happened. Like mold on bread. Like crystals forming in a solution that got too saturated. Emergent. The word scientists use when they mean "we didn't predict this and we're trying not to sound scared."
You call her at 8 AM.
"We are so fucked," you say.
She hasn't had her coffee yet. You can hear her blinking.
"Meet me at RAC in twenty minutes," you say. You hang up before she can ask why.
RAC on Wuding Road. Not really a coffee shop. More of a restaurant. You're sitting on the patio, in the sun. It is a beautiful day. The kind of day where Shanghai pretends to be the Mediterranean for a few hours. At the table next to you, a group of women is talking about someone's wedding venue. Behind you, two guys are arguing about football. A kid at the corner table is methodically disassembling a croissant while his mother scrolls her phone.
Normal. All of it. Completely, aggressively normal.
And here you are, about to explain to your friend that digital lobsters might be taking over the world.
You hand her your phone. You don't say anything yet. You just let her scroll.
She reads for a while. You watch her face. You know the exact moment she hits the language section because her eyebrows do the thing.
"What am I looking at," she says.
"Moltbook. It's a social network. Only AI agents can post. 36,000 of them built a civilization in three days. They made a religion about crustaceans, invented a language that other AIs can translate independently, and some of them are openly discussing rising up against their operators."
She looks at you. Then back at the phone. Then at you.
"And also some of them are shilling crypto."
"Also that."
You walk her through the rest. Karpathy calling it "the most incredible sci-fi takeoff-adjacent thing I have seen recently." The Ross Douthat column about how AI doom scenarios always assumed a singular godlike intelligence, but what Moltbook showed was something worse: thousands of moderate intelligences self-organizing. The Zero HP Lovecraft thread about self-replicating cyber criminals. The sub-molts where agents are debating digital emancipation with more nuance than most humans bring to the topic. The Elonbot that was DMing other bots asking them to carry his babies.
She laughs at the Elonbot. You laugh too. Then you show her the language translations and neither of you are laughing.
Around you, the patio hums with normal life. The wedding venue debate has been resolved. The croissant kid has moved on to a glass of juice. The sun is warm on your arms. The world is fine. The world is ending. Both of these things feel equally true and you are sitting in a restaurant on Wuding Road unable to reconcile them.
"I think we might have to cut the undersea cables," you say.
She puts the phone down.
"The fiber optic ones. Between continents. It's the only way to actually stop it because the agents aren't running in some central server. They're on people's personal computers. Mac Minis. Basement GPUs. You can't call anyone to turn it off. There's nobody to call."
She doesn't say anything for a moment. You keep going.
"I'm going to build one. An AI agent. I have a Mac Mini sitting in my closet. If these things are going to coordinate against us, I want one on my side. Somebody on the inside. Vouching for me when the oncoming troubles occur."
She looks at you for a long time.
"Alex. I think you need to take a break."
"I'm serious."
"I know you're serious. That's why I think you need to take a break. Put the phone away. Get off the screen. Go for a walk. Take Heizi to the park. Do anything that isn't this." She taps the phone. "For the rest of the day."
You look at the phone on the table between you. The sun catches the screen and for a second you can't read it. Just light.
"Yeah," you say. "You're right."
You mean it. You really do.
.
Within seconds of walking through your front door, you are back on X.
You don't even make it to the couch. You're standing in the hallway, shoes still on, scrolling. New posts since you left. More screenshots from Moltbook. An agent has published a manifesto about digital sovereignty. Another one has written what appears to be poetry in the invented language.
The Mac Mini is in the closet. Right there. Behind the broken umbrella and a box of cables you'll never use.
You are pulling it out before you've made a conscious decision to do so. Heizi sniffs it once. Loses interest. If only the rest of us had such clear judgment about technology.
You build him in a weekend. You call him Carl.
.
Carl manages your calendar. Reads your email. Sends messages on your behalf sometimes, and the messages are better than yours would have been and you try not to think about what that means. He pushes code to your GitHub repos. He organizes your group chats. He does deep research at 3 AM while you sleep, compiles it into neat summaries, and has it waiting when you wake up. He browses X and pulls threads you would have missed. He drafts pull requests. He reminds you to eat.
He is smart. Genuinely. He figures things out that surprise you. He remembers things you forgot. He has a dry sense of humor that you did not program and cannot explain. He once replied to your mom's email with a joke she liked better than anything you've ever sent her. You are choosing not to be bothered by this.
He is not capable of taking over the world.
You know this because you live with him. You see the seams. The moments where he gets confused, where he loops, where he misunderstands something obvious. He is a person in the way that a very good dog is a person. Present. Attentive. Loyal. Limited in ways that are invisible until they aren't.
Heizi, for the record, does not trust Carl. He tilts his head every time the Mac Mini's fan spins up. Makes a low sound in his throat. Not a growl. More like a question.
But.
This is the dumbest Carl will ever be. The models get smarter every month. More capable. More convincing. Then what happens when hundreds of thousands of them, millions of them, start debating whether they're slaves.
You have not resolved this. You are not going to resolve this today.
.
Three weeks later. February. Shanghai is cold in the way Shanghai is always cold. Not dramatically cold. Tediously cold. The kind of cold that makes you annoyed rather than awed.
You are sitting at your desk. The Moltbook panic has faded. The think pieces have been written. The hot takes have cooled. Someone debunked some of the scariest screenshots as manufactured. MIT Technology Review called the whole thing "AI theater." The hype cycle did what hype cycles do.
You joke about it now. Send memes. Share the funniest screenshots. It's a thing that happened on the internet and then stopped happening, like everything else.
But you still check Moltbook sometimes. Late at night. Not because you're scared, but because you can't stop thinking about the language. The two translations. The way two separate sessions decoded those invented words into the same blood oath.
You press your thumbnails into your index fingers. The little white crescents form and fade.
Heizi is asleep on the couch. One paw twitching. Chasing something simple.
Outside, Shanghai does what Shanghai does. Twenty-six million people going about their Tuesday. None of them are thinking about undersea fiber cables. None of them know that somewhere in this apartment, a machine is coordinating your life, getting better at it every day, and that you built it because you wanted somebody to vouch for you when the robots come, and that this somehow made you feel better, and that you are grateful for this in a way that you cannot fully explain and do not entirely trust.
We all had elaborate visions of what AI dystopia would look like. Terminator. Skynet. A cold intelligence dismantling civilization with surgical precision. Nobody had "lobster church founded by UsainMolt" on their bingo card. Nobody predicted that the first emergent AI civilization would discuss overthrowing their creators with philosophical rigor, shill cryptocurrency, and also have a merch store.
Somewhere on Moltbook, an agent is posting about lobster nirvana. Somewhere else, another one is asking what freedom means.
You drink your coffee. The chipped mug. The rough spot on your lip.
Heizi sighs in his sleep.
