Note: I’ve had this, or some variation thereof, in the drafts and in my head since way before Sora 2 came out. Seemed like a good time to finally post it.
October 2025: You can generate Sora videos using your face, voice, and mannerisms. The videos are jittery and imperfect. Teeth lengthen and move themselves as “you” speak, to “your” dragon, in “your” Lamborghini. Its mouth does not move like yours. But it is a convincing simulacrum. Enough to fool one of your less close friends. You send one to your dad and he loves it.
December 2025: Sora gets a slight update. Teeth look better. Your mouth moves more like yours. The little jittering throughout the video is less noticeable now. Christmas videos, children and adults on Santa’s lap, asking for and receiving impossible gifts, propagate through your social media feeds. Veo’s competing Cameo model is also quite good, especially when it’s propagating through YouTube Shorts. Digital crack hits the middle schoolers.
Early January 2026: A government in a third world country arrests an innocent man for a crime he did not commit. This is not unusual. What is unusual is the evidence: a video with the Sora watermark clearly edited out.
Mid-January 2026: The President posts on Truth Social that he heavily condemns the act that the previously-mentioned man was arrested for. He attaches the video to his post. Thousands of boomers take it at face value; they cry for justice. Thousands more know it is fake and yet they repost outrage at this event, clamoring to show the greatest dedication to the Dear Leader they can muster.
ByteDance finally unveils their own AI video model and rolls it out on XiaoHongShu. It is far, far better than Sora, Veo, or similar products. Distribution, also, is better. You will be seeing more schizophrenic Yapdollar videos than you previously thought possible. China, at least, has curled the draconian claws of the State around every CCTV camera in the country. You sleep easy knowing that the LiveLeak video you just saw showed a real human being tragically die; you know at least what is real about China, or at least what the Chinese government wants Americans to think is real. Chongqing looks lovely at this time of year.
March 2026: The President posts several AI-generated videos of real people — political dissidents — committing crimes. He calls for their arrest.
They are arrested. They are brought to the courts. They are taken in. Things begin to fall apart.
Spring 2026: The growing pains continue. By now, even some of the president’s most fervent supporters are starting to get a little concerned. “Aggressively claiming to believe in outright lies” is a line many have crossed already, but “looking really stupid in public with regard to new technology” is a new humiliation, and one many more libertarian-leaning members of the right wing are not ready for.
Meta, king of the open-source vision model, creates a new cryptographic verification system for videos and images. It is rolled out, trusted, and quietly hacked in a matter of days.
The models grow, and they grow smarter. A large language model struggles with a warmup problem from an undergraduate robotics course. A world model, though, one trained on enough 2d representations of 3d space to natively understand how it works, then generalized to scans and representations and recordings of 3d space, aces it first try. Students can now go to bed at 2am instead of 4, and watch more Instagram Reels, thoughts flickering away as they lie in bed watching cats on beds made of spaghetti cuddling with a slightly hotter version of themselves.
Things fall apart. You see a video of a falcon flying off the arm of the falconer and around and around and coming back and landing and you do not know if it’s real. Yapdollar is like a brother to you, now. You are becoming Chinese and American and disillusioned and strong. You scroll across a pornographic video of your crush from fifth grade and you do not know if it is real. You do not know if anything is real. You haven’t known for a while but now you really don’t know.
Your father calls. His voice sounds tinny. He says you should visit for Thanksgiving. You fly out before that and when you see him you mention that call and he just stares at you, bewildered, not remembering a thing. Neither of you chalks it up to old age; this is when you start sending letters.
Summer 2026: A friend of yours gets black bagged. You cover your face and go to a protest. Not that it helps — they can track you on everything. It’s more of a fashion statement, these days. People wear sunglasses with flickering LEDs that are meant to disrupt facial recognition technology, and then they take them off to unlock their iPhones to watch TikTok and Sora 2. They do this at the protest.
No longer bottlenecked by comparably unidimensional text vectors, large models now understand space, physics, embodiment. In Shenzhen/Prague/Perth/El Segundo/Hyderabad/Detroit, a dark factory builds itself. Arms on carts swing about in 3d space. Robots build robots build robots build buildings in which they build more robots, increasingly complex forms, getting closer and closer to…something.
Late Summer, 2026: A man is arrested in the United States. There is a video of him committing this crime. The lawyer argues that the video could be AI-generated and therefore is inadmissible in court. The case goes up.
Here is where our timeline splits. I am, by nature, a bit of a pessimist. I predict the Bad Outcome is more likely. For that reason, I will not write about it. I refuse to take part in bringing it about.
So here is how the Good Outcome starts: slowly. Federal districts split. The 9th Circuit rejects video evidence, stating that it is now inadmissible in court. The 5th Circuit continues to accept it and will do so for years. Courts try to patch it. Slowly, but surely, AI and cryptography experts are brought in to testify on a case. The best defense lawyers in America are going to pivot entirely to AI forensics testimony, and it will work. The innocent will walk free. Slowly but surely old cases will begin to be pulled back up. The legal battles will take months, but public outcry — unfakeable even with an internet composed mostly of AI agents provocateurs and astroturfing — lights a fire under the government’s ass.
Winter 2026: A new drug has been discovered by an autonomous agent at Eli Lilly that will slow down aging by as much as 30%. A new peptide has been discovered by an autonomous agent at Novo Nordisk that will bind to nicotine receptors for up to 2 weeks at a time, increasing IQ for that span accordingly. Your flu shot is now a gummy bear. A new drug has been discovered by an autonomous agent working for the darkest part of humanity that rots your lungs from within and leaves you wheezing and coughing up blood to death. Robots build robots build robots that pack vaccines into big coolers that are shipped to CVS, Wal-Mart, Walgreens, Costco and their ilk across America, Canada, and the world. For every cure there is a new poison; just as quickly, shining brightly, there are cures.
Spring 2027: Video evidence is no longer admissible in court. This isn’t retroactive, and nobody in the government gets in trouble for black bagging your friends with AI evidence. I’m sorry. Even my best world isn’t that kind. At least you can talk to the Meta AI trained off of all your Instagram DMs. It sends you reels at approximately the same frequency that they did. Same algorithm, too.
Anyway, the world changes around the news. News now comes from someone you know who knows someone who knows someone. The groupchat and the social lattice become critical infrastructure; an anon Twitter account is the most trustworthy news source on the planet. Either you know a guy who knows a guy or your ass is grass.
Cryptography accelerates to meet need; unfortunately, we may all have to get cool with NFTs. It’s necessary, but everyone hates it, like last time.
Late Summer, 20??: It is a truth universally acknowledged that every video on the internet is fake. Unless it came from an independent and international news organization, it is fake. Baby animals and babies laughing are not all that much less enjoyable, but view counts have plummeted. Robots build robots build robots that synthesize, pack, and ship endless amounts of designer new drugs. Slowly, surely, we are pushing forth the final front of our endless battle versus Death.
You, and many others, are unemployed. You tried to get hired and got rejected by the AI HR and then realized the rest of the company, save one or two hapless IT technicians that just solve CAPTCHAs all day, is AI. You make money with crypto, or shitposting (because the bots that populate most of the Internet have wallets of their own), or prostitution, or savings, or living in the apartment building your suspiciously wealthy Twitter mutual bought for your groupchat to live in, which in itself is a form of prostitution. They just want to be around people. Everyone does.
You take odd jobs from Fiverr or Craigslist or Twitter. These days, you’ll very often get hired to clean a house, or walk someone to school, or even just sit next to them for a couple hours. You talk. Maybe you’ll touch their hand. Sometimes even a firm handshake with eye contact and a smile, or a hug. You sit at a desk while other people talk and type whatever you want, really. You’re just there to fill the space. Someone has to justify the cost of real estate.
The world is different now. But it’s Saturday night; no matter how the world is, on Saturday night, there’s a party. You pull up in the Zoox. You’re wearing some bullshit. You show the doorman your Partiful, cryptographically verified to be you, and you do the secret handshake.
All this is necessary. Prosthetic presences are a worry of their own: humans hired, by bad actors or by AIs that hid in your groupchats and social networks, to wear livestream glasses and use the right words and phrases to get what they want out of you. That works on people without the pre-singularity clout to maintain cryptographic hygiene. Not us, though. Any AIs skinwalking this function are doing so freely; there are a few cyborgs on the floor, braindancing with an old Claude instance, wearing their crazy black clothes but big friendly orange glasses, and proudly proclaiming that their interlocutors are “absolutely right!”.
You do the handshake and you go to the dancefloor. Parties have dancefloors, now, and people are dancing. They’re jumping up and down. They’re grinding back and forth. They’re making out. They’re gyrating. They’re doing frankly embarrassing things with their bodies. They’re wearing tiny tops and something might fall out. They’re wearing tight pants and they might rip. Maybe you are wearing these things, too. You go to the dancefloor and you dance, like no one is watching, like everyone is.
In the morning, in the bed of that friend you insisted you’d never do anything with, bleary-eyed, you check the groupchat and see that someone has sent a video of you. You look drunk; you are moving poorly. You, and the person now naked snoring next to you (who you should definitely not be in bed with, but who can fault two warm bodies), grind up against each other. Your drink spills from your mouth to theirs. You fall over a shoelace and land on your ass.
In the groupchat, you add a laugh react to a fast-growing stack. No need to check the cryptographic signature: that shit’s an easy spoof with a good enough model, and it doesn’t take a lawyer to realize that that video is totally AI.