State-Run Media
If they were going to build state-sponsored media in America, they wouldn’t announce it.
For months, the warnings spread like rumor. Delete it. Get your kids off it. The Chinese are watching. The algorithm is feeding them propaganda, depression, rage. A hundred and fifty million Americans on a single app, scrolling ninety-five minutes a day, and the servers sat in Beijing. Lawmakers called it a national security threat. Parents called it digital fentanyl. TikTok is spying on your children, they said. TikTok is brainwashing a generation.
The dread was real. The danger may have been real. So they forced a sale.
And five days ago, the app that terrified them changed hands. Now the dread has a different shape.
The Salon
Yesterday, a woman in a Chicago salon, still in her smock between clients, propped her iPhone against the mirror and recorded a video about ICE. Thirty seconds. Posted it. Went back to work.
An hour later: zero views.
She deleted it. Recorded again, this time using emojis. 🧊 instead of ICE, thinking maybe the algorithm flags certain words. Posted it. Checked again before bed, the screen bright in her hand.
Still zero.
The next morning, she posted a video of her latte. Three thousand views by noon.
Across the country, similar reports. A father in Minneapolis, 400,000 followers, posting about Palestine: zero views. A physical therapist in New Jersey typing “Epstein” into a direct message, watching the app freeze, a warning appearing: This message may be in violation of our Community Guidelines. CNBC reporters reproduced the error on camera, typing the same name, getting the same block.
TikTok blamed a power outage at a data center.
Users described an outage that seemed to know the difference between British and American users. An outage that triggered when you typed a name. An outage that affected political content but not dance videos, ICE but not coffee, Gaza but not lipstick tutorials.
The Camera
This was the app that showed Minneapolis burning when the networks cut away.
For ten minutes on May 25, 2020, a seventeen-year-old named Darnella Frazier held her phone steady and recorded a police officer kneeling on George Floyd’s neck. She posted it. It spread. The world saw what the official story would have buried: that he called for his mother, that bystanders begged the officer to stop, that it took eight minutes and forty-six seconds for a man to die on a sidewalk in America.
TikTok didn’t break that story. But TikTok became the place where stories like it couldn’t be contained. Gaza, when the networks looked away. ICE raids filmed in real time. Protests before the press arrived. For five years, it was the camera that couldn’t be managed, the signal that couldn’t be jammed, the place where ordinary people could make power visible without asking permission.
A hundred and fifty million Americans opened it every day. A third of teenagers said it was their primary news source. Not CNN. Not the Times. The algorithm.
And now the algorithm answers to someone new.
The Owner
On January 22, 2026, a pop-up appeared on a hundred and fifty million American phones. A new privacy policy. Most people clicked through without reading. But the language had changed. TikTok now lists what it collects: racial or ethnic origin, national origin, religious beliefs, sexual orientation, citizenship, immigration status. That data existed before. But before, it flowed to ByteDance, a Chinese company with no stake in American deportation policy.
Now it flows to a different owner.
The consortium is led by Larry Ellison, executive chairman of Oracle, host to Trump at his Rancho Mirage estate.
In September 2024, Ellison told investors what kind of future he imagined. “Citizens will be on their best behavior,” he said, “because we are constantly recording and reporting everything that’s going on.”
This is the man whose company now sits closest to the infrastructure.
Within seventy-two hours of the ownership transfer, creators reported anti-Trump content throttled, videos about ICE stuck at zero, the word “Epstein” blocked in DMs. The same name the Justice Department was legally required to release files about forty days ago—and hasn’t. BBC and Guardian videos appeared visible in London, invisible in Los Angeles.
Uninstalls spiked, up 150 percent by one tracker's count, and the anecdotes piled up: screenshots, goodbye posts, friends telling friends to delete it. On social media, people began describing the same sensation in different words: the feed felt newly policed. “I got rid of it,” one user wrote. “Nice side effect is a calmer brain.”
In a September 2025 interview, Netanyahu had been blunt: “The most important purchase that is going on right now is TikTok. I hope it goes through because it can be consequential.”
It went through.
People didn’t need proof of intent. They only needed the sensation: that the room had gained a guard.
The Unperson
Orwell didn’t imagine censorship as a bonfire. He imagined it as clerical work.
In 1984, when a man named Withers falls out of favor, he isn’t arrested in the square. He becomes an “unperson.” The records are edited. The photographs altered. “Withers, then, did not exist: he had never existed.” The newspaper still prints. The world still turns. But the public record has been silently revised, and no one can prove it was ever different.
State-sponsored media doesn’t announce itself. It looks like the app you already use, doing what apps already do: sorting content, recommending videos, deciding what spreads and what disappears. The difference is whose hand is on the dial. State-sponsored media is when the algorithm answers to power, when the content that vanishes is the content power wants vanished, when the glitch is always the explanation.
The woman in Chicago can watch her video on her own phone. It exists. But zero views means zero reach means zero presence in the public reality.
She wasn’t banned. She was just never there.
The Camouflage
How would you know if it was intentional?
The algorithm has always been opaque. Content has always risen and fallen for reasons no one could explain. That opacity is the camouflage. Beneath it, suppression becomes indistinguishable from bad luck. It cannot be named, cannot be fought, cannot be proven.
The most effective censorship doesn’t delete. It throttles. It shadowbans. It lets your voice exist while ensuring no one hears it. And eventually, you stop posting what doesn’t spread. You learn which words the algorithm won’t carry. You try emojis. You try euphemisms. You watch what works for other people and start to mimic it. You censor yourself before anyone has to do it for you.
The woman in Chicago has started noticing patterns. Her videos about rent do fine. Her videos about traffic do fine. Anything about policy, about power, about the government doing something to someone: zero views. She doesn’t know if it’s real. She doesn’t know if she’s paranoid. She just knows what happens when she posts, and what doesn’t.
The threat was China. That’s what they said for years. A foreign adversary controlling the algorithm, manipulating what Americans see, collecting data on a hundred and fifty million citizens. A national security risk so severe it justified forcing a sale.
So they forced a sale.
And the platform that documented Minneapolis, that broadcast Gaza, that showed ICE raids in real time—now belongs to a man who dreams of constant surveillance, backed by a president who calls his consortium “Great American Patriots,” financed in part by twenty billion dollars from the United Arab Emirates.
The data collection hasn’t stopped. The algorithm still runs. Only now it’s American.
So it’s safe.
TikTok says the glitch has been resolved. The servers are stable.
The woman in Chicago checked her phone again this morning. The screen was bright in her hand. The video about ICE: still zero views. The video about her latte: twelve thousand.
She typed a caption about a raid in her neighborhood. Stopped. Deleted it. Typed it again. Deleted it again. Her thumb hovered over the post button, then drifted to the home screen instead.
She hasn’t posted about ICE since.
Notes & Sources
TikTok US joint venture closed January 22, 2026:
NPR, https://www.npr.org/2025/12/18/nx-s1-5648844/tiktok-deal-oracle-trump
ABC News, https://abcnews.go.com/US/tiktok-signs-deal-create-new-us-joint-venture/story
150 million US users, 95 minutes daily average:
TikTok official statistics
Sensor Tower data cited in CNBC, https://www.cnbc.com/2026/01/26/tiktok-uninstalls-are-up-150percent-following-us-joint-venture.html
Oracle consortium ownership structure (Oracle, Silver Lake, MGX each 15%):
Hollywood Reporter, https://www.hollywoodreporter.com/business/digital/tiktok-sale-done-oracle-silver-lake-buyers-1236454493/
Larry Ellison “citizens will be on their best behavior” surveillance quote (September 2024):
Fortune, https://fortune.com/2024/09/17/oracle-larry-ellison-surveillance-state-police-ai/
TechCrunch, https://techcrunch.com/2024/09/16/oracle-ceo-larry-ellison-says-that-ai-will-someday-track-your-every-move/
TikTok censorship claims (ICE content, Epstein blocked in DMs, zero views):
Washington Post, https://www.washingtonpost.com/technology/2026/01/26/tiktok-censorship-ice-shooting/
TikTok blamed power outage at data center:
TikTok USDS Joint Venture statement via X, January 27, 2026
Uninstalls jumped 150 percent:
CNBC citing Sensor Tower, https://www.cnbc.com/2026/01/26/tiktok-uninstalls-are-up-150percent-following-us-joint-venture.html
Netanyahu “most important purchase” quote (September 2025):
Jewish Insider, https://jewishinsider.com/2025/09/tiktok-sale-netanyahu-american-influencers-israel-jews/
Anadolu Agency, https://www.aa.com.tr/en/middle-east/netanyahu-admits-using-social-media-as-weapon-to-influence-us-opinion-amid-gaza-genocide/3700646
Epstein Files Transparency Act deadline (December 19, 2025) and DOJ non-compliance:
NPR, https://www.npr.org/2025/12/25/g-s1-103685/doj-says-few-more-weeks-epstein-files
Congress.gov, https://www.congress.gov/bill/119th-congress/house-bill/4405/text
CNBC, https://www.cnbc.com/2025/12/24/epstein-files-senators-call-for-audit-into-dojs-release.html
TikTok privacy policy data collection categories:
TikTok Privacy Policy, https://www.tiktok.com/legal/page/us/privacy-policy/en
Darnella Frazier video of George Floyd (May 25, 2020):


“It can’t happen here.”