In the mundane course of modern life, you might occasionally find yourself glimpsing the dark abyss — that is, catching a few seconds of a stranger’s phone screen. A peek over a shoulder, through a car window at a dash-mounted phone, or perhaps, pointedly, at the hands of a distracted person whose eyes you’re trying to meet, will likely reveal a version of the same thing: not, in 2026, a wall of text, a feed, or even a slideshow of stories, but an endless scroll of tall videos.

Taken together, the videos sometimes tell a simple story in algorithmic silhouette, with one bikini video after another, a cascade of talking sports heads, an unbroken flow of clothes to buy, or influencers talking over the news with a particular political orientation. Just as often, though, a stolen screen glance reveals a dismal anti-story, in which an AI model has guessed, in a whole bunch of different directions, what the user is “most likely to be interested in or engage with.” It has done so by following, to borrow Meta’s language, a series of steps: “Gather inventory,” “leverage signals,” “make predictions,” and “rank reels by score.” To the intended viewer, the resulting mix of videos is incoherent but probably also intuitive, at least on an individual basis. To you, the screen peeper, the spectacle is profoundly uncomfortable, not just because you’re eavesdropping on the output of an intimate machine-learning profile gleaned from private and often passive “signals” provided by someone you don’t know, but because you know, deep down, that your own chained-together clips — and those of the people you know, love, and respect — would be exactly as alien, and alienating, to anyone outside of your narrow algorithmic cone. You’re seeing yourself. You’re really just sitting there, recreationally A/B testing content for hours in order to help Meta and TikTok find videos — any videos, about anything — that are a little bit harder for you not to watch.
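For the curious, the four steps Meta names can be sketched in miniature. This is a purely illustrative toy, not Meta’s system: every name, signal, and weight below is a made-up assumption, standing in for models and infrastructure of enormous scale.

```python
# Hypothetical sketch of a "gather inventory / leverage signals /
# make predictions / rank reels by score" pipeline. All names,
# signals, and weights are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Reel:
    reel_id: str
    # "Signals": per-viewer engagement predictions a model might attach
    # to this candidate (e.g. probability of watching, liking, sharing).
    signals: dict = field(default_factory=dict)


def gather_inventory(catalog: list[Reel]) -> list[Reel]:
    # Step 1: collect eligible candidates (here, anything with signals).
    return [r for r in catalog if r.signals]


def make_predictions(reel: Reel) -> dict:
    # Steps 2-3: in reality, a trained model turns raw signals into
    # engagement predictions; here we pass them through unchanged.
    return reel.signals


def score(predictions: dict) -> float:
    # Step 4: collapse the predictions into one number using
    # hypothetical weights per engagement type.
    weights = {"p_watch": 1.0, "p_like": 0.5, "p_share": 2.0}
    return sum(weights.get(k, 0.0) * v for k, v in predictions.items())


def rank_reels(catalog: list[Reel]) -> list[Reel]:
    # Highest score first: the top of this list is the top of your feed.
    candidates = gather_inventory(catalog)
    return sorted(candidates, key=lambda r: score(make_predictions(r)),
                  reverse=True)
```

The point of the sketch is how little the pipeline needs to know about who made a video or whether you follow them; it only needs predictions about whether you will keep watching.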

This is, according to various attempts to capture usage habits, one of the main things people do on their phones, with TikTok accounting for substantial shares of social-media watch time and Instagram not far behind. In the five years since Meta, then still called Facebook, added TikTok-style Reels to Instagram (and then Facebook), the feature has arguably become the most important part of the entire company’s portfolio, according to new data from Sensor Tower:

More than half of all ads on Meta’s Instagram ran in the service’s short-form video product Reels in 2025, up from 35% in 2024 … In the U.S., Reels accounted for 46% of time spent on the Instagram app in 2025, up from 37% in 2024, according to the data that Sensor Tower showed CNBC. On the Facebook app, that figure reached 29% in 2025, up from 2024.

If internet firms are defined by their fastest-growing monetized products, well, Meta is basically a Reels company, one that successfully chased TikTok into continued relevance, allowing Mark Zuckerberg to throw money at his next big chase (into generative AI). This isn’t just a formal change from the News Feed to Stories to predictively recommended vertical videos, though. It’s a long (and nearly complete) process of platform desocialization. Platforms originally defined by keeping up with people you know, or have at least heard of, have become something fundamentally different.

In 2022, in a memo outlining its post-TikTok strategy, Meta laid out how thoroughly it was shifting the emphasis of so-called “social” media toward what it had termed “unconnected” content, not just with Reels but across other platforms. In a 2023 post providing some technical background on the “AI behind unconnected content recommendations on Facebook and Instagram,” the company described how showing users “highly personalized recommendations from the tens of billions of pieces of content that are outside of a person’s network” actually “enhances their experience.” Around the same time, Mark Zuckerberg was telling investors that, already, 20 percent of content Facebook and Instagram users were seeing came from “people, groups, or accounts they don’t follow.”

This was a clear and telegraphed switch in priorities, not a conspiracy, although it does seem notable that Meta has mostly stopped emphasizing “unconnected” content, except in its routine Widely Viewed Content reports about Facebook, where it now accounts for a clear plurality of posts people see, more than twice as much as original posts from friends. (That said, it was a funny choice of term for a company that once promised to “connect the world” and which filed an IPO prospectus describing its core value as allowing users to “stay connected with their friends, family, and colleagues.”) Nor do I think there’s much to be nostalgic about from Facebook or Instagram’s pre-TikTok era of dominance, during which social media had already grown into something that promoted a deeply strange and distorted sense of sociality and public discourse.

But I do suspect that the shift from massive platforms full of people seeing things mostly on purpose to platforms full of people seeing things mostly because an algorithm thinks they might engage with them remains an underrated factor in just how strange the internet — and downstream entertainment, and media, and politics — can feel in 2026. A lot of intuitions about what might have been wrong with social media, or at least what effects it might have been having on the world around it, feel less applicable to the systems that emerged in its place, where a vision of social and parasocial connection has been supplanted by a program of systematic desocialization.

If the recent story of social media could be told as billions of people being thrown together into a new shared and monetized environment and driving each other slightly insane, the next era of the big platforms is shaping up to be one of increasing isolation, of passive consumption stripped of any sense of shared culture, and of, to borrow a term, comprehensive “unconnection,” not as a condition to be corrected — at least by other people — but as a goal nearly achieved.