For heavy Facebook users, let alone social media gurus, the idea that Facebook’s news feed is filtered by an algorithm is very, very old news. But a majority of everyday Facebook users in a recent study had no idea that Facebook constructs their experience, pushing certain posts into their stream and leaving others out. And worse, many participants blamed themselves, not Facebook’s software, when friends or family disappeared from their news feeds.
“In the extreme case, it may be that whenever a software developer in Menlo Park adjusts a parameter, someone somewhere wrongly starts to believe themselves to be unloved,” wrote a team of researchers led by University of Illinois at Urbana-Champaign doctoral student Motahhare Eslami, in a new paper on Facebook’s news feed algorithm.
The new qualitative study sampled 40 Facebook users and walked them through an in-depth examination of the ways Facebook filters their experience. Twenty-five of the 40 users, or more than 60 percent, had no idea that there even was a filtering algorithm, let alone one that weighs more than a thousand data signals to decide what to show a user. The researchers termed these innocent users “the unaware.”
One of the unaware participants told the researchers, “It’s kind of intense, it’s kind of waking up in ‘the Matrix’ in a way. I mean you have what you think as your reality of like what they choose to show you.”
Without understanding Facebook’s algorithm, these participants resorted to developing other theories for why their social lives changed on the site. Some blamed themselves for being bad at Facebook. “These participants felt that they missed friends’ stories because they were scrolling too quickly or visiting Facebook too infrequently,” the researchers write.
Others figured that their friends had stopped sharing with them. “I have never seen her post anything!” one study participant said of a friend. “And I always assumed that I wasn’t really that close to that person, so that’s fine. What the hell?!”
To let users see what the red-pill world was like, the researchers created a tool called FeedVis that tapped into Facebook’s API to show people an unfiltered feed. Using that data, they roughly categorized people’s friends based on how often Facebook was likely to surface each friend’s posts: “Rarely Shown,” “Sometimes Shown,” or “Mostly Shown.”
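The paper doesn’t spell out exactly how FeedVis drew the lines between those buckets, but the idea is easy to sketch: compare the stories each friend authored with the ones Facebook actually surfaced, and bucket friends by the fraction shown. Here is a minimal illustration of that approach in Python; the function name and the 25/75 percent cutoffs are hypothetical stand-ins, not FeedVis’s actual values.

```python
def categorize_friends(authored, shown):
    """Bucket friends by the share of their stories that appeared in the feed.

    authored: dict mapping friend name -> number of stories they posted
    shown:    dict mapping friend name -> how many of those stories surfaced
    The 0.25 / 0.75 thresholds are arbitrary illustrations, not FeedVis's.
    """
    buckets = {"Rarely Shown": [], "Sometimes Shown": [], "Mostly Shown": []}
    for friend, total in authored.items():
        ratio = shown.get(friend, 0) / total if total else 0.0
        if ratio < 0.25:
            buckets["Rarely Shown"].append(friend)
        elif ratio < 0.75:
            buckets["Sometimes Shown"].append(friend)
        else:
            buckets["Mostly Shown"].append(friend)
    return buckets

# Example with made-up counts: Ana (9 of 10 shown) lands in "Mostly Shown",
# Ben (4 of 8) in "Sometimes Shown", Cam (0 of 4) in "Rarely Shown".
example = categorize_friends(
    {"Ana": 10, "Ben": 8, "Cam": 4},
    {"Ana": 9, "Ben": 4},
)
```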
Many users were appalled to see how Facebook had judged their friendships. “Well, I’m super frustrated [pointing to a friend’s story], because I would actually like to see their posts,” one participant told the researchers. When presented with the opportunity to recategorize their friends, participants in the study moved 43 percent of their friends.
Interestingly, though—and here’s where the study starts to look better for Facebook—the researchers let people reclassify the posts that actually ran in their news feeds. And in that case, the participants mostly agreed with the Facebook algorithm’s assessment of what they should be shown. They only chose to change 17 percent of the content in their feeds.
While some participants were upset by the idea that Facebook was changing their social experience, more than half of the study participants “came to appreciate the algorithm over the course of the study.” Most came to think that the filtering and ranking software was actually doing a decent job. “Honestly I have nothing to change which I’m surprised!” one said. “Because I came in like ‘Ah, they’re screwing it all!'”
To continue The Matrix analogy, it’s as if you took the red pill and just realized that the unfiltered world is a lot like the normal one, but a little bit crappier and with more posts by your lame uncles.
I’d posit another theory for why understanding that Facebook’s algorithm exists would lead to higher satisfaction with the social network. It gives us a catch-all scapegoat to blame for anything. Haven’t seen Susie’s posts for a while? It’s probably the algorithm. Haven’t heard from Tony? The algorithm probably didn’t show him your posts.
“Because I know now that not everything I post everyone else will see, I feel less snubbed when I make posts that get minimal or no response,” one participant concluded in a follow-up survey. “It feels less personal.”
In other words: the information that Facebook’s unaware users might find most useful is that, sometimes, it’s not that the world ignored you. It’s that the algorithm did.
* This post has been updated to correct Motahhare Eslami’s institution.