Last week, a Facebook spokesperson told me that users’ smartphone location data was among the signals the social network uses to suggest “People You May Know” in real life. But after I published a story about it (and people freaked out), Facebook retracted the statement and said that, actually, the company isn’t using location data to make friend suggestions.
Facebook’s communications team says the confusion arose because there was a brief time when the social network used location for friend suggestions. It involved a small percentage of Facebook users and stopped last year.
“We ran a small test to use city-level location to better rank existing [“People You May Know”] candidates and not all were aware that the test had ended,” said a Facebook spokesperson by email. “The test ran for four weeks at the end of 2015.”
I am not privy to details about how the test went or why it wasn’t rolled out more widely.
Since my story came out, I’ve heard many, many anecdotes from people who say that Facebook’s friend suggestions have included people they had recently met or seen in real life. Many people are absolutely convinced that Facebook must be using their location, somehow, to figure this out. I asked Facebook if there was another way it was inferring location, beyond GPS coordinates from smartphones, that might explain this happening, such as the use of IP addresses (which can be geomapped) or the shared use of a wireless network.
“We are not using location data to suggest people you might know,” said the spokesperson. “This includes IP and Wi-Fi access point location information.”
So the mystery remains. The spokesperson called the formula behind “People You May Know” complex. Publicly, Facebook says only that it “show[s] you people based on mutual friends, work and education information, networks you’re part of, contacts you’ve imported and many other factors.”
I asked if I could have a list of those “other factors.” My fingers are crossed but I’m not holding my breath.
This is the strange thing about the world we live in now. Tech companies produce algorithms that have access to unknown, but potentially vast, amounts of data about us. Those algorithms produce mysterious and perplexing results. But we don’t get to look into the black box that produces them, so we’re left with a sense of wonder or unease, depending on our level of paranoia.