Mary Kay Beckman was looking for love, so she, of course, went searching for it on the internet. She signed up for Match.com in September 2010, and after a few weeks matched with a nice-seeming guy. They went on a few dates before Beckman decided it wasn’t going to work out. She broke things off. Then he stabbed her.
Beckman, it turned out, was not the first Match date Wade Ridley had brutally maimed and assaulted. In a lawsuit she filed against the company, Beckman said she signed up for Match because she “wanted to experience the type of healthy and loving relationship the website claimed to foster.” But instead of “You’ve Got Mail,” she got “Psycho.”
In Beckman’s suit, she argued that the dating website was liable for a “failure to warn.” Match, in other words, should have informed her about the dangers of online dating. Specifically, it should have protected her against a man who had attacked multiple Match.com users.
Last week, the Court of Appeals for the Ninth Circuit in San Francisco overturned a lower court’s decision to dismiss the case. A Nevada district court had found that Match was protected by Section 230 of the Communications Decency Act, a 1996 law that holds that websites are not responsible for content provided by users on their sites. For the second time, the Ninth Circuit found that actually, in some cases, websites are. This is a pretty big deal.
In 1996, Congress passed Section 230 to protect the still nascent internet from being trampled by litigation. It was unreasonable, Congress felt, to expect websites to be able to police every single thing a user posted online. More importantly, such expectations would probably kill the young internet and stifle innovation. In a single, short statute, Congress protected websites from being sued or prosecuted for content posted by visitors. The Electronic Frontier Foundation has called Section 230 “the most important law protecting internet speech.”
In the 20 years since, though, things have become more complicated as more of the world has become filtered through the internet, and as the boundaries have blurred between online and off. In a 2014 opinion in a case involving a woman who was drugged, raped, and filmed by men she met through ModelMayhem.com, a job-seeking site for models, the Ninth Circuit wrote that the CDA was not “an all-purpose get-out-of-jail-free card for businesses that publish user content on the internet.” The court found that Model Mayhem, too, could be sued for failure to warn, since the site was aware of the model’s rapists: they were the subject of a criminal investigation for doing the same thing to other Model Mayhem users.
In the Model Mayhem case, the court found that Section 230 did not protect the website when it failed to do anything about the rapists it knew were prowling its site. Companies including Facebook, Craigslist and Tumblr launched a challenge to the 2014 ruling, but the Ninth Circuit ruled once again in May that the CDA does not protect Model Mayhem from being sued.
The Match.com case is very similar. Ridley, the suit claimed, had attacked other women using Match.com and the company had done nothing to warn love-seeking online daters about the possibility of attack. The Ninth Circuit upheld the dismissal of some claims, but it found that the logic supporting the court’s Model Mayhem ruling applied here, too.
“Nevada law provides that, when a defendant has actual knowledge of a specific harm, that defendant has a duty to warn known, foreseeable victims of known, foreseeable harms,” the court said.
Again, Section 230 did not protect the website from a lawsuit alleging a failure to warn.
“Blanket immunity has allowed corporations to put profits ahead of people because they can,” Beckman’s attorney, Marc Saggese, told me. “They could know a guy is dangerous and still leave his profile up to collect their $30 a month. Internet dating service providers cannot bury their heads in the sand anymore.”
Eric Goldman, a Santa Clara University law professor and staunch defender of Section 230, was not a fan of the opinion. He fears that such lawsuits open the internet up to an onslaught of litigious chaos.
“There are a virtually infinite number of potential risks that a website could warn users about, and plaintiffs can always find *something* that wasn’t disclosed,” he wrote on his blog. “If you think online user agreements are already too long and filled with too many irrelevant disclosures, you ain’t seen nothing yet.”
The Ninth Circuit’s ruling, he told me, will open the door to more lawsuits hoping to chip away at Section 230. Weakening the integrity of Section 230 would create a messy hierarchy of responsibility, forcing companies to spend resources and rely on third-party information to police their ranks.
“Section 230 is a pretty powerful shield,” he said. “But now we’ve found a hole in the shield.”
Over the past few years, companies have also struggled outside of court with exactly how much of their users’ behavior they are responsible for. Twitter has banned revenge porn, issued new anti-harassment rules, established a trust and safety council and banned high-profile users that it considers abusive. Reddit last year announced it would ban disturbing subreddits. Craigslist ads include a link to a page about common scams on the site and personal safety. Airbnb said publicly that it should play some part in its own regulation, and began sharing user data with cities.
In theory, this is how the CDA is meant to work. Websites should be responsible for regulating themselves. But figuring out how to balance freedom and safety has been messy. As we spend more of our time online, and encounter harms as a result, critics have been begging courts and lawmakers to rethink just how much immunity Section 230 should provide. Some feel that as websites have become platforms for not just online activity, but real-world interaction, a law written distinctly to protect online activity no longer makes sense.
“They’re arranging these real-world interactions and all of a sudden they say, ‘Oh, we have no responsibility for anything that goes wrong,'” Tom Slee, an author who writes about technology and politics, told Vice. “Section 230 has been stretched in so many different directions it’s becoming a travesty of what it was originally intended for.”
For example, people who run revenge porn websites have gleefully pointed toward Section 230 as protecting the content they put out, since it’s their users uploading naked photos of people against their will, not the people running the site.
Section 230 was written 20 years ago, when interactions on the internet were rare and using the web to do things offline was even rarer. It allowed the modern internet to become the amazing thing that it is today. But do we really want a company to have no legal obligation to do what it can to protect its users from life-threatening injuries and rape?
“Online dating sites cannot continue to be bubble gum machines dispensing sexual predators,” Carrie Goldberg, an attorney specializing in internet privacy and sexual consent, told me. “This ruling says it’s time for them to staff up their abuse and safety departments and start using due diligence to ban users who are reported as predators. They will also need to create safeguards to ensure that predators do not simply mask their identity and create new accounts.”
So far, few challenges to Section 230 have prevailed. In 2012, Washington State passed a law that made it a crime to publish sex ads depicting minors, with the intent of forcing the classified site Backpage to verify that all of the escorts who advertise on it are over 18. After Backpage sued, the state agreed to repeal the law and pay Backpage $200,000 in legal fees.
The next year, a group of State Attorneys General asked Congress for special exceptions to Section 230 to allow states to prosecute “internet companies, including potentially their executives, for violations of state criminal law for their online publication of third party content.” That effort also failed.
Even the Model Mayhem case may not ultimately prevail. The website’s parent company, Internet Brands, could still appeal the case to the U.S. Supreme Court, as could Match.com.
“So far, none of these things have had any traction,” said David Green, an attorney for the Electronic Frontier Foundation, which supports Section 230.
The fact that the same court has twice upheld a teeny loophole in the law for failure-to-warn claims, he said, isn’t all that significant.
“What would be more significant,” he said, “is if another court adopts this.”
For now Beckman’s case will move forward, and will determine whether Match.com has any legal responsibility for the January evening when she was attacked. After months of harassing texts, Ridley hid in Beckman’s garage and waited for her to come home. Then he stabbed her 10 times in the head, face, and upper body. She suffered severe brain trauma and a seizure; after three surgeries, she is still undergoing treatment five years later.
So far, Beckman has not won anything other than her right to sue Match.com over one claim. She will have to establish that Ridley not only used Match to assault other women, but that Match knew that he did. Even then, a court may not find that Match had an obligation to its customer to do anything about it. If she does win, a higher court could still overturn the ruling on appeal.
In 2012, Match, which responded to my request for comment only to note that it had defeated Beckman’s three other claims, was among several online dating sites to publicly commit to educating users about online dating safety after pressure from California Attorney General Kamala Harris. The sites pledged to hunt down fake profiles and check sex offender registries to prevent registered offenders from signing up for their sites.
Now Match.com offers its users a thorough page of online safety tips. And on its terms of service page, it also offers them this disclaimer: “The Company is not responsible for the conduct of any Member.”