Algorithms are failing Facebook. Can humanity save it?

Article from Quartz

The video was on Facebook for 20 hours. It showed Wuttisan Wongtalay, a 20-year-old man in Thailand, slipping a noose over his 11-month-old daughter’s neck and dropping her off the side of a building to her death. Off camera, Wongtalay then committed suicide, but his Facebook video outlived him, continuing to spread for nearly an entire day.

According to a Wall Street Journal tally, more than 50 acts of violence, including murders, suicides, and sexual assaults, have been broadcast over Facebook Live since the feature launched 13 months ago. Until recently, it seemed the company’s only response to humanity’s inhumanity was the promise of better technology. After one user broadcast his intention to kill someone over Facebook Live on April 16, and minutes later made a video in which he fatally shot a 74-year-old man, Facebook’s vice president of global operations, Justin Osofsky, declared that the company was “exploring ways that new technologies can help us make sure Facebook is a safe environment.”

But after Wongtalay streamed video of himself murdering his daughter, the response was somewhat different. In a status update on his personal Facebook profile, CEO Mark Zuckerberg, himself the father of a young girl, pledged that the company would, among other things, add 3,000 people to the team that reviews Facebook content for violations of the company’s policies.

Zuckerberg also announced the company’s intention to develop new tools to more quickly report trouble and to more easily loop in law enforcement when necessary. But the only concrete piece of the plan he laid out—the hirings—suggested the easiest, most obvious answer to Facebook’s newest people problems is not more technology, but more humans.

Facebook wanted to be pipes, but it’s also people

Early in the company’s history, Zuckerberg referred to Facebook as a “utility,” a piece of “information infrastructure.” In a letter to potential shareholders in 2012, he compared the social network to the printing press and the television.

But television manufacturers and printing press makers have no reason to understand the difference between a historical photo and a piece of pornography, to consider how to classify photos of breastfeeding mothers, or to debate whether an exception should be made for Donald Trump’s hate speech. They merely make the tools for distributing content.

Facebook, on the other hand, built both a content-distribution platform and a global community—“social infrastructure,” as Zuckerberg more recently described it—and its role in that community ended up being both toolmaker and governing institution. Facebook doesn’t just enable communication, but sets the boundaries and rules around it. And its influence—whether on culture, on elections, or on anything else beyond its own digital borders—means that those decisions impact us all, whether or not we use Facebook.

Had Facebook been thinking about Facebook Live as more than a neutral technology product, it might have anticipated what Zeynep Tufekci, an associate professor at the University of North Carolina who studies online speech issues, foresaw: “It was pretty clear to me that this would lead to on-camera suicides, murder, abuse, torture,” she told the New York Times. “The FBI did a pretty extensive study of school shooters: The infamy part is a pretty heavy motivator.”

Facebook, in a civic mindset, could have put a plan in place for monitoring Facebook Live for violence, or waited to launch Facebook Live until the company was confident it could quickly respond to abuse. It could have hired the additional 3,000 human content reviewers in advance.

But Facebook “didn’t grasp the gravity of the medium,” an anonymous source familiar with Facebook Live’s development told the Wall Street Journal. Zuckerberg, excited about the feature’s potential to attract young users, reportedly put 100 employees on “lockdown” to rush Live out the door.

Facebook needs technical scale with human judgment

Guns can show up in all kinds of places. Cowboys have held them in western films. Police officers have held them in footage of unnecessary force. Murderers have held them as they broadcast their crimes on Facebook Live.

Today’s technology is only capable of identifying the gun in each of these scenes, not the context around it. (The current state of artificial intelligence similarly comes up short when attempting to label fake news.) If Facebook did not understand what a product like Live would require of it in terms of precautions and safeguards, it at least should have understood this limitation.
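
To make that limitation concrete, here is a minimal sketch, assuming a hypothetical vision model that returns only object labels for each scene (the scenes, labels, and flag_if_gun rule are invented for illustration):

```python
# A minimal sketch of why object labels alone can't moderate content.
# The label lists below are invented; real image-classification APIs
# return similar per-image label sets.

# Mock detector output: the labels a model might attach to each scene.
scenes = {
    "western_film_clip":      ["person", "horse", "gun", "desert"],
    "police_bodycam_footage": ["person", "uniform", "gun", "street"],
    "live_streamed_crime":    ["person", "gun", "room"],
}

def flag_if_gun(labels):
    """Naive moderation rule: flag any scene whose labels include a weapon."""
    return "gun" in labels

for name, labels in scenes.items():
    print(name, "->", "FLAGGED" if flag_if_gun(labels) else "ok")

# All three scenes are flagged identically: the labels say nothing about
# intent, setting, or whether a crime is actually occurring.
```

A label-matching rule fires on all three scenes equally; telling a western apart from a live-streamed murder requires exactly the contextual judgment described below.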

“Teaching a machine when something becomes a thing we pay attention to, because the right combination of elements is there, is really, really difficult,” says Joe Moran, a product research specialist at Cogito, a company that makes real-time “emotional intelligence” software. “I don’t believe the major insight needed to solve this context problem has occurred yet.”

Humans are better at understanding context. But they can’t solve Facebook’s problems on their own. Facebook’s active users comprise about a quarter of the world’s population and outnumber the combined populations of the US and China. Adding another 3,000 workers to the mix to monitor content simply isn’t going to make a meaningful difference. As Zuckerberg put it during a phone call with investors, “No matter how many people we have on the team, we’ll never be able to look at everything.”
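
A rough back-of-the-envelope calculation makes the point; the figures below are approximate, taken from Facebook’s reported user count at the time and the existing review team Zuckerberg said the new hires would join:

```python
# Back-of-the-envelope scale check (2017 figures, approximate).
monthly_active_users = 1_940_000_000  # Facebook's reported MAU, early 2017
existing_reviewers = 4_500            # review team size Zuckerberg cited
added_reviewers = 3_000               # the newly announced hires

per_reviewer = monthly_active_users / (existing_reviewers + added_reviewers)
print(f"~{per_reviewer:,.0f} users per reviewer")  # roughly 259,000
```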

In other words, the task of removing harmful content from Facebook is too complicated for current technology, and the scale is too big for a team of humans.

For now, the only solution is a hybrid one—a mix of people and algorithms. When Facebook attempted to address fake news, for example, it changed its “trending topics” algorithm but also lined up help from fact-checkers at outlets like Snopes, the Associated Press, and PolitiFact.
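
As a sketch of what such a hybrid pipeline might look like, assuming a hypothetical model_score classifier and illustrative thresholds (this is not Facebook’s actual system):

```python
from collections import deque

review_queue = deque()  # posts awaiting human judgment

def model_score(post):
    """Stand-in for a trained classifier returning P(post violates policy)."""
    return post.get("score", 0.0)

def triage(post, auto_remove_at=0.98, needs_human_at=0.60):
    """Route a post based on how confident the model is."""
    score = model_score(post)
    if score >= auto_remove_at:
        return "removed"             # machine is confident enough to act alone
    if score >= needs_human_at:
        review_queue.append(post)    # ambiguous: a human judges the context
        return "queued for review"
    return "published"               # low risk: no action taken

posts = [
    {"id": 1, "score": 0.99},  # e.g. hash-matched to a known banned video
    {"id": 2, "score": 0.75},  # e.g. a gun is visible, context unclear
    {"id": 3, "score": 0.10},
]
for post in posts:
    print(post["id"], triage(post))
```

The two thresholds are the design choice: the machine acts alone only where it is most confident, and everything ambiguous lands with a person who can judge context.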

The combination may be fleeting—the technology will catch up eventually—but it’s also quite fitting given that so many of the problems Facebook is now confronting, from revenge porn to fake news to Facebook Live murders, are themselves the result of humanity mixing with algorithms.
