
Who will watch Facebook?

Updated on: 12 April 2020, 12:00 AM IST

A new book digs deep into the world of the largest social network, its founder, and the company's struggles with privacy and freedom of speech


Facebook Chief Executive Officer and founder Mark Zuckerberg at a Dublin hotel in April 2019, after meeting with Irish politicians to discuss regulation of social media, transparency in political advertising, and the safety of young people and vulnerable adults

Facebook had a torturous time with trash-talking provocateurs from the right. When the white-nationalist conspiracy monger Alex Jones repeatedly posted comments that seemed to violate Facebook's hate-speech rules, the company was loath to ban him. Complicating matters, Jones was an individual, while his Facebook page, InfoWars, was an operation staffed by several people.

The situation was radioactive. Fringe as he was, Jones had a huge following, including the president, who had been a guest on the InfoWars radio show. Did Jones's newsworthiness make him a figure like the president, worthy of a hate-speech free pass? During the summer of 2018, the controversy raged, as reporters kept citing hateful posts. Ultimately, it was pressure from other platforms that forced the issue: within hours after Apple took down his podcast, Zuckerberg himself pulled the plug on InfoWars. Jones was suspended for thirty days, and eventually Facebook would ban him as "dangerous." It did this in tandem with the expulsion of the fierce-tongued Nation of Islam leader Louis Farrakhan, in what appeared to be an unmistakable play for balance.


When I pressed Zuckerberg in early 2018 about Facebook's delicacy in handling GOP complaints, he bent over so far backward in respecting their point of view that I worried his chair would hit the floor. "If you have a company which is ninety percent liberal—that's probably the makeup of the Bay Area—I do think you have some responsibility to make sure that you go out of your way and build systems to make sure that you're not unintentionally building bias in," he told me. Then, ever balancing, he mentioned that Facebook should monitor whether its ad systems discriminated against minorities. Indeed, Facebook would commission studies of each of those areas. Part of Zuckerberg's discomfort arises from his preference for less oversight. Even while acknowledging that content on Facebook can be harmful or even deadly, he believes that free speech is liberating. "It is the founding ideal of the company," he says. "If you give people a voice, they will be able to share their experiences, creating more transparency in the world. Giving people the personal liberty to share their experiences will end up being positive over time."


Still, it was clear that Zuckerberg did not want the responsibility of policing the speech of more than 2 billion people. He wanted a way out, so he wouldn't have to make decisions on Alex Jones and hate speech, or judge whether vaccines caused autism. "I have a vision around why we built these products to help people connect," he said. "I do not view myself or our company as the authorities on defining what acceptable speech is. Now that we can proactively look at stuff, who gets to define what hate speech is?" He hastened to say that he wasn't shirking this responsibility, and Facebook would continue policing its content. "But I do think that it may make more sense for there to be more societal debate and at some point even rules that are put in place around what society wants on these platforms and doesn't."

As it turns out, Zuckerberg was already formulating a plan to take some of the heat off Facebook for those decisions. It involved an outside oversight board to make the momentous calls that were even above Mark Zuckerberg's galactic pay grade. It would be like a Supreme Court of Facebook, and Zuckerberg would have to abide by the decisions of his governance board.


Setting up such a body was tricky. If Facebook did it completely on its own, the new institution would be thought of as a puppet constrained by its creator. So it solicited outside advice, gathering a few hundred domain experts in Singapore, Berlin, and New York City for workshops.

After listening to all these great minds, Facebook would take whichever parts of the recommendations it saw fit and create a board with the right amounts of autonomy and power.

I was one of 150 or so workshop participants at the NoMad Hotel gathering in New York City's Flatiron district. Sitting at tables in a basement ballroom were lawyers, lobbyists, human rights advocates, and even a couple of us journalists. For much of the two-day session we dug into a pair of individual cases, second-guessing the calls. One of them was the "men are scum" case that had been covered a few times in the press.

A funny thing happened. As we got deeper into the tensions of free expression and harmful speech, there was a point where we lost track of the criteria that determined where the line should be drawn. The Community Standards that strictly determined what stood and what was taken down were not some Magna Carta of online speech rights but a meandering document that had evolved from the scribbled notes of customer-support people barely out of college.

The proposed board would be able to overrule something in that playbook for the individual cases it considered, but Facebook provided no North Star to help us draw the line—just a vague standard touting the values of Safety, Voice, and Equity. What were Facebook's values? Were they determined by morality or dictated by its business needs? Privately, some of the Facebook policy people confessed to me that they had profound doubts about the project.

I could see why. For one thing, the members of this proposed body (there would be forty, chosen by two people appointed by Facebook) could take on only a tiny fraction of Facebook's controversial judgment calls. In the first quarter of 2019, about 2 million people appealed Facebook content decisions. Facebook would have to abide by the decisions on individual cases, but it would be up to Facebook to determine whether the board's rulings would be regarded as precedent or limited to the individual pieces of content ruled on, whether out of expedience or because it judged them lousy calls.

One thing seems inevitable: an unpopular decision by a Facebook Supreme Court would be regarded just as harshly as one made by Zuckerberg himself. Content moderation may be outsourced, but Facebook can't outsource responsibility for what happens on its own platform. Zuckerberg is right when he says that he or his company should not be the world's arbiter of speech. But by connecting the world, he built something that put him in that uncomfortable position.

He owns it. Christchurch and all.

Excerpted with permission from Facebook: The Inside Story by Steven Levy, published by Penguin Random House UK
