Facebook Employees Frustrated With Zuckerberg After Kenosha


Following days of violence and civil unrest, Facebook employees wonder if their company is doing enough to stifle militia and QAnon groups stoking violence on the social network.

Last updated on August 28, 2020, at 3:10 p.m. ET

Posted on August 28, 2020, at 1:07 p.m. ET


Facebook / Via Facebook: zuck

Mark Zuckerberg at Facebook’s all-hands meeting.

Frustrated Facebook employees slammed CEO Mark Zuckerberg on Thursday during a companywide meeting, questioning his leadership and decision-making, following a week in which the platform promoted violent conspiracy theories and gave safe harbor to militia groups. The billionaire chief executive was speaking via webcast at the company’s weekly all-hands meeting, attempting to address questions about violence in Kenosha, Wisconsin, and the QAnon conspiracy that has proliferated across Facebook.

The meeting came one day after The Verge reported that a self-proclaimed militia group calling itself “Kenosha Guard” had used its Facebook page to issue a “call to arms” — violating Facebook’s own policies — and remained online even though at least two people reported it before the shooting. It also followed weeks of employee unrest, in which the company’s rank and file have urged the CEO to combat the spread of QAnon-related content on its platform.

All of Facebook’s more than 50,000 employees can watch and comment on the stream during the meeting, or view a recording after its conclusion — and as Zuckerberg spoke, angry comments poured in.

“At what point do we take responsibility for enabling hate filled bile to spread across our services?” wrote one employee. “[A]nti semitism, conspiracy, and white supremacy reeks across our services.”

There has been increasing internal strife at the social network that came to a head when the company failed to take action on a May post from President Donald Trump that suggested state violence would be used against people protesting the police killing of George Floyd. As internal morale has plummeted, some employees have openly challenged Zuckerberg, who maintains majority shareholder voting control and complete decision-making power at Facebook. The level of employee pushback, which included a virtual walkout in June, is unprecedented in the company’s 16-year history.

Zuckerberg opened his Thursday address by discussing the police shooting of Jacob Blake and subsequent violence in Kenosha. When he said images from Wisconsin were “painful and really discouraging,” employees jumped in the comments section to ask why Facebook had been slow to react, particularly after at least one Kenosha militia page had remained on the platform after a 17-year-old shot and killed two protesters on Tuesday night.

While Zuckerberg said the shooter’s Facebook and Instagram accounts, which showed no clear connection to the Kenosha Guard page, were taken down, he admitted Facebook had made “an operational mistake.” The page had violated Facebook’s new rules introduced last week that labeled militia and QAnon conspiracy theory groups as “Dangerous Individuals and Organizations” for their celebrations of violence.

The company did not catch the page despite user reports, Zuckerberg said, because the complaints had been sent to content moderation contractors who were not versed in “how certain militias” operate. “On second review, doing it more sensitively, the team that was responsible for dangerous organizations recognized that this violated the policies and we took it down.”

That answer didn’t satisfy some employees.

“We need to get better at avoiding mistakes and being more proactive,” one wrote. “Feels like we’re caught in a cycle of responding to damage after it’s already been done rather than constructing mechanisms to nip these issues before they result in real harm.”


Facebook spokesperson Liz Bourgeois called the shooting in Kenosha “painful for everyone, especially our Black community.”

“We have not found evidence on Facebook suggesting the shooter followed the Kenosha Guard Page or that he was invited on the Event Page they organized,” she said in a statement. “However, the Kenosha Guard Event and Page violated our new policy on militia organizations and have been removed on that basis. We launched this policy last week and we’re still scaling up our enforcement of it by a team of specialists on our Dangerous Organizations team.”

In discussion groups on Workplace, Facebook’s employee discussion board, workers had expressed their frustration prior to the all-hands meeting. “Feeling especially sad to be a FB employee today,” one person wrote on a post featuring The Verge article in an internal group for community standards feedback. “When will we finally start to take hate on our platform seriously?”

Not all employees took issue with the police shooting of Blake and subsequent militia violence. Some expressed sympathy for the police or posted slogans meant to denigrate Black Lives Matter, illustrating the culture war playing out within the social network’s ranks.

“What are your thoughts about our employees posting all lives matter or blues lives matter after shooting to [sic] Jacob Blake?” asked one employee during the all-hands. Zuckerberg did not respond to the question.

In a Workplace post that was seen by thousands on Wednesday — hours after two people were killed in Kenosha during protests against police violence — one employee asked his colleagues to show their “Support for Law Enforcement.” The employee, who posted the pro-police image of the American flag with a thin blue line, wrote “mourning the death of those who serve our communities does not mean supporting injustice by anyone in society.”

Some responses argued that the post was “deliberate trolling” and a “symbol of racial injustice.”

“I’m extremely appalled of the use of our internal tools to spread such a divisive message that lacks any ounce of unity,” one person wrote in the comments of the post, which was eventually removed.

Ifeoma Ozoma, a former associate manager on Facebook’s global policy team, told BuzzFeed News that Facebook’s inability to stop the perpetuation of hate on its site was evidence of “the values of those in leadership.”

“Violent white supremacists thrive on the platforms run by people who are bought into the perpetuation of a system of white supremacy, or at a minimum, refuse to reckon with it,” she said.

At Thursday’s meeting, Zuckerberg said the company was “proactively” searching for and removing content that praised the Kenosha shooting, though the Guardian showed that fundraisers for and memes of the shooter continued to proliferate on Facebook.


Scott Olson / Getty Images

A Donald Trump supporter holding a QAnon flag visits Mount Rushmore National Monument in Keystone, South Dakota, July 1.

Facebook’s inability to enforce its own policies also came into sharp focus as its CEO discussed its changed approach and crackdown on QAnon, a pro-Trump conspiracy theory that holds that a secret cabal of pedophiles and cannibals control the government and other aspects of society. Zuckerberg pointed to an enforcement action taken last week in which the company removed over 790 groups, 100 pages, and 1,500 ads tied to QAnon, noting that the reason why the company was acting now was because “we’re seeing QAnon evolve” from misinformation to the promotion of violence.

“As people start discussing violence or something that has the potential to lead to violence, especially as we enter a very fraught and charged period and the likelihood of potential civil unrest leading up to the election and after, I think that that warrants having a somewhat different balance of free expression and safety,” he said. (There have been multiple incidents of QAnon followers inciting violence over the last year, and the FBI labeled it a domestic terrorism threat in May 2019.)

Still, some employees wondered why it had taken the company so long to act.

“Our fact-checking and takedowns have only ramped up in the past few months, but QAnon has festered for 3 years,” one person wrote. “I think the critique is our reactive vs. proactive approach.”

Other workers reported they still saw dozens of active QAnon groups, with one noting that they had reported a related page but were told it did not go against Facebook’s community standards. “Our enforcement makes no sense,” they said.

A BuzzFeed News search using Facebook-owned tool CrowdTangle on Thursday showed dozens of new posts associated with popular QAnon hashtags and slogans, while one reporter’s Facebook account received a notification for a new photo that had been added to a group called “QAnon Movement.”

As Zuckerberg spoke on Thursday, he did little to assuage the flood of complaints that streamed in over the live feed. He called the company’s approach to tackling QAnon “a more sophisticated program on this than any other company.”

One employee who spoke with BuzzFeed News after the event was not comforted by their CEO’s words.

“He seems truly incapable of taking personal responsibility for decisions and actions at Facebook,” they said.

With reporting from Jane Lytvynenko and Pranav Dixit.

UPDATE

After this story was published, Facebook made video of part of the employee meeting public.

Parler’s website shows signs of life after AWS fallout

The Parler website home screen on a laptop computer arranged in the Brooklyn borough of New York, U.S., on Friday, Dec. 18, 2020.

Gabby Jones | Bloomberg | Getty Images

The website of Parler — a social media platform popular with conservatives and supporters of President Donald Trump — is back online, albeit in a very limited form.

Unlike for much of last week, the website now loads and displays a brief message from Parler CEO John Matze that reads: “Hello world, is this thing on?”

The Parler website dropped offline on Jan. 11 after Amazon withdrew its support in the wake of the deadly U.S. Capitol riot. The website was reliant on cloud computing power provided by Amazon Web Services.

AWS withdrew its support for Parler on Jan. 10 after concluding that posts on the company’s website and apps encourage and promote violence.

“It is clear that there is significant content on Parler that encourages and incites violence against others, and that Parler is unable or unwilling to promptly identify and remove this content, which is a violation of our terms of service,” said an AWS spokesperson.

They added: “We made our concerns known to Parler over a number of weeks and during that time we saw a significant increase in this type of dangerous content, not a decrease, which led to our suspension of their services.”

Matze said in a statement that Parler removed violent content and added that its community guidelines don’t allow Parler to be knowingly used for criminal activity.

The Parler mobile apps are still nowhere to be seen. Google and Apple removed the Parler app from their app stores on Jan. 8 and Jan. 9 respectively.

Technical difficulties

While the Parler website is no longer completely offline, it’s still experiencing technical difficulties and Parler users can’t use it in the way that they did previously.

Below Matze’s brief message is a post about the company’s ongoing technical difficulties.

“Now seems like the right time to remind you all — both lovers and haters — why we started this platform. We believe privacy is paramount and free speech essential, especially on social media. Our aim has always been to provide a nonpartisan public square where individuals can enjoy and exercise their rights to both,” the post reads.

It continues: “We will resolve any challenge before us and plan to welcome all of you back soon. We will not let civil discourse perish.”

Amazon-Parler Lawsuit

Parler has sued Amazon for withdrawing its support for the company. In a lawsuit filed Jan. 11 in U.S. District Court in Seattle, Parler accused Amazon Web Services of breaking antitrust laws.

“AWS’s decision to effectively terminate Parler’s account is apparently motivated by political animus,” the lawsuit said. “It is also apparently designed to reduce competition in the microblogging services market to the benefit of Twitter.”

It continues: “This emergency suit seeks a Temporary Restraining Order against defendant Amazon Web Services to prevent it from shutting down Parler’s account. Doing so is the equivalent of pulling the plug on a hospital patient on life support. It will kill Parler’s business — at the very time it is set to skyrocket.”

An AWS spokesperson told CNBC there’s no merit to the claims, and Twitter declined to comment.


The silencing of Trump has highlighted the authoritarian power of tech giants


It was eerily quiet on social media last week. That’s because Trump and his cultists had been “deplatformed”. By banning him, Twitter effectively took away the megaphone he’s been masterfully deploying since he ran for president. The shock of the 6 January assault on the Capitol was seismic enough to convince even Mark Zuckerberg that the plug finally had to be pulled. And so it was, even to the point of Amazon Web Services terminating the hosting of Parler, a Twitter alternative for alt-right extremists.

The deafening silence that followed these measures was, however, offset by an explosion of commentary about their implications for freedom, democracy and the future of civilisation as we know it. Wading knee-deep through such a torrent of opinion about the first amendment, free speech, censorship, tech power and “accountability” (whatever that might mean), it was sometimes hard to keep one’s bearings. But what came to mind continually was H L Mencken’s astute insight that “for every complex problem there is an answer that is clear, simple and wrong”. The air was filled with people touting such answers.

In the midst of the discursive chaos, though, some general themes could be discerned. The first highlighted cultural differences, especially between the US, with its sacred first amendment, on the one hand and, on the other, European and other societies with more ambivalent histories of moderating speech. The obvious problem with this line of discussion is that the first amendment is about government regulation of speech and has nothing whatsoever to do with tech companies, which are free to do as they like on their platforms.

A second theme viewed the root cause of the problem as the lax regulatory climate in the US over the last three decades, which led to the emergence of a few giant tech companies that effectively became the hosts for much of the public sphere. If there were many Facebooks, YouTubes and Twitters, so the counter-argument runs, then censorship would be less effective and problematic because anyone denied a platform could always go elsewhere.

Then there were arguments about power and accountability. In a democracy, those who make decisions about which speech is acceptable and which isn’t ought to be democratically accountable. “The fact that a CEO can pull the plug on Potus’s loudspeaker without any checks and balances,” fumed EU commissioner Thierry Breton, “is not only confirmation of the power of these platforms, but it also displays deep weaknesses in the way our society is organised in the digital space.” Or, to put it another way, who elected the bosses of Facebook, Google, YouTube and Twitter?

What was missing from the discourse was any consideration of whether the problem exposed by the sudden deplatforming of Trump and his associates and camp followers is actually soluble – at least in the way it has been framed until now. The paradox that the internet is a global system but law is territorial (and culture-specific) has traditionally been a way of stopping conversations about how to get the technology under democratic control. And it was running through the discussion all week like a length of barbed wire that snagged anyone trying to make progress through the morass.

All of which suggests that it’d be worth trying to reframe the problem in more productive ways. One interesting suggestion for how to do that came last week in a thoughtful Twitter thread by Blayne Haggart, a Canadian political scientist. Forget about speech for a moment, he suggests, and think about an analogous problem in another sphere – banking. “Different societies have different tolerances for financial risk,” he writes, “with different regulatory regimes to match. Just like countries are free to set their own banking rules, they should be free to set strong conditions, including ownership rules, on how platforms operate in their territory. Decisions by a company in one country should not be binding on citizens in another country.”

In those terms, HSBC may be a “global” bank, but when it’s operating in the UK it has to obey British regulations. Similarly, when operating in the US, it follows that jurisdiction’s rules. Translating that to the tech sphere, it suggests that the time has come to stop accepting the tech giants’ claims to be hyper-global corporations, whereas in fact they are US companies operating in many jurisdictions across the globe, paying as little local tax as possible and resisting local regulation with all the lobbying resources they can muster. Facebook, YouTube, Google and Twitter can bleat as sanctimoniously as they like about freedom of speech and the first amendment in the US, but when they operate here, as Facebook UK, say, then they’re merely British subsidiaries of an American corporation incorporated in California. And these subsidiaries obey British laws on defamation, hate speech and other statutes that have nothing to do with the first amendment. Oh, and they pay taxes on their local revenues.

What I’ve been reading

Capitol ideas
What Happened? is a blog post by the Duke sociologist Kieran Healy, which is the most insightful attempt I’ve come across to explain the 6 January attack on Washington’s Capitol building.

Tweet and sour
How @realDonaldTrump Changed Politics — and America. Derek Robertson in Politico on how Trump “governed” 140 characters at a time.

Stay safe
The Plague Year is a terrific New Yorker essay by Lawrence Wright that includes some very good reasons not to be blasé about Covid.




Revealed: Tory MPs and commentators who joined banned app Parler


At least 14 Conservative MPs, including several ministers, cabinet minister Michael Gove and a number of prominent Tory commentators joined Parler, the social media platform favoured by the far right that was forced offline last week for hosting threats of violence and racist slurs.

Parler was taken offline after Amazon Web Services pulled the plug last Sunday, saying violent posts and racist threats connected to the recent attack on the US Capitol violated its terms.

Analysts from the London-based Institute for Strategic Dialogue (ISD) said that Parler had become a platform where the ideas of mainstream Conservative MPs coalesced with those of extremists.

Milo Comerford, senior policy manager at ISD, said: “By positioning themselves as a safe haven for free speech and an alternative to the alleged ‘liberal bias’ of social media giants such as Facebook and Twitter, platforms like Parler attracted a motley crew of ultra-libertarians, violent extremists and conspiracy theorists, as well as more mainstream ‘free speech fundamentalists.’”

At least nine of the Tory MPs on Parler joined the platform in an apparent show of support for free speech following Donald Trump’s clashes with Twitter over remarks he made following the death of George Floyd in Minnesota last year.

The social media giant warned that one of Trump’s tweets “glorified violence”, the first time it had applied such a warning on any public figure’s tweets. Twitter’s row with Trump prompted a campaign by American rightwing voices to move en masse to Parler, which encouraged Trump followers to join on 15 June with a declaration for internet independence.

Days later, Foreign Office minister James Cleverly along with Brexiter Steve Baker MP and Ben Bradley MP, who was recently accused of linking free school meals with “crack dens”, joined Parler. Far-right provocateur Katie Hopkins joined on the same day, after her Twitter account was permanently suspended. Hopkins, who on Thursday joined Ukip in time for the party’s leadership contest, owned Parler’s largest UK account with 435,000 followers when it was taken offline.

Other Tory MPs to join Parler on the same day as Hopkins include Mark Jenkinson, who last year alleged that food parcels were sold or traded for drugs in his Cumbrian constituency without offering any proof, and trade minister Ranil Jayawardena.

Tory MP Nadine Dorries.
Tory MP Nadine Dorries. Photograph: Will Oliver/EPA

Health minister Nadine Dorries joined Parler on 21 June. Dorries had weeks earlier been reprimanded by Downing Street for sharing a video from a far-right Twitter account that falsely claimed Keir Starmer blocked the prosecution of grooming gang members when he led the Crown Prosecution Service.

The most prolific Tory MP on the site was Bradley, who sent 52 “parleys” and had more than 12,000 followers. Gove sent at least 26 parleys and had more than 5,000 followers. There is no evidence any Conservative MP posted anything untoward or what might be considered extremist or far right. Some of the accounts had hardly been used, and some of those activated in June 2020 appear to have been set up only to support the free speech protest.

Other notable Conservative figures on the site include pro-Brexit campaigner Darren Grimes. In June he told his followers on Parler that “it’s about time we fought back against big tech’s assault on free speech, free expression and freedom of association”.

Comerford added: “Platforms like Parler must be understood as part of a broad online extremist ecosystem, ranging from mainstream social media platforms, imageboard sites like the chans, to encrypted-messenger apps like Telegram, all of which play roles in helping extremists to mobilise, organise and propagandise.”

• This article was amended on 16 January 2021 to remove the statement that Maria Caulfield was on Parler. There is a fake account in the MP’s name on the site.

