Despite what the company’s stock price might tell you, Facebook is a company with an image problem. At best, critics view Facebook with suspicion as a company that ruthlessly strips users of their data so that it can crush competitors in increasingly hostile ways. At worst, people call the company a threat to democracy itself. And as the case against Facebook continues to grow, employees have scrambled to figure out how the heck the company can win back the public – and have come up empty-handed.
At least, that’s what some internal research from September of last year suggests, which attempted to measure Facebook’s “perceived legitimacy” in the eyes of the public and stakeholders. The full document, which you can read here, asked a handful of journalists, regular users and (strangely enough) actors about their general perceptions of Facebook.
The results were pretty much what you’d expect: Confidence in the company was low, confusion over the company’s content moderation processes was high, and hardly anybody believed Facebook was motivated by anything but big stacks of money. The researchers’ proposed approach to resolving this public relations crisis? “Build Trust Through Product Experiences,” get more people of color on staff and, uh, not much else.
“Users don’t trust us to do the right thing because they think we prioritize revenue and growth over safety and society,” explained an anonymous member of an internal Facebook “legitimacy team” whose stated mission is to boost the company’s legitimacy in the public eye.
While the research took place over a year ago, we heard that same story repeated over and over again from the source of these documents – Facebook whistleblower Frances Haugen – just this month. CEO Mark Zuckerberg, meanwhile, remains resistant to the idea despite all imaginable evidence that it’s true, and the company has reportedly considered a full name change.
“Because users don’t trust FB due to past incidents, they don’t believe we have good intentions or motives for integrity efforts,” the report said. “Users don’t see our content regulation system as legitimate because they don’t trust our motives.”
Ignoring the fact that Facebook is a business and that businesses typically exist to generate profit, the report notes that users “perceive [Facebook’s] systems are ineffective and biased towards minority groups,” citing the experiences of Facebook users who are LGBTQ+, as well as people of color and other marginalized groups. The report states that these users feel that “FB is censoring or over-enforcing minority groups,” and describes them being banned from the site “for speaking to their communities about their lived experiences.”
While Zuckerberg and his ilk have spent a lot of time ignoring the very apparent fact that the company’s hate speech detection systems tend to unfairly target marginalized groups, Facebook has since come around to the idea that, hey, maybe it should do something about the problem. Last December, the company launched an internal effort to overhaul the moderation systems involved, but this report (rightly!) acknowledges that that might not be enough.
“Many participants recognized that a large part of this enforcement is carried out by automation and algorithms,” the report reads. At the same time, they “think the people who built the algorithms are naive at best and racist at worst.” (To be clear: both can be true!)
Facebook has yet to respond to a request for comment on the internal report.
Artificial intelligence – and the algorithms that drive much of Facebook’s moderation efforts – are often built by white guys, with white biases. The report recommends bringing more members of “minority groups” to the table when developing its algorithms to mitigate those ingrained biases, as well as “[conducting] audits of actions” taken on the content of people of color. Two very good ideas! Unfortunately, it’s all downhill from here.
Most of the report’s other suggestions for restoring trust in the company are light on details. Recommendations such as “continue to invest in restoring trust in the FB brand” and “build trust by ensuring that what we ship is neat,” for example, are just plain nonsense. When surveyed users said that a company of Facebook’s size and scale should devote more of its resources to moderation, the report dismissed any mention of money, focusing instead on, uh, how difficult content moderation is.
“The narrative that regulating content is difficult and complex may not land well with users,” the report reads. “Instead, we should explore whether focusing on highlighting what we are doing to solve these problems would be more effective.”
How? With the same weirdly aggressive public relations tactics Facebook’s public-facing staff have taken up? With the same slickly deceptive blog posts and company policies? I don’t know, and the report doesn’t say. But it looks like Facebook’s plan to tackle all this bad press is just to… keep doing what it always has.
This story is based on Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news agencies, including Gizmodo, The New York Times, Politico, The Atlantic, Wired, The Verge, CNN, and dozens of other media outlets.