Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.
In 2004, Mark Zuckerberg built an app to connect Harvard undergrads to one another. By 2006, it was available to anyone over the age of 13. Soon thereafter, his Facebook (FB) social media firm was animated by the concept that connectivity was a human right for the world’s billions. FB is now visited by almost 3 billion distinct users each month. The firm has become a monopoly, counting Instagram and WhatsApp among its divisions. (Further details appear in Chapters 11 and 17 of Digital Dreams Have Become Nightmares: What We Must Do.)
FB’s dominance has led to serious, well-known problems. Its news feed widely shares toxic material: misinformation, hate speech, and fake news. People post private information that FB exploits commercially through surveillance capitalism. Fake social media accounts constructed by Russia have skewed the results of the 2016 US presidential election and other elections. Children’s addiction to social media harms their sense of self-worth and their physical and mental health and well-being.
Mark Zuckerberg had to testify before Congress after Facebook’s role in the 2016 election became understood, uttering mea culpas, admitting that Facebook has problems of public trust, and promising to do better. Yet this has generated little faith that Facebook can regulate itself, especially when systematic reviews of its actions, in books by Zuckerberg advisor Roger McNamee and by journalists Sheera Frenkel and Cecilia Kang, document that FB has been aware of the evil uses of its platform and has been unwilling to stop them.
Hence there have been calls for increased government regulation and even the dismantling of FB’s monopolistic hold on the social media market. Recent hires by the Biden administration suggest that antitrust actions will happen soon. Such steps will no doubt be hastened by the landmark investigative reporting of Jeff Horwitz, published in the Wall Street Journal in mid-September 2021. Among his allegations: FB has been well aware that almost six million VIPs are given special dispensation to violate its content standards; criminals use the platform to recruit women, incite violence against ethnic minorities, and support government action against political dissent; internal company research has documented that Instagram is toxic to many young girls in terms of poor self-image, mental health, and suicidal thoughts; and COVID misinformation has been rampant and corrosive. The company has responded by asserting that the series is full of ‘deliberate mischaracterizations’, but it has offered few details.
The Facebook Files report eliciting the most concern was the one linking the use of Instagram to damage to the self-image of young girls. Although there have been methodological concerns (the study shows only correlation, not causality), the results are consistent with those of other research studies. The company’s suppression of this work and its failure to act are the most damning evidence of its reckless monopoly power, as is the recent revelation that the company views young users as a growth engine. It has therefore been investing almost its entire global marketing budget, almost $400 million per year, in targeting teenagers.
Shortly after the Wall Street Journal articles, FB was dealt another severe blow: whistleblower Frances Haugen, a former product manager, leaked thousands of pages of internal documents. Much of what the documents, her testimony before the U.S. Congress, and her 60 Minutes interview revealed had already been leaked to the Journal, and it conveyed the message of a company putting profits before safety. New insights included the allegations that the firm relaxed its safeguards too soon after the U.S. election, thereby contributing to the January 6 riot, and that the firm is incapable of suppressing most election and vaccine misinformation. FB denied the allegations, then in October 2021 tried to calm employees and to reduce discussions on internal message boards.
It also embarked on a dramatic attempt to shift attention from its failure to provide safe and healthy online platforms and from the looming threat of antitrust action by changing its corporate name to Meta and its corporate focus to the metaverse. (I shall discuss the metaverse in a future post.) The change was also intended to better appeal to young users, who have been abandoning FB for platforms such as Snapchat and TikTok, owned by the Chinese multinational internet company ByteDance, and to reduce the firm’s dependency on the Apple and Google operating systems.
FOR THINKING AND DISCUSSION
Should Meta be broken up on antitrust grounds? What principles would you apply to guide a break-up? Or would you take another less dramatic approach to making social media safer?