Emerging—and existing—technologies are bringing us closer to the brink. And even if they turn out to be more benign, envisioning some technological advance as our salvation will waste precious time as the ecosystems upon which we rely move closer to collapse and the violent forces of authoritarianism gain power.
All technology, from hammers and Hummers to routers and killer robots, is intended to increase power: to do something cheaper, easier, faster, with more entertainment value, with stronger impact, at greater distances, in more places, or with greater stealth. Technological power, like economic, political, cultural, institutional, or physical power, is distributed unevenly. It tends to be accumulated by people and organizations who already have too much. Algorithmic power has accelerated those differences; the computer has helped create today’s staggering economic divide. Many of the world’s richest people gained their fortunes through such algorithms, and it is their ideologies as well as the computer systems themselves that are taking us in dangerous directions.
In a blog post published two days ago, I highlighted phrases and sentences from Mark Zuckerberg’s recent keynote speech sketching his vision of Meta’s intended metaverse. Here are some thoughts triggered by his words:
1. “you’re going to be able to do almost anything you can imagine … This isn’t about spending more time on screens … [include] communities whose perspectives have often been overlooked … consider everyone …”
No, Mark, be honest. This is about getting more people into Meta, and about getting them to spend more time in the metaverse, because that’s the only way you can sustain the growth your shareholders expect, and the only way you can withstand the onslaught of firms like TikTok that now have greater appeal to the next generation of users.
Brett Frischmann is the Charles Widger Endowed University Professor in Law, Business and Economics at Villanova University. The book most relevant to his thoughts below is Re-Engineering Humanity (Cambridge University Press, 2018).
Recently, I’ve received multiple invitations to leave Facebook and Twitter and join a new social network that promises to not destroy democracy. I’m tempted. I’m also tempted to delete my accounts and abandon social media altogether. The decision got me thinking, not about democracy but instead about how social media affect my behavior and relationships.
Social media promise and deliver social networks with better or at least bigger scale and scope. Essentially, this means you can connect to many more people from many different places to relate on a wider variety of interests. To socialize is a core human need. The difficult question is whether social media improve our capability to relate to each other.
Technologists are creating increasingly sophisticated digital technologies capable of monitoring us.
The most mature technology is that of RFID tags. Now as small as grains of rice, RFID tags typically track the location and movement of items through an assembly line, warehouse, store, or library. The tags can also be attached to personal possessions such as clothing, passports, or cash. RFID tags can be, and are, implanted in animals in order to track them in the wild. This is not currently done to humans, although people may be carrying items containing RFID tags and be tracked without realizing it.
Other location tracking uses the Global Positioning System (GPS), a constellation of satellites that allows mobile devices to determine where on Earth they are located, and hence enables monitoring the whereabouts of the person carrying the phone. A chilling example of this occurred during a political protest in Ukraine in January 2014, when individuals in the barricaded city centre of Kiev received text messages saying ‘Dear subscriber, you are registered as a participant in a mass disturbance’.
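The Ukraine incident amounts to geofencing: testing whether a device’s reported coordinates fall inside a defined zone. A minimal sketch of that test follows; the coordinates, fence radius, and function names are my own illustrative assumptions, not details from any real carrier system.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(lat, lon, centre, radius_km):
    """True if the point lies within radius_km of the fence centre."""
    return haversine_km(lat, lon, centre[0], centre[1]) <= radius_km

# Hypothetical fence around Kiev's city centre (approximate coordinates).
KIEV_CENTRE = (50.4501, 30.5234)
print(in_geofence(50.4515, 30.5200, KIEV_CENTRE, 2.0))  # a phone near the barricades
print(in_geofence(48.3794, 31.1656, KIEV_CENTRE, 2.0))  # a phone hundreds of km away
```

Any party with access to a stream of device locations can run such a check continuously, which is what makes bulk location data so sensitive.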
Contributed by Judith A. Langer, a Distinguished Professor Emeritus of Education, a researcher specializing in language, literacy, and learning, and a co-author of The COVID-19 Solutions Guide.
August and early September of 2020 were extremely difficult times for everyone with a stake in education: parents, teachers, school administrators, and local officials. In June and July, most people hoped school could resume in the ways it always had, and this, I think, may have held them back from creating a fully planned “new normal.” Many early scenarios included some online teaching in case schools needed to be shuttered again for periods of time, but planners hoped an overall easing of cases would permit in-class instruction. Some models contained scenarios for fully in-class, hybrid, and fully online instruction to cover the unknown range of needs, but many did not. Unexpected spikes in COVID-19 in heretofore low-case regions escalated uncertainty about what the future might hold. Sizable ranges in the intensity of new cases within states and communities pointed to the need for more locally determined options.
Contributed by Ronald Baecker, who is an Emeritus Professor of Computer Science at the University of Toronto, co-author of The COVID-19 Solutions Guide and author of Computers and Society: Modern Perspectives (OUP, 2019).
Readers of my blog will recall what I describe as digital dreams and digital nightmares.
Our world has been enriched by digital technologies used for collaboration, learning, health, politics, and commerce. Digital pioneers imagined giving humanity greater control over the universe; augmenting knowledge and creativity; replacing difficult and dangerous physical labour with robot efforts; improving our life span with computationally supported medicine; supporting free speech with enhanced internet reason and dialogue; and developing innovative, convenient, and ideally safe products and services. Online apps and resources are proving very valuable, even essential, in the era of COVID-19.
Contributed by Masashi Crete-Nishihata. Masashi is the Associate Director of The Citizen Lab at the University of Toronto.
The Citizen Lab just published a report: Censored Contagion: How Information on the Coronavirus is Managed on Chinese Social Media, authored by Lotus Ruan, Jeffrey Knockel and Masashi Crete-Nishihata.
Among the key findings in this report, we show that YY, a popular live-stream platform based in China, began to censor keywords related to the coronavirus outbreak on December 31, 2019, only one day after doctors (including the late Dr. Li Wenliang) tried to warn the public about the then unknown virus.
Nosedive was the first episode of the third season of the British science fiction television anthology Black Mirror. In this episode, everyone has a mobile phone which, when pointed at another person, reveals his or her name and rating. Everyone has a rating, which ranges from 0 to 5. The following happens continually as you walk down a street or along the corridor of a building. You give a ‘thumbs up’ or ‘thumbs down’ to each person you pass, based on your instantaneous impression of that person and the nature of the encounter, no matter how trivial or quick the encounter is. A ‘thumbs up’ raises that person’s rating a tiny bit; a ‘thumbs down’ lowers it. The other person concurrently rates you. Ratings determine one’s status in life, and the ability to get perks such as housing and travel. Therefore, people are on a never-ending, stressful, and soul-destroying quest to raise their online ratings for real-life rewards. The heroine, Lacie, desires a better apartment; she has a meltdown as she deals with insurmountable pressure in the context of her childhood best friend’s wedding.
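The episode never specifies its arithmetic, but its rating mechanic can be sketched as a running score nudged toward 5 by each thumbs up and toward 0 by each thumbs down, with perks gated on a threshold. Everything below is a toy model under my own assumptions: the 0.02 encounter weight and the 4.5 perk threshold are invented, not from the show.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """A person's running social rating on the episode's 0-5 scale."""
    rating: float = 2.5   # assume new profiles start mid-scale
    weight: float = 0.02  # how much a single encounter moves the needle

    def rate(self, thumbs_up: bool) -> float:
        """Nudge the rating toward 5 (thumbs up) or 0 (thumbs down)."""
        target = 5.0 if thumbs_up else 0.0
        self.rating += self.weight * (target - self.rating)
        return self.rating

    def qualifies(self, threshold: float) -> bool:
        """Perks such as premium housing gate on a minimum rating."""
        return self.rating >= threshold

# A run of bad encounters erodes a high rating, as happens to Lacie.
lacie = Profile(rating=4.2)
for _ in range(10):
    lacie.rate(thumbs_up=False)
print(round(lacie.rating, 2), lacie.qualifies(4.5))
```

One consequence the model makes visible: because each encounter moves the rating proportionally, a handful of public thumbs-downs can undo weeks of careful niceness, which is exactly the dynamic the episode dramatizes.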
There is still time to buy a substantive book for the thoughtful techie or concerned citizen in your life. Allow me to recommend two choices that were published in 2019. One good option is my wide-ranging textbook Computers and Society: Modern Perspectives, enough said …. But an unbiased choice is Shoshana Zuboff’s monumental The Age of Surveillance Capitalism. The author signals her intentions with the book’s subtitle: The Fight for a Human Future at the New Frontier of Power.
Zuboff, the Charles Edward Wilson Professor Emerita, Harvard Business School, defines and describes surveillance capitalism (p. 8):