K-12 Education Pods: Problems, Fears and Options

Contributed by Judith A. Langer, Distinguished Professor Emeritus of Education, a researcher specializing in language, literacy, and learning, and a co-author of The COVID-19 Solutions Guide.

August and early September of 2020 were extremely difficult times for everyone who had a stake in education: parents, teachers, school administrators, and local officials. In June and July, most people hoped school could resume in the ways it always had, and this, I think, may have held them back from creating a fully planned “new normal.” Many early scenarios included some online teaching in the event that schools might need to be shuttered for periods of time, but planners were hoping an overall easing of cases would permit in-class instruction. Most models contained scenarios for fully in-class, hybrid, and fully online instruction to cover the unknown range of needs, but many did not. Unexpected spikes in COVID-19 in heretofore low-case regions escalated uncertainty about what the future might hold. Sizable ranges in the intensity of new cases within states and communities pointed to the need for more locally determined options.

Read More »

Pandemic models must be transparent and their creators must explain them publicly


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

A forecasting model is a prediction of how the world will evolve: what will happen in the future with respect to some phenomenon, such as the motion of objects, the financial health of a business, or the spread of an epidemic.
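
As a purely illustrative sketch of what such a model can look like, the snippet below implements a minimal SIR (Susceptible-Infected-Recovered) compartmental model, one of the simplest forms an epidemic forecasting model can take. The population figures and parameter values are invented for illustration and are not drawn from any real forecast.

```python
# Minimal SIR compartmental model (illustrative only).
# All parameters and initial conditions below are hypothetical.

def sir_step(s, i, r, beta, gamma, dt=1.0):
    """Advance the SIR model by one time step of length dt (days)."""
    n = s + i + r                          # total population (assumed constant)
    new_infections = beta * s * i / n * dt # transmission term
    new_recoveries = gamma * i * dt        # recovery term
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Hypothetical scenario: 1,000,000 people, 100 initial cases,
# transmission rate beta = 0.3/day, recovery rate gamma = 0.1/day.
s, i, r = 999_900.0, 100.0, 0.0
for day in range(120):
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1)

print(f"After 120 days: susceptible={s:,.0f}, infected={i:,.0f}, recovered={r:,.0f}")
```

Even a toy model like this makes the point about transparency: the forecast is driven entirely by assumed parameters (here, beta and gamma), so anyone relying on the prediction needs to know what those assumptions are.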

Read More »

Digital collaboration technologies flourish during COVID-19


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

For most of human history, dyads and groups were only able to work and play together if they were collocated.  All of this changed in the 19th century, when the first remote collaboration and entertainment technologies — the telegraph, the telephone, and the radio — were developed and widely commercialized.  These were joined in the 20th century by television.  By the middle part of the century, medical images were being transmitted over phone lines; soon thereafter, 2-way television was being used for remote medical consultations.

Read More »

Intelligent tutors


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

In this column, in my textbook, and in a speech “What Society Must Require from AI” that I am currently giving around the world, I document some of the hype, exaggerated claims, and unrealistic predictions that workers in the field of artificial intelligence (AI) have been making for over 50 years.  Here are some examples.  Herb Simon, an AI pioneer at Carnegie Mellon University (CMU) who later won a Nobel Prize in Economics, predicted in 1958 that a computer program would be the world chess champion by 1967.  Marvin Minsky of MIT and Ray Kurzweil, both AI pioneers, made absurd predictions (in 1967 and 2005, respectively) that AI would achieve general human intelligence by 1980 and by 2045.  John Anderson, discussed below, made the absurd prediction in 1985 that it was already feasible to build computer systems “as effective as intelligent human tutors”.  IBM has recently made numerous false claims about the effectiveness of its Watson technology for domains as diverse as customer support, tax filing, and oncology.

Read More »

Ethics throughout a Computer Science curriculum


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

Every Computer Science student should get significant exposure to the social, political, legal, and ethical issues raised by the accelerating progress in the development and use of digital technologies.

The standard approach is to offer one undergraduate course, typically called Computers and Society or Computer Ethics.  I have done this during the current term at Columbia University, using my new textbook, Computers and Society: Modern Perspectives (OUP, 2019).  We meet twice a week for 75 minutes.  In class, I present key topics covered in the book, and welcome a number of guest speakers who present their own experiences and points of view.  Every class is interactive, as I try to get the students to express their own ideas.  There have been four assignments: a policy brief, a book report, a debate, and a research paper.  Such courses are typically not required by major research universities, which is a mistake, but they are often required by liberal arts colleges.

Read More »

Diverse design thinking in technology

Contributed by Muriam Fancy. Muriam is a master's student at the Munk School of Global Affairs and Public Policy. She recently completed her BA in Peace, Conflict, and Justice with a double minor in Indigenous Studies and Diaspora & Transnational Studies. She runs Diverse Innovations (@diverseinnovat1), a platform discussing social good technology.

Amazon launched an artificial intelligence (“AI”) system in an effort to revolutionize its recruitment strategy, and found that the program discriminated against women. A Chicago court implemented an AI system called COMPAS to perform a predictive risk analysis of the likelihood that offenders would re-offend, either by committing the same crime they were charged with or by committing a more serious offense. However, the system discriminated against Black defendants, rating them as more likely to commit a serious offense than white defendants; read more in Chapter 11 of Computers and Society: Modern Perspectives.
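
Disparities of the kind reported about COMPAS are often quantified by comparing error rates across groups, for example the false positive rate: the share of people flagged “high risk” who did not in fact re-offend. The sketch below shows one way such a check could be computed; the records and group labels are entirely invented, and a real audit would of course use actual case data.

```python
# Illustrative sketch: comparing false positive rates across groups.
# The data below is invented for demonstration purposes only.
from collections import defaultdict

records = [
    # (group, flagged_high_risk, actually_reoffended)
    ("A", True, False), ("A", True, True), ("A", False, False), ("A", True, False),
    ("B", False, False), ("B", True, True), ("B", False, False), ("B", False, True),
]

false_positives = defaultdict(int)  # flagged high risk but did not re-offend
negatives = defaultdict(int)        # did not re-offend

for group, flagged, reoffended in records:
    if not reoffended:
        negatives[group] += 1
        if flagged:
            false_positives[group] += 1

for group in sorted(negatives):
    rate = false_positives[group] / negatives[group]
    print(f"Group {group}: false positive rate = {rate:.0%}")
```

A large gap between groups in this rate is one of the disparities investigators pointed to in the COMPAS case.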

Read More »

The importance of research


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

Many issues discussed in Computers and Society: Modern Perspectives suggest a need for legal remedies, such as the case of monopoly power in digital technology industries.  Other issues raise ethical quandaries, such as the cases of employees of such firms who find the actions of their employers immoral.  In almost all cases, such as technology addiction, fake news, and unjust algorithms, wise legal actions and informed moral choices depend upon having good information about what, how, and why things are happening.  This requires research.  In an excerpt from his excellent recent book The New ABCs of Research: Achieving Breakthrough Collaborations, published by Oxford University Press, Emeritus Prof. Ben Shneiderman suggests that what is needed is applied research illuminating context and situations, coupled with basic research illuminating causes.

Read More »

Must computer science students learn about ethics?


Ron Baecker is Emeritus Professor of Computer Science at the University of Toronto, author of Computers and Society: Modern Perspectives, co-author of thecovidguide.com, organizer of computers-society.org, and author of the recently published Digital Dreams Have Become Nightmares: What We Must Do.

My textbook — Computers and Society: Modern Perspectives — may be used in a variety of courses and contexts, but is intended primarily for use by Computer Science (CS) Departments, as they attempt to educate and train tomorrow’s software professionals, managers, and IT leaders. If we want to monitor how well departments are doing this job, we should ask whether they are sensitizing their students to the ethical responsibilities of the profession. It is useful to contrast the attitudes and performance of CS Departments, typically situated in science faculties, with departments in Faculties of Engineering.

Concern over ethics in Engineering began after several major disasters late in the 19th century and early in the 20th century, notably several bridge failures and the Boston molasses disaster, in which a flood of molasses wreaked havoc on nearby buildings and train systems.  Professional societies such as the American Society of Civil Engineers and the American Institute of Electrical Engineers had already been created.  These societies moved quickly to introduce Codes of Ethics and requirements for licensing and accreditation, which ultimately caused university departments and faculties to include some learning about, and practice with, ethical concerns as part of their curricula.  A later development was the creation in 1954 by the National Society of Professional Engineers of a Board of Ethical Review.

Read More »