Computer Ethics

Norbert Wiener and James Moor

Christopher L. Holland

Saint Louis University

August 27, 2024

Norbert Wiener (1894–1964)

  • Born in Columbia, Missouri
  • BA in mathematics from Tufts at 14
  • Graduate work in philosophy at Cornell at 17
  • PhD from Harvard at 18

After WWII

Wiener wrote Cybernetics (1948), The Human Use of Human Beings (1950), and God and Golem, Inc. (1964).

Topics included:

  • computers and security
  • computers and unemployment
  • responsibilities of computer professionals
  • computers for persons with disabilities
  • information networks and globalization
  • virtual communities
  • teleworking
  • merging of human bodies with machines
  • robot ethics
  • artificial intelligence
  • computers and religion

Wiener’s Four Principles of Justice

  • The Principle of Freedom
  • The Principle of Equality
  • The Principle of Benevolence
  • The Principle of Minimum Infringement of Freedom
The Principle of Freedom
Justice requires “the liberty of each human being to develop in his freedom the full measure of the human possibilities embodied in him.”
 
The Principle of Equality
Justice requires “the equality by which what is just for A and B remains just when the positions of A and B are interchanged.”
 
The Principle of Benevolence
Justice requires “a good will between man and man that knows no limits short of those of humanity itself.”
 
The Principle of Minimum Infringement of Freedom
“What compulsion the very existence of the community and the state may demand must be exercised in such a way as to produce no unnecessary infringement of freedom.”

Wiener’s Methodology

  1. Identify an ethical question or case regarding the integration of information technology into society. Typically this focuses upon technology-generated possibilities that could affect (or are already affecting) life, health, security, happiness, freedom, knowledge, opportunities, or other key human values.
  2. Clarify any ambiguous or vague ideas or principles that may apply to the case or the issue in question.
  3. If possible, apply already existing, ethically acceptable principles, laws, rules, and practices (the “received policy cluster”) that govern human behavior in the given society.
  4. If ethically acceptable precedents, traditions, and policies are insufficient to settle the question or deal with the case, use the purpose of a human life plus the great principles of justice to find a solution that fits as well as possible into the ethical traditions of the given society.

James H. Moor

  • Professor of Intellectual and Moral Philosophy at Dartmouth College
  • Highly influential article: “What Is Computer Ethics?” (1985)
  • Developed an ethical theory known as Just Consequentialism

Moor on Logical Malleability

Computers are logically malleable in that they can be shaped and molded to do any activity that can be characterized in terms of inputs, outputs, and connecting logical operations. … The computer is the nearest thing we have to a universal tool. Indeed, the limits of computers are largely the limits of our own creativity.

Moor’s Methodology

  1. Identify a policy vacuum generated by computing technology.
  2. Eliminate any conceptual muddles.
  3. Use the core values and the ethical resources of just consequentialism to revise existing but inadequate policies, or else to create new policies that justly eliminate the vacuum and resolve the original ethical issue.

Important Terms

The terms and definitions here come from Rehg (2017, ch. 1). We will use these terms throughout the semester.

 

Moral Triggers
Morally problematic uses and effects of ICTs in social-institutional domains and practices, those places where new cyberpractices call for moral inquiry. An ICT should trigger moral reflection not only if it creates injustice or doubts about right conduct, but also if it threatens virtuous character or flourishing.
Conceptual Muddles
New cybertech creates a conceptual muddle when it changes a social practice or activity in such a way that some concept or set of concepts connected with the practice or activity becomes unclear or contentious for members of the practice.
 
Policy Vacuums
A policy vacuum exists when two conditions come together: (1) some regular activity or area of social practice is of moral concern, and (2) standards of appropriate behavior and technological design for that activity or practice are nonexistent, outdated, or poorly conceived.
Moral Opacity
Conceptual challenges in cyberethics are exacerbated by the “moral opacity” of many cyberpractices. A cybertechnology and its associated practices are morally opaque insofar as the people involved are unaware of morally problematic features, either because they lack knowledge of the technology itself or because they fail to notice the values embedded in the ICT design or use.

Sources

Bynum, Terrell. 2001. “Computer Ethics: Basic Concepts and Historical Overview.” In The Stanford Encyclopedia of Philosophy, edited by Edward N. Zalta, Winter 2001. Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/win2001/entries/ethics-computer/.
———. 2020. “Computer and Information Ethics.” In The Stanford Encyclopedia of Philosophy, edited by Edward N. Zalta, Summer 2020. Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/sum2020/entries/ethics-computer/.
Moor, James H. 1985. “What Is Computer Ethics?” Metaphilosophy 16 (4): 266–75. https://doi.org/10.1111/j.1467-9973.1985.tb00173.x.
Rehg, William. 2017. Cogent Cyberethics. Unpublished manuscript.