A platform called Moral Machine [44] was created by the MIT Media Lab to allow the public to express their opinions on what decisions autonomous vehicles should make in scenarios based on the trolley problem paradigm. Analysis of the data collected through Moral Machine showed broad differences in relative preferences among different countries. [45]
The Heinz dilemma is frequently used as an example in ethics and morality classes. One well-known version of the dilemma, used in Lawrence Kohlberg's stages of moral development, is stated as follows: [1] A woman was on her deathbed. There was one drug that the doctors said would save her.
Today, moral psychology is a thriving area of research spanning many disciplines, [9] with major bodies of research on the biological, [10] [11] cognitive/computational, [12] [13] [14] and cultural [15] [16] bases of moral judgment and behavior, and a growing body of research on moral judgment in the context of artificial intelligence. [17] [18]
The Defining Issues Test is a component model of moral development devised by James Rest in 1974. [1] The University of Minnesota formally established the Center for the Study of Ethical Development [2] as a vehicle for research around this test in 1982. The Center relocated to larger premises within the University of Alabama and is now located ...
Such examples are quite common and can include cases from everyday life, stories, or thought experiments, like Sartre's student or Sophie's Choice discussed in the section on examples. [10] The strength of arguments based on examples rests on the intuition that these cases actually are examples of genuine ethical dilemmas.
Their Daedalus article became the first statement of moral foundations theory, [1] which Haidt, Graham, Joseph, and others have since elaborated and refined, for example by splitting the originally proposed ethic of hierarchy into the separate moral foundations of ingroup and authority, and by proposing a tentative sixth foundation of liberty.
Moral luck is the tendency for people to ascribe greater or lesser moral standing based on the outcome of an event. Puritanical bias is the tendency to attribute the cause of an undesirable outcome or wrongdoing by an individual to a moral deficiency or lack of self-control, rather than taking into account the impact of broader societal determinants. [133]
A related field is the ethics of artificial intelligence, which addresses such problems as the moral personhood of AIs, the possibility of moral obligations to AIs (for instance, the right of a possibly sentient computer system not to be turned off), and the question of how to make AIs that behave ethically towards humans and others.