Philos 1332 Week 2 Notes
This 2 page Class Notes was uploaded by Tara Zahnke on Tuesday February 23, 2016. The Class Notes belongs to Philos 1332 at Ohio State University taught by Kate McFarland in Spring 2016. Since its upload, it has received 13 views. For similar materials see Engineering Ethics in PHIL-Philosophy at Ohio State University.
Lecture 4 - January 20, 2016

Stevenson on Disagreements
-Disagreement in attitude (typically over normative claims)
-Disagreement in belief (typically over factual claims)
-Types of Claims:
  a) Factual/Descriptive
  b) Normative
-Disputes are resolved when speakers agree in attitude.
-Attitudes/values determine which beliefs and facts are relevant.

George the Chemist Case:
-George should take the job with chemical weapons:
  -If he takes the job, he can use his morals to develop more ethical weapons.
  -If George does not take the job, the position is not eliminated. (Factual)
  -If he does not take it, he will be poor and hungry.
  -He has an obligation to provide for his family.
-George should not take the job with chemical weapons. *(Philosopher's stance)
  -He should stand by his beliefs. (Normative)
  -He opposes chemical weapons. (Factual) <- Start with this claim*
  -The job would violate the international agreement banning chemical weapons. (Factual)
  -He may not do the job to the best of his abilities. (Normative)
  -Not doing his job well is unfair to his employer and peers.

Lecture 5 - January 22, 2016

Moor's Levels of Ethical Agents
-Ethical Impact Agents: can be assessed/evaluated for ethical impact.
-Implicit Ethical Agents (Operational): designed with ethical principles in mind; the product reflects the ethical values of the designer.
-Explicit Ethical Agents (Functional): programmed to reason through ethical questions; manipulate moral concepts.
-Full Ethical Agents: freely choose to act on moral principles or against them, and can justify the decision.

Argument that robots can't be full ethical agents:
P1) If one is a full ethical agent, one must have free will.
P2) Robots can only do what they're programmed to do.
P3) (So) Robots don't have free will.
C) Robots can't be full ethical agents.

Asimov's Three Laws:
1) A robot must not harm humans or, through inaction, allow humans to be harmed.
2) A robot must obey human commands (unless that conflicts with the First Law).
3) A robot should preserve itself (unless that conflicts with the First or Second Law).
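As a purely illustrative sketch (not from lecture), an "explicit ethical agent" in Moor's sense can be pictured as a program that checks each candidate action against Asimov's three laws in priority order. The `Action` fields and the `permitted` function below are hypothetical names chosen for this example:

```python
# Hypothetical sketch of an "explicit ethical agent": a program that
# evaluates actions against Asimov's laws, highest priority first.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool        # would this action harm a human (or allow harm)?
    ordered_by_human: bool   # was this action commanded by a human?
    self_destructive: bool   # would this action damage the robot itself?

def permitted(action: Action) -> bool:
    """Return True if the action passes Asimov's laws in priority order."""
    # Law 1: never harm humans or allow them to come to harm.
    if action.harms_human:
        return False
    # Law 2: obey human commands (Law 1 has already been checked).
    if action.ordered_by_human:
        return True
    # Law 3: self-preservation, subordinate to Laws 1 and 2.
    return not action.self_destructive

# Law 1 overrides a human command:
print(permitted(Action(harms_human=True, ordered_by_human=True, self_destructive=False)))   # False
# Law 2 overrides self-preservation:
print(permitted(Action(harms_human=False, ordered_by_human=True, self_destructive=True)))   # True
```

Note that on Moor's scheme this would still fall short of a full ethical agent: the program applies fixed rules rather than freely choosing or justifying its choices, which is exactly the point of premises P1-P3 above.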