Devoid of Humanity? Ask this "Ethics" App

The Markkula Center for Applied Ethics at Santa Clara University in Silicon Valley held a contest asking students to develop an ethical decision-making app that incorporated values presented in the "Framework for Ethical Decision Making":
"Making good ethical decisions requires a trained sensitivity to ethical issues and a practiced method for exploring the ethical aspects of a decision … The more novel and difficult the ethical choice we face, the more we need to rely on discussion and dialogue with others about the dilemma. Only by careful exploration of the problem, aided by the insights and different perspectives of others, can we make good ethical choices in such situations. We have found [this] framework for ethical decision making a useful method for exploring ethical dilemmas and identifying ethical courses of action."
See any problems with this brand of moral decision making? What discussion and dialogue or insights of different perspectives can you get from your phone?

Why did the Markkula Center for Applied Ethics want to develop such an app? According to the Chronicle of Higher Ed, "The Santa Clara ethicists hope that people who make decisions that will change lives—business leaders, hospital administrators, and school officials, for instance—will use the app as a guide."

Really? We are going to offer our public leaders an app in order to make moral and ethical decisions?

As you can imagine, there are some skeptics, and they are hashing it out on social media. The moral decision-making app “references terms the noneducated in ethics won’t understand & is hilariously oversimplistic for those who are,” wrote one skeptic on Twitter. This just keeps getting better and better. Discussion and dialogue about moral ethics in 140 characters or fewer.
But simplicity is part of the idea, says Miriam Schulman, assistant director of the applied-ethics center. 
“We tend to work with people where the rubber meets the road,” she says. The point is not to get a client up to speed on thousands of years of moral philosophy, says Ms. Schulman. Instead, it’s to get him or her to deliberate in a slightly more organized way.
Are we really that devoid of morals? Do we really need to study "thousands of years of moral philosophy" in order to make an ethical decision? Don't these people have parents or kids or even pets?

Your capacity to make deliberate choices is what makes you human.  Are you going to abandon your humanity to an app?  If so, why should you be allowed to vote? Or raise children? Or decide who gets medical treatment? Or who should attend college? What does the college-educated "app" between your ears think about that?

Comments

  1. In this day and age, we rely on technology far too much. We are meant to be taught our morals and ethics through experience and environmental influence, not some device lacking an actual human heart or real experiences. An app hasn't seen a baby born, or watched a stranger get mugged on the street. It has no experience in what is right or wrong. It is a chunk of heartless facts and knowledge, not human emotion. An app is incapable of making a bond. I believe that what is right comes from what you know as true versus what fits the public's idea of right. Is morality really so hard to grasp that we leave it to technology to do for us?

  2. First off, I would like to say that machines have been taking over our burdens and responsibilities for some time now. Most manual labor can be done more easily and efficiently by machines, and we no longer need to know math because that’s what calculators are for, so how many more pains can machines relieve us of? How about making moral decisions? Those pesky moral decisions are such a nuisance. I don’t run around town looking for restaurants; I have an app for that. When I need to drive somewhere I don’t take out a map, a compass, and a protractor; I have an app for that too. So why not have an app that steers me in the right moral direction? As for school officials and public administrators using this app, it’s actually a great idea, as long as it says somewhere in their contract that they have to apply that moral decision. I’m sure that when school officials and public administrators make immoral decisions, they do so knowingly. No one embezzles state money or has inappropriate relations with a student thinking “it was the moral and ethical thing to do”; they may justify their actions, but they still know it’s wrong. So if a computer made these decisions for them, it would say, “Hey! No! Bad, don’t do that.” Also, a good chunk of the population could possibly benefit from such an app; not everyone was raised the same, so how do you obtain morals when you have no idea what they are? The same way you find out what xenotransplantation is: you use a computer. (Yes, I know you could also use a dictionary, but those are outdated; pretty soon kids will be googling the word dictionary.)

  3. Philosophers have defined ethics in many different ways over time. Utilitarianism resolves issues based on the outcome that benefits everyone; Kant, on the other hand, resolved issues by focusing on the intentions behind human actions, no matter what the outcome. Based on this article, the app guides ethical decisions when we are faced with ethical dilemmas. The creators of the app simply state that they want to guide people, or organize their thoughts, so they can make better choices. However, there are different ways of viewing issues in our world. For example, does the app recommend not lying to a murderer who is trying to find your friend? Not reporting your longtime friend for sexual harassment in the workplace? In both cases a human might lie to protect the friend (utilitarianism), but isn’t lying morally wrong because it is irrational (Kant)? Ethical issues are very situation-specific in our world, and I doubt that an application can account for all situations. Also, does the application lean toward universal or situation-specific ethical issues? Either way, the application undermines the power all humans possess to act in certain ways: to cheat or not, to lie or tell the truth. This so-called decision-making application could be used for entertainment purposes, but intending it for people making ethical choices in their professions is ridiculous.

  4. The most difficult part of our modern society is defining a person's true character. Even if they were to create a decision-making app based on ethics and morals, who is this person to define what is moral and ethical? There are 7 billion people on this Earth; are we all going to believe and follow one person's philosophy on ethics and morals? So am I going to abandon my humanity to an app? No, I absolutely will not. We are all leading different lives because of the choices we make to fit the limited resources we have. We follow our own philosophy and adapt to new ones as life goes on, to protect our family and loved ones.
    What makes the human species so special is our flexibility, our ability to adapt for our survival. The moment that humanity accepts this app is the moment we lose what makes us human.

  5. Automating biased decisions to best support all hierarchical classes is dangerous and impossibly daunting. How can an iPhone app be programmed to factor in all the variables surrounding a decision, and even if it could, how could people trust the programmer's biases? Production and large-scale use of this app could produce a new generation of humans devoid of critical thinking skills and decisiveness. Promoting this app to people in high social and economic positions targets spots of power. If even one big business boss adhered to this app, millions of lives would be affected by an emotionless, brainless judge lacking jury and trial.


