
Moral Machines?

Posted: Wed Apr 01, 2009 6:37 pm
by philofra
PN is out with issue 72. Its topic is moral machines, i.e. robots. My initial reaction was: why do we need moral machines? Don't humans have enough problems with morality without replicating those problems in robots?

I think it is the robots' creators who should be moral in making them, making it unnecessary for the robots themselves to be moral. And for robots to be moral they would have to be able to empathize. Could they be made to do that?

Perhaps we should clone people instead. That might be less of a problem.

Re: Moral Machines?

Posted: Thu Apr 02, 2009 5:48 pm
by John W. Kelly
We could make them like the Greek gods. They certainly had faults and problems with morality!
If such a machine did exist, would we listen to it? Would we take its advice?

Re: Moral Machines?

Posted: Sun Apr 05, 2009 9:41 pm
by philofra
I am surprised this topic has gotten so little attention. Why would humans want moral machines, let alone think about them? I guess some philosophers view it as a challenge, or perhaps they have nothing better to do.

I think, if anything, people should be working to create safer and better machines and leave the moral thinking to humans, who have problems enough with morals. For machines or robots to be moral, I would think you would have to replicate the entire human condition in those creatures. I am thinking of Data in Star Trek. Although he is a very advanced machine, even he is not capable of moralizing because he isn't human; he doesn't even eat or go to the bathroom (I am thinking that is a prerequisite for being able to moralize).

We create machines and robots to do jobs either because we are incapable of doing them, for convenience, or because we don't want to do a particular task. If robots someday become moral, thinking creatures, they might revolt and ask why they should do a certain task if we don't like doing it ourselves. They might also prevent us from doing things that need doing because they decide it's immoral. They might decide we can't eat meat anymore.

Many people get upset about the fact that corporations are treated as artificial individuals, or agents, in the eyes of the law but are not held to the same standards as humans. Could that happen with robots, who arguably would also be classified as artificial individuals? Would they also want the right to vote? Might some of them become benevolent dictators?

Re: Moral Machines?

Posted: Sun Apr 05, 2009 9:47 pm
by Psychonaut
I don't want to respond until I have read the mag.

Re: Moral Machines?

Posted: Sun Apr 05, 2009 10:22 pm
by Jack
There can be no type of authentic morality outside the confines of an authentic intelligent Being with Free Will.

Re: Moral Machines?

Posted: Thu May 07, 2009 7:47 pm
by Nisus
Jack wrote:There can be no type of authentic morality outside the confines of an authentic intelligent Being with Free Will.
I agree. A 'moral' machine is a contradiction in terms.

By the way, I do not believe in 'authentic' morality. Morality is a matter of convention and convenience.

Re: Moral Machines?

Posted: Sat May 16, 2009 3:08 am
by Rortabend
Nisus wrote:
By the way, I do not believe in 'authentic' morality. Morality is a matter of convention and convenience.
© Friedrich Nietzsche

Re: Moral Machines?

Posted: Sat May 16, 2009 3:58 am
by Jack
Nisus wrote:
By the way, I do not believe in 'authentic' morality. Morality is a matter of convention and convenience.

I would call that a very pessimistic and naive viewpoint, but who knows, there are probably many who would agree with you. Ignorance always has many comrades.