Morality & the first AI


Coyote_Seven

 

Posted

Quote:
Originally Posted by Rylas View Post
Quote:
Originally Posted by Lothic View Post
I think it's relatively easy to make that kind of clear-cut, black-n-white decision in 2010 because it's pretty easy to tell the difference between organic lifeforms and technological machines right now. But I suspect there will come a time when the lines between what is a "person" and what is a "machine" will start to blur. Those are going to be fun times indeed.
I'm going to go out on a limb and guess you've been reading some Kurzweil. Maybe I'm wrong. In either case, I think it's more likely we'd see the first human-machine hybrid (for lack of a better word) than a computer with sentient life. It would make the transition of introducing sentient machines a little easier.
Sorry, never heard of Kurzweil. The idea that the very definitions of what is "machine" and what is "human" will become increasingly hard to separate in the future doesn't really rest with one specific author.

It doesn't really matter which way is easier for the process to happen. You could very well be right that we'll become Star Trek Borg-like before we see something purely inorganic become sentient (like a HAL 9000). But once we make that step, the next steps towards the "full blur" between machine and organic will only be that much closer. It might take a few hundred years to happen completely, but once the technological singularity happens, who's to say how long afterwards anything like this will take.


Loth 50 Fire/Rad Controller [1392 Badges] [300 non-AE Souvenirs]
Ryver 50 Ele³ Blaster [1392 Badges]
Silandra 50 Peacebringer [1138 Badges] [No Redside Badges]
--{=====> Virtue ♀

 

Posted

Quote:
Originally Posted by Lothic View Post
Sorry never heard of Kurzweil.
You might like some of his stuff. He's big on the singularity concept.


@Rylas

Kill 'em all. Let XP sort 'em out.

 

Posted

Quote:
Originally Posted by Lothic View Post
I think it's relatively easy to make that kind of clear-cut, black-n-white decision in 2010 because it's pretty easy to tell the difference between organic lifeforms and technological machines right now. But I suspect there will come a time when the lines between what is a "person" and what is a "machine" will start to blur. Those are going to be fun times indeed.
Yeah, I hope to meet a "Motoko Kusanagi" someday. The question is just going to be whether it's really her, or just a copy of her.


 

Posted

Quote:
Originally Posted by MunkiLord View Post
I would have no problem pulling the plug on AI. I don't care if it can learn, think for itself, or feel emotions. It is nothing more than a machine and I would always treat it as such.

Machines are not people and therefore don't get the same considerations or rights as people.
People have been able to kill other people by rationalizing that those other people, by virtue of not being exactly like them, are therefore not human, and so it is OK to kill them.

I suppose what you just said is an even more literal version of that same callous disregard for life that is not the same as yours. You're basically saying that you're OK with murdering anything that isn't like you, and that scares the hell out of me. I'm curious whether you'd feel the same cold detachment toward the lives of alien lifeforms?


 

Posted

i think morality and ethics are simply conventions that grow out of complex living arrangements: they 'suggest' a set of actions that neither party would engage in, knowing the roles could just as easily be reversed.

for instance, the 10 commandments boil down to just 1: thou shalt not steal; whether it be a life, a wife, a widget, a truth, praise, a name, or power. these form a basic moral compass that allows for more complex communal relationships that extend beyond those of immediate family. (fyi, i am not a religious person per se, if anything i lean towards a more buddhist foundation that grows in a slightly different direction.)

if, in my daily doings, i happen to come across a mugging, what is the most likely state of affairs? a predetermined target has been identified, found to be in a position of weakness compared to the mugger, and directly targeted in such a way that the predator is likely to get away long before retribution arrives. violence, or the threat of it, is most commonly used as the motivator. so, let's change it up a bit. the mugger is getting away down an alley with the purse/wallet and is himself surprised by spiderman.

would he have made the same decision to mug the target had he been aware of the consequence?

i would suggest that approaching an artificial intelligence would be very similar, while at the same time realizing that communicating with an a.i. complete with self-awareness would be unlike anything we have done before. simply identifying an a.i. would be an interesting feat, considering our usual reliance on species taxonomy.

so, given all the rest of it, it would be reasonable to leave it be until such time as it threatened 'us' with something immoral. given the rather wide berth that word tends to get when spoken by politicians or 'save the children' types, we can probably expect that not only would the first a.i. not be identified properly, it has probably already been killed several times in utero.

hopefully it doesn't hold it against us.


Kittens give Morbo gas.

 

Posted

Smersh has said everything I would have, and better, but for the record I was using this definition:

Quote:
Originally Posted by online dictionary
Atrocity: An appalling or atrocious act, situation, or object, especially an act of unusual or illegal cruelty inflicted by an armed force on civilians or prisoners.
The Holocaust. Hiroshima. Somalia. Halabja.

I hadn't considered 9/11 one, but by the definition I was thinking of, it qualifies.

Most violent crimes are atrocious, certainly, but the context in which I used the word was intended to be as I just defined.

(And yes, Hiroshima ended the war and possibly saved tens of thousands of lives. It was still an act of atrocity.)

Though really... the other definition: "Appalling or atrocious condition, quality, or behavior; monstrousness" isn't one I'd ever apply to a woman defending herself from an attacker.


Weight training: Because you'll never hear someone lament "If only I were weaker, I could have saved them."

 

Posted

Quote:
Originally Posted by Coyote_Seven View Post
People have been able to kill other people by rationalizing that those other people, by virtue of not being exactly like them, are therefore not human, and so it is OK to kill them.

I suppose what you just said is an even more literal version of that same callous disregard for life that is not the same as yours. You're basically saying that you're OK with murdering anything that isn't like you, and that scares the hell out of me. I'm curious whether you'd feel the same cold detachment toward the lives of alien lifeforms?

Except those people who were killed for not being like others were still human. A robot is not human. It is a mass of wires and metal put together by people. Robots are tools to be used for the betterment of the human race. When they are no longer serviceable tools, they can go in the recycling bin with the hammers, beer cans, and old video cards. They are machines, nothing more.

Also, I didn't say I'm OK with murdering anything not like me. You actually made that up. But nice try on intentionally misrepresenting what I posted.


 

Posted

Quote:
Originally Posted by MunkiLord View Post
Except those people who were killed for not being like others were still human. A robot is not human. It is a mass of wires and metal put together by people. Robots are tools to be used for the betterment of the human race. When they are no longer serviceable tools, they can go in the recycling bin with the hammers, beer cans, and old video cards. They are machines, nothing more.
To be fair, I don't really disagree with your point of view.
A human life is always going to be more important than a random mass of wires or metal.

But I'll still hold firm to the possibility that there may come a time when we can no longer make the simple assumption that non-organic automatically means non-sentient. As you pointed out, you or I might not be around to worry about that, but I think it's a viable possibility for the future. The difference between a "machine that can think" and a "thinking machine" might become significant enough that our notions of morality will have to change to adapt to that new reality.


Loth 50 Fire/Rad Controller [1392 Badges] [300 non-AE Souvenirs]
Ryver 50 Ele³ Blaster [1392 Badges]
Silandra 50 Peacebringer [1138 Badges] [No Redside Badges]
--{=====> Virtue ♀