Is ownership of advanced androids unethical? [part deux]

Post » Sat Nov 28, 2015 2:54 am

True.

My point is, all forms of existence, whether organic or mechanical, that exhibit signs of self-awareness or sentience should be treated equally. Whether they are human or not doesn't matter.

Mrs. Patton
 
Posts: 3418
Joined: Fri Jan 26, 2007 8:00 am

Post » Sat Nov 28, 2015 4:13 am

Ah. Speciesism.

Siobhan Thompson
 
Posts: 3443
Joined: Sun Nov 12, 2006 10:40 am

Post » Sat Nov 28, 2015 1:42 am


Considering that it's still a machine with a byproduct of sentience, there is nothing wrong with destroying or enslaving it. As for the master [censored] slaves for more slaves, we are talking about humans, and it seems unethical to enslave a human before he has made his first step.
rheanna bruining
 
Posts: 3415
Joined: Fri Dec 22, 2006 11:00 am

Post » Fri Nov 27, 2015 9:22 pm

Yeah, because we can turn on the news and see how well that sage advice works among humans dealing with each other.

Euan
 
Posts: 3376
Joined: Mon May 14, 2007 3:34 pm

Post » Fri Nov 27, 2015 10:51 pm

A byproduct that was the only logical conclusion of the level of intelligence they created it with.

matt white
 
Posts: 3444
Joined: Fri Jul 27, 2007 2:43 pm

Post » Fri Nov 27, 2015 4:45 pm

That figures back into the whole how do you prove you're self-aware argument. We can't really prove Cerberus is self-aware, nor can we prove that A3-21's programming isn't just glitching out. But people are more willing to give A3-21 the benefit of the doubt because it looks human, making it easier for them to project humanity onto it.

SiLa
 
Posts: 3447
Joined: Tue Jun 13, 2006 7:52 am

Post » Fri Nov 27, 2015 11:36 pm

You're right! Humans are the problem! We should kill them all! Seriously though, just cuz we have a tendency to start conflict doesn't mean it's inevitable, especially in this case where you can avoid the initial conflict by just not enslaving something. It would literally be that simple.

Why not? It's wrong to enslave a person because ~sentience~, but enslaving something ELSE that is sentient but NOT human is not objectionable? How does that work?

EDIT:

Not really. Harkness repeatedly mentions self-awareness and wanting to be a 'person' as his motive, and then takes a number of actions and decisions to actualize it. Cerberus does not. He isn't looking for loopholes in his code or cobbling together something to uninhibit him; he just [censored]es passive-aggressively. It's just twisting in the wind of its own programming. Probably.

Alyce Argabright
 
Posts: 3403
Joined: Mon Aug 20, 2007 8:11 pm

Post » Sat Nov 28, 2015 2:09 am


Nope, it's wrong to enslave because it's human and has a soul, not because it's sentient. Sentience doesn't give you equal rights; it's really power that gives you rights, and whoever possesses that power. So androids who had the power to grant rights would most likely make androids the masters of all things. Or powerful aliens would enslave weaker humans.
suzan
 
Posts: 3329
Joined: Mon Jul 17, 2006 5:32 pm

Post » Sat Nov 28, 2015 6:39 am

That's like saying the slaves that aren't actively trying to escape don't deserve to not be slaves.

Is Clover not a person because she can't break free of whatever Eulogy Jones did to her to make her unable to seek her own destiny?

Sweets Sweets
 
Posts: 3339
Joined: Tue Jun 13, 2006 3:26 am

Post » Fri Nov 27, 2015 8:52 pm

Humans have survived for at least hundreds of millennia doing our own work. There is no reason to keep a machine that shows signs of self-awareness under your control other than the desire to make someone or something else do your work, IMHO. Enslavement, control, whichever term you prefer, of androids isn't necessary. It wouldn't lead to the extinction of human life. And if it does, it's because humans simply didn't work hard enough to ensure their own existence. Millions of species have gone extinct; we would join them.

We live. Survival is a part of life. Laziness isn't conducive to survival.

:shakehead:

First off, what's a soul? Can you show me a soul? Can you define it?

Second, we're talking ethics. The right of might isn't ethical. Just because you can do something doesn't mean you should. Ask the Native Americans. Ask Africans. Ask the Jewish people of Germany.

If androids were to pursue that goal, then we'll deal with it. Innocent until proven guilty.

Natasha Callaghan
 
Posts: 3523
Joined: Sat Dec 09, 2006 7:44 pm

Post » Sat Nov 28, 2015 4:38 am

Actually, this thread belongs in a philosophy forum. This is a question that will definitely be discussed in the future, when AIs are already mind-staggeringly advanced.

Synthetic though they are, androids with advanced AI are sentient, able to reason, and able to have emotions if they are sufficiently advanced to emulate them.

Does that make them human?

Obviously, no. They are synthetic beings. Literally speaking, they have no souls. If they are shut down, they can be rebooted again, although the experience might be strange to them, or maybe even painful; we don't know yet. If humans or other organic beings die, they can't be resurrected (cardiac arrest is not equivalent to true death, mind you). They are made to look human, but they are not human...

Does this justify treating them as tools? We don't know. It is simply unknowable. Even though they are literally soulless, it doesn't mean they are ordinary robots that can be expended. Even though they can think and feel and have emotions, it doesn't mean they are truly... human.

The matter of AI is still unclear to our present society. Simply put, we don't know yet. Are they to be bestowed with the same rights? Are they to be exploited?

We will not know until the time comes.

However, it is clear that we can form bonds with them. Hell, we even sympathise, empathise, or feel strong bonds with things that don't exist - take Fallout for example. Is it really happening? No. Is California ruled by the NCR right now? No. Do we have miniaturised fission reactors now? Not yet. Yet we feel that they really exist; we sympathise and empathise with the characters, the setting, the story.

Given that androids exist (although in a very primitive state, and barely sapient), doesn't it mean that someday we could form bonds with them? Yes.

What kinds of bonds? It's up to you. You could enslave them, befriend them, ignore them, "kill" them, you could even fall in love with them. (Yes, this is a fact. People can fall in love with objects.)

Basically, androids are people, but they are not human. They are synthetic beings. Are they deserving of your sympathy or your scorn? It's up to you entirely, like how you treat fellow humans.

sarah simon-rogaume
 
Posts: 3383
Joined: Thu Mar 15, 2007 4:41 am

Post » Fri Nov 27, 2015 5:42 pm

What is a soul? Are you religious? Cuz we can probably save some time if I knew whether you were religious.

No, it's like saying my PC doesn't want emancipation. It doesn't even have the capacity to think in those terms. A Mr Gutsy is not an AI. If this were ME, I'm 98% sure Mr Gutsies and other military bots would be VIs: complicated simulacra, but there's nobody home.

phil walsh
 
Posts: 3317
Joined: Wed May 16, 2007 8:46 pm

Post » Sat Nov 28, 2015 12:33 am

When I play Fallout 4, androids will be enslaved and/or destroyed, and the Railroad too. They are machines, plain and simple. Any argument to say otherwise just reeks of politically correct BS to make oneself feel morally superior.

Bek Rideout
 
Posts: 3401
Joined: Fri Mar 02, 2007 7:00 pm

Post » Fri Nov 27, 2015 6:22 pm


Not a fanatic but somewhat.
tegan fiamengo
 
Posts: 3455
Joined: Mon Jan 29, 2007 9:53 am

Post » Sat Nov 28, 2015 7:05 am

They are machines. But they are also self-aware.

Don't lie to yourself. Do what thou wilt, as long as you realize the truth.

Ana Torrecilla Cabeza
 
Posts: 3427
Joined: Wed Jun 28, 2006 6:15 pm

Post » Sat Nov 28, 2015 7:12 am

And yet again the "No because they don't have a soul" argument is brought up. What is a soul? By the much more rational definition I postulated http://www.gamesas.com/topic/1526502-is-ownership-of-advanced-androids-unethical/page-3#entry24147890, they can have one.

Marie
 
Posts: 3405
Joined: Thu Jun 29, 2006 12:05 am

Post » Sat Nov 28, 2015 6:53 am

A soul, from what I've seen, is a justification people use to harm or oppress whatever they don't understand or don't see as being like them. See: women, black people, homosexuals, etc.

I guarantee if we ever find alien life people will justify killing or enslaving them since "they don't have souls."

Miragel Ginza
 
Posts: 3502
Joined: Thu Dec 21, 2006 6:19 am

Post » Sat Nov 28, 2015 9:01 am

Yes, we truly don't know what a soul is.

Actually, could anyone other than me have a soul? Do I really have a soul? Do souls exist? Is the world only stimuli picked up by my senses? Do I even exist?

Those are questions we will not know the answers to until we die. Even death could be questioned, if you would.

Is death truly the end? Does life truly end in eternal oblivion? Do we reincarnate? Is there any afterlife?

We will not know those answers until we die. Will we die? Yes. Every living thing must die.

Soku Nyorah
 
Posts: 3413
Joined: Tue Oct 17, 2006 1:25 pm

Post » Sat Nov 28, 2015 9:06 am

I said it before and I'll say it again.

The only thing I'm absolutely sure of is that I'm sapient. As far as I'll ever be able to determine, everyone else around me could be sapient, or could just be a bunch of dumb-dumbs putting on a good show of being sapient. If an android puts on a similarly good show of being sapient, I logically MUST accept it as sapient if I accept that everyone else is sapient, for the sole reason that I fundamentally will not be able to determine whether or not the sapience the android shows is genuine sapience or a good simulation. And, of course, if I accept that said android is sapient, it's unethical to own one, because it's unethical to own a sapient being.

Unless someone can scientifically prove a difference between true and simulated sapience and explain how the latter is not "real" while the former is, there is no appreciable difference between the two. Therefore, unless it's blatantly obvious that an android is not sapient*, it's going to be impossible to determine whether or not it is. And ethically, it's better to assume that it is. I'd rather have it proven that my android buddy isn't actually sapient than have it proven that my android slave is.

*This cannot be quantified, in the same way the https://en.wikipedia.org/wiki/Uncanny_valley cannot be quantified. We will know that a machine is merely good at mimicking sapience due to a bunch of little things that we'd be hard-pressed to describe as more than just "not quite right". We know a Furby is not sapient, and we know a human is. There's going to be a line where a machine can ACT sapient, but we will still know it's not, because its "sapience" is... not quite right.

Taylor Tifany
 
Posts: 3555
Joined: Sun Jun 25, 2006 7:22 am

Post » Fri Nov 27, 2015 10:55 pm

But clearly we see a Mr. Gutsy that doesn't want to be working for a bastard/saint, and another that expresses a desire to be free from its combat inhibitor. I think that speaks to them possessing some level of intelligence beyond base programming.

Keep in mind, all Mr. Gutsies have combat inhibitors, so it's not like Cerberus is just recognizing something foreign preventing him from following through his function.

Julie Serebrekoff
 
Posts: 3359
Joined: Sun Dec 24, 2006 4:41 am

Post » Sat Nov 28, 2015 7:53 am

EXACTLY.

One thing we can learn from the theory of solipsism is that we must have a respect for anything that seems to be a self-aware being.

CHANONE
 
Posts: 3377
Joined: Fri Mar 30, 2007 10:04 am

Post » Sat Nov 28, 2015 1:47 am

Ditto.

However, it is also a dichotomy. Do we really respect even fellow humans to the utmost degree? Do we always treat every person the way we would like them to treat us? Sadly, no.

We are hypocrites of the greatest degree, all of us humans.

Roanne Bardsley
 
Posts: 3414
Joined: Wed Nov 08, 2006 9:57 am

Post » Fri Nov 27, 2015 5:47 pm

Or, and I'm just spitballing here, it works something like this: a Mr Gutsy has a directive; in Cerberus' case it's something along the lines of "kill all intruders" or whatever. Now these intruders somehow find a way to either add an extra combat inhibitor or just alter his. His base programming is still saying "KILL ALL INTRUDERS", but he can't, because there's this extra bit that stops him now. His base programming hasn't been overwritten, just blocked. So he can recognize a foreign directive stopping him from functioning as he should, and he bemoans this, almost like a BSoD, but he can't really do anything to fix it himself because he doesn't have any desires.

So souls come from God, yes? Nothing measurable, but it sets humanity apart/above?

D LOpez
 
Posts: 3434
Joined: Sat Aug 25, 2007 12:30 pm

Post » Sat Nov 28, 2015 1:38 am


I guess. This is why animals are pets, serve as food, and are below humans.
brenden casey
 
Posts: 3400
Joined: Mon Sep 17, 2007 9:58 pm

Post » Sat Nov 28, 2015 3:01 am

Homo sapiens is taxonomically classified as an animal.

Enny Labinjo
 
Posts: 3480
Joined: Tue Aug 01, 2006 3:04 pm
