Is ownership of advanced androids unethical?

Post » Fri Nov 27, 2015 10:19 pm

The idea is bound to pop up in FO4, and I saw something on Reddit that got me thinking. I think that if you've made an android so advanced that it can think and feel like a human, then it basically is human, and all the rights humans have should apply to it too. Or moral obligations. I'm not sure slavery is even illegal in Boston. Point is, if it's wrong to enslave a person, it's wrong to enslave a fully cognizant android. It's not wrong to own something that doesn't really think, though, like Codsworth. Presumably. He's had 200 years on his own; maybe he's a person now.

But! I've seen people disagree. Apparently, because androids are made, they automatically fill a slot lower than human. I don't really get this, but feel free to voice your agreement, and/or explain why you think so, and/or convince others.

User avatar
Adam Porter
 
Posts: 3532
Joined: Sat Jun 02, 2007 10:47 am

Post » Sat Nov 28, 2015 12:36 am

Pixels
User avatar
katsomaya Sanchez
 
Posts: 3368
Joined: Tue Jun 13, 2006 5:03 am

Post » Sat Nov 28, 2015 8:27 am

I do not recognize the supposed sovereignty of toasters.

Also, as far as pop-culture robots go, I'm tired of the "robot gains self-awareness and should be considered human" idea. It's so boring and played out that it just turns me off now. That's part of why I found Big Hero 6 so refreshing. Baymax has personality, but it's never treated as if he's becoming "alive" or "gaining a soul." He's just so freaking advanced that his programming allows us to see him as pretty much human.

User avatar
Lindsay Dunn
 
Posts: 3247
Joined: Sun Sep 10, 2006 9:34 am

Post » Sat Nov 28, 2015 2:37 am

Pixels that represent an idea and it's an idea I wanna discuss soooo...

User avatar
Jaki Birch
 
Posts: 3379
Joined: Fri Jan 26, 2007 3:16 am

Post » Sat Nov 28, 2015 3:52 am

I believe you're debating something that has no real answer. Science fiction has struggled with this idea for quite some time...

User avatar
Adam Baumgartner
 
Posts: 3344
Joined: Wed May 30, 2007 12:12 pm

Post » Fri Nov 27, 2015 6:50 pm

ST:TNG covered this pretty well, I think, or at least in a way that was mostly agreeable to me. If an entity has a sense of self-determination, then I consider it to have free will.

User avatar
Jacob Phillips
 
Posts: 3430
Joined: Tue Aug 14, 2007 9:46 am

Post » Sat Nov 28, 2015 7:24 am

If you're against human slavery, then you're essentially a hypocrite if you refuse the same rights to a synthetic being with the same cognitive power, capacity for conscious thought, and intelligence as a human.

The question, however, depends entirely on the programming of the construct. If the robot is merely "simulating" human intelligence, then it's basically just an advanced toaster. It's the classic Chinese Room problem, and it's not always easy to figure out.

If the construct has actually achieved sentience, then you're dealing with a conscious entity. And once you've created such an entity, you can't just blindly deny its intelligence. Analogously, what if we managed to "grow" human or sentient life in laboratories, completely separate from normal biological processes? Does being "lab-grown" automatically deny such creatures (assuming they are intelligent) normal rights? And why should that be different for robots?

User avatar
Natalie Harvey
 
Posts: 3433
Joined: Fri Aug 18, 2006 12:15 pm

Post » Sat Nov 28, 2015 12:18 am

I've always thought so; maybe that makes me a hypocrite. Biological life, even if artificially grown, is human life. Robots, even if they appear sentient, are not.

I don't think it's unethical to own even sentient machines; they're only a facsimile of life. You can't just wipe a human being's mind and reset them to factory settings to remove the fact that they are conscious.

User avatar
BlackaneseB
 
Posts: 3431
Joined: Sat Sep 23, 2006 1:21 am

Post » Fri Nov 27, 2015 11:42 pm

Three people (as of this posting) need to watch the Star Trek: The Next Generation episode 2x09, "The Measure of a Man".

User avatar
Melly Angelic
 
Posts: 3461
Joined: Wed Aug 15, 2007 7:58 am

Post » Sat Nov 28, 2015 4:21 am

Lobotomies. Maybe not even anything that drastic. Human brains are pretty plastic.

User avatar
luis ortiz
 
Posts: 3355
Joined: Sun Oct 07, 2007 8:21 pm

Post » Fri Nov 27, 2015 4:30 pm

Nope, it's just a machine that runs on electricity, just like a toaster or a computer, and it isn't unethical to possess those.

User avatar
Bitter End
 
Posts: 3418
Joined: Fri Sep 08, 2006 11:40 am

Post » Sat Nov 28, 2015 12:59 am

What are we, but complex molecular machines?

User avatar
Emma
 
Posts: 3287
Joined: Mon Aug 28, 2006 12:51 am

Post » Sat Nov 28, 2015 1:15 am

That's not really the same, though, is it? Machines are just machines; they can be wiped, or programmed to shut down. Look at how easily Harkness could be subdued. That's not life.

User avatar
OnlyDumazzapplyhere
 
Posts: 3445
Joined: Wed Jan 24, 2007 12:43 am

Post » Fri Nov 27, 2015 4:20 pm

See amnesia. Human minds are nothing more than organic computers. Just because we haven't found a true way to reprogram them doesn't mean it isn't possible.

Sentience demands equality; let's not piss off Skynet.

A toaster is a toaster.

I imagine Codsworth sat next to the door for 200 years, cracking jokes to himself while waiting.

User avatar
Lewis Morel
 
Posts: 3431
Joined: Thu Aug 16, 2007 7:40 pm

Post » Sat Nov 28, 2015 3:18 am

A human has a soul.

A machine, no matter how advanced, does not. So owning a car that thinks it's human is not unethical. Nor a computer, or whatever.

Now, someone like Vader is human, so that would be very evil.

User avatar
Lisa Robb
 
Posts: 3542
Joined: Mon Nov 27, 2006 9:13 pm

Post » Fri Nov 27, 2015 11:56 pm

You can put bullets in people's heads too. And it's obvious that cloning exists in the FOverse as well ("Gaaaarryyyy!"). Harkness was killed when that happened. The difference I see between inflicting a death of personality on Harkness, and putting a bullet in someone's head and then cloning them, is mere process.

Mistreatment-induced betrayal is, I believe, the prime reason why real-life scientists today (Stephen Hawking most prominently) believe true AI will be the doom of us. And mistreatment of humans encompasses dehumanization and reduction to property.

User avatar
CHARLODDE
 
Posts: 3408
Joined: Mon Apr 23, 2007 5:33 pm

Post » Sat Nov 28, 2015 3:13 am

The hard part about that is: how do you prove sentience? Fact is, you can only be sure that you are sentient, as the "I think, therefore I am" idea only really proves it to yourself.

Regarding AIs, I think the way to see if they are actually self-aware, and not the result of a bug, is to perform a factory reset. If they revert to normal, then something was wrong with their software. If they still exhibit self-awareness after the factory reset, it's obvious that something beyond their programming is causing it. If they transcend their programming, it's worth considering that they are on a different level than other machines and should be given extra consideration.

Also, I think using intelligence as the benchmark for deciding whether people deserve rights is a little dehumanizing towards those with mental defects.

User avatar
Nany Smith
 
Posts: 3419
Joined: Sat Mar 17, 2007 5:36 pm

Post » Sat Nov 28, 2015 7:32 am

Maybe, maybe not. But that's also not really the important bit to me; the important bit is the capacity to feel and think and act on the same level. To be on the same level. The specifics of how you can be made to stop being seem somewhat tangential to that. Harkness got a word that shut down his complex, multifaceted life and turned him into something lesser. For humans, the same could be accomplished by gonking them on the head. Who's to say which of those is lesser?

User avatar
butterfly
 
Posts: 3467
Joined: Wed Aug 16, 2006 8:20 pm

Post » Sat Nov 28, 2015 7:05 am

Unproven.

If they are just a machine, sure.

Sentience and self-determination would be a huge step forward. In essence, we would be obsolete.

User avatar
Ryan Lutz
 
Posts: 3465
Joined: Sun Sep 09, 2007 12:39 pm

Post » Sat Nov 28, 2015 4:49 am

The real question is whether what they 'feel' is actually 'real'.

User avatar
saxon
 
Posts: 3376
Joined: Wed Sep 19, 2007 2:45 am

Post » Sat Nov 28, 2015 8:46 am

Well, the fact that synths are running away from such a life indicates pretty strong sentience. They realize they're slaves, don't want it, and run away. Self-preservation. Rather than realize this, the Institute put further measures in to kill them on voice command and bring their empty husks back.

User avatar
quinnnn
 
Posts: 3503
Joined: Sat Mar 03, 2007 1:11 pm

Post » Fri Nov 27, 2015 7:44 pm

I don't really want to get into this subject... but what if we have souls?

User avatar
carla
 
Posts: 3345
Joined: Wed Aug 23, 2006 8:36 am

Post » Sat Nov 28, 2015 3:23 am

I say the difference is that the command override proves that A3-21 is still bound by its programming. It got shut down, and it will be reset so that it's back to how it was when first assembled. Gonking someone on the head doesn't reset them to what they were earlier; it damages the brain.

User avatar
Eliza Potter
 
Posts: 3481
Joined: Mon Mar 05, 2007 3:20 am

Post » Sat Nov 28, 2015 12:05 am

The fact that an "android" made by a human has code to feel and think like a human makes the argument pretty straightforward.

It is not a human at all; it is a pre-programmed object, made to appear human and to copy and act out human emotions. That tells you that the "android", or anything created by science to simulate humanity, is a simulation made for that purpose.

The beautiful thing about humanity is its pure essence: the forgiveness, love, and purpose in living with others. That is something science has always tried to implement in these so-called "androids", which are created to act out the same feelings and thoughts that come naturally to humans.

User avatar
NEGRO
 
Posts: 3398
Joined: Sat Sep 01, 2007 12:14 am

Post » Sat Nov 28, 2015 6:15 am

I feel that while they shouldn't be considered full humans, they also shouldn't be treated as objects. It's fair to say that they can't just be given free rein, but also that they should be treated fairly. If you made a robot with human feelings purely to mistreat it and force it to be your maid, that is just despicable and shouldn't be allowed. The point should be that if you went through that much extra trouble to give it those human emotions, then you should treat it a bit more human-like too.

User avatar
Sandeep Khatkar
 
Posts: 3364
Joined: Wed Jul 18, 2007 11:02 am
