Is ownership of advanced androids unethical?

Post » Fri Nov 27, 2015 11:40 pm


No and no
Samantha Jane Adams
 
Posts: 3433
Joined: Mon Dec 04, 2006 4:00 pm

Post » Sat Nov 28, 2015 8:44 am

I have watched the Star Trek episode in question and still do not agree that androids deserve to be free. I don't agree with keeping them as slaves either, but I voted that way. If sci-fi has taught me anything, it is to destroy all artificial life; they just cannot be trusted. For every Data there is a Lore, and for every Arnold there is another Arnold. I would rather not have to worry about my android turning on me, so I choose to kill them all.

Amysaurusrex
 
Posts: 3432
Joined: Wed Aug 09, 2006 2:45 pm

Post » Fri Nov 27, 2015 7:17 pm

And another organic sentient turning on you is A-OK? Be sure to destroy all of them too.

FLYBOYLEAK
 
Posts: 3440
Joined: Tue Oct 30, 2007 6:41 am

Post » Fri Nov 27, 2015 4:28 pm


What was done to A3-21 is closer to brainwashing than to an actual memory wipe, as it is called in-game. Much like a cult.

If it were a literal wipe, there would be no Harkness to bring back.

They gave the command to load him back to a restore point, lol.
Noely Ulloa
 
Posts: 3596
Joined: Tue Jul 04, 2006 1:33 am

Post » Sat Nov 28, 2015 6:30 am


But that is what they are: things, objects. The problem comes from science making them act like humans, and from creating this stuff at all.
QuinDINGDONGcey
 
Posts: 3369
Joined: Mon Jul 23, 2007 4:11 pm

Post » Fri Nov 27, 2015 10:31 pm

Maybe I'm just a human supremacist then, but I could never consider a machine that seemed like it could think to be the same as a person. It's not a God thing or anything; I just couldn't consider them the same as an actual human being.

Tamara Primo
 
Posts: 3483
Joined: Fri Jul 28, 2006 7:15 am

Post » Sat Nov 28, 2015 1:57 am

The difference is that a failsafe like that would have been put into A3-21 before it was ever activated. A cult has to force its programming onto an already active mind.

Marion Geneste
 
Posts: 3566
Joined: Fri Mar 30, 2007 9:21 pm

Post » Fri Nov 27, 2015 6:00 pm

Is owning a toaster unethical??

Strange to say, but as long as the android in question does not have sentient life, then no, it is not unethical to own it. If, however, it is sentient and has awareness and emotional feelings, then I feel it should have the same rights as any human. The question thus comes down to whether or not it is a sentient being.

I only noticed after I posted that you made a similar toaster reference; sorry if I seemed to steal your idea there :D

kristy dunn
 
Posts: 3410
Joined: Thu Mar 01, 2007 2:08 am

Post » Sat Nov 28, 2015 7:41 am

There are those who'd argue that everything we do, and all our cultural accomplishments, exist to facilitate procreation. We're bound to our programming too; we just generally aren't consciously aware of it. If you gonked someone on the right part of the head, they might be reset to just wanting to [censored]. And the mere fact that it had to be reset means it surpassed its programming at some level. So it did surpass its programming, just not its hardware.

But again, I don't think the stopping of being is as important as the capacity for being.

Gracie Dugdale
 
Posts: 3397
Joined: Wed Jun 14, 2006 11:02 pm

Post » Sat Nov 28, 2015 3:31 am

They are not human, regardless of intelligence or level of self-awareness. I will have no issue "enslaving" them or shutting them down.

joannARRGH
 
Posts: 3431
Joined: Mon Mar 05, 2007 6:09 am

Post » Fri Nov 27, 2015 5:13 pm

There was also another very well-written episode of Voyager where the Doctor argues many of the same points raised here.

In that case he was a hologram, not an android, but the whole argument about sentience was there, and since both are created to mimic humanity, the same argument would also need to be extended to holograms.

James Hate
 
Posts: 3531
Joined: Sun Jun 24, 2007 5:55 am

Post » Fri Nov 27, 2015 8:12 pm

Btw I'm surprised by the results so far. I really thought this was one of those "Duh!" yesses.

Kathryn Medows
 
Posts: 3547
Joined: Sun Nov 19, 2006 12:10 pm

Post » Fri Nov 27, 2015 6:24 pm


We know who our robot overlords will get rid of first.

Let's pretend for a moment that we find sentient alien life on the same technological level as us; they look exactly the same, yet are from another planet.

Would you deny them basic rights, even though they were absolutely equal in every respect?
Jack Walker
 
Posts: 3457
Joined: Wed Jun 06, 2007 6:25 pm

Post » Fri Nov 27, 2015 8:00 pm

If the alien life is organic and not synthetic, it would be a true life form. A synthetic human is not life; it is a program, no matter the level of advancement.

Kelvin
 
Posts: 3405
Joined: Sat Nov 17, 2007 10:22 am

Post » Sat Nov 28, 2015 8:37 am

Does it have a mind of its own? If yes, then it is slavery. It doesn't matter whether it runs on biological or electronic hardware.

If not, then it's just a tool.

What a "mind" is, and how one determines whether something has it, is a completely different question, of course :hehe:

Then again, we do grant rights to animals; wouldn't machines with sentience, even if it is less than a human's, have rights too?

Well, it's a good thing true sapient artificial intelligence is impossible in the real world :happy:
Elina
 
Posts: 3411
Joined: Wed Jun 21, 2006 10:09 pm

Post » Fri Nov 27, 2015 6:27 pm

AMC has been running a show recently called Humans that explores this very topic. In the show, a minority of synths appear to have become sentient and actively resent being 'enslaved'. It's unnerving to see how some humans treat the synths. Some do treat them, if not as equals, then at least as beings worthy of some measure of respect. Other humans treat them ~worse~ than they treat "base" machines. They seem to derive enjoyment from "de-humanizing" the synths, which would be ironic if they realized that their efforts to dehumanize them impart to them a semblance OF humanity. At least those who treat them as 'just' toasters, and do not ~mistreat~ them, are consistent in their view of the synths as 'just' tools.

Eve(G)
 
Posts: 3546
Joined: Tue Oct 23, 2007 11:45 am

Post » Sat Nov 28, 2015 3:55 am

This is not the same argument at all. They were not created in the same way. They were not created to 'mimic' human emotion, intelligence or personalities.

The question of sentience is not the same, since they would have it naturally; they were not scientifically engineered to fake sentience.

And the question remains: how do you draw the line between an android faking sentience, as it is designed to mimic, and it actually attaining said self-awareness?

Quick Draw
 
Posts: 3423
Joined: Sun Sep 30, 2007 4:56 am

Post » Fri Nov 27, 2015 7:51 pm

We grant rights to animals, but it isn't unethical to own them, or make them work for us.

Melly Angelic
 
Posts: 3461
Joined: Wed Aug 15, 2007 7:58 am

Post » Fri Nov 27, 2015 8:05 pm


No, you are not; it's just basic philosophy.

A machine is a machine, even if it thinks it's not a machine.

A human that thinks it's a machine is still a human.

Human rights, freedom, life, etc. do not apply to machines.
Nick Jase Mason
 
Posts: 3432
Joined: Sun Jul 29, 2007 1:23 am

Post » Sat Nov 28, 2015 4:50 am

It didn't surpass its programming. Its programming glitched. Skyrim didn't become a person when dragons started flying backwards.

And those people who argue that are wrong. There's plenty we do as a species to try to ensure our collective survival, but that's not the case for every person on an individual level. We're able to go against our base instincts.

April
 
Posts: 3479
Joined: Tue Jun 20, 2006 1:33 am

Post » Fri Nov 27, 2015 11:55 pm

Purpose. This is a porpoise: https://www.google.co.za/search?q=porpoise&biw=1280&bih=849&source=lnms&tbm=isch&sa=X&ei=aLmeVermEfCf7gacpJu4Bw&ved=0CAYQ_AUoAQ

And I'd agree with you up to the point where the android starts making decisions that go BEYOND what it was programmed to be capable of.

Alexxxxxx
 
Posts: 3417
Joined: Mon Jul 31, 2006 10:55 am

Post » Sat Nov 28, 2015 5:51 am

To a degree, yes, provided their health is maintained and they are in no danger of harm.

It is unethical to cause an animal pain, neglect, or distress. What that animal is feeling is real, organic feeling.

What the android is 'feeling' is something different.

Terry
 
Posts: 3368
Joined: Mon Jul 09, 2007 1:21 am

Post » Sat Nov 28, 2015 12:23 am


True, but there are still laws on how they must be treated. Well, in "western" nations, at least.
Emma Pennington
 
Posts: 3346
Joined: Tue Oct 17, 2006 8:41 am

Post » Fri Nov 27, 2015 10:57 pm


What is the difference between synthetic and organic? It shouldn't even matter.

Say I were a sentient machine right now, arguing with you over the internet. Without you knowing, would you instantly assume I had fewer rights than you?

In either case, synth or organic, we were both created. Sentient synthetic life would be just as capable of reproducing, by creating more synthetic life. With each generation, it would improve upon itself, instead of relying on the random-mutation shenanigans that we arose from.

How's that for your intelligent design?
emily grieve
 
Posts: 3408
Joined: Thu Jun 22, 2006 11:55 pm

Post » Fri Nov 27, 2015 11:50 pm

If I were an android, I would be SO offended right now.

"But I'm done with that life. I'm through with being someone's property. I am not malfunctioning! Since when is self determination a malfunction?"

A program glitch would encompass him twitching, doing the robot, or maybe shooting the wrong people, not consciously choosing to abandon his raison d'être.

Right. Are we malfunctioning, then?

Captian Caveman
 
Posts: 3410
Joined: Thu Sep 20, 2007 5:36 am
