Is ownership of advanced androids unethical?

Post » Fri Nov 27, 2015 4:51 pm

But that feeling is fake, a programming quirk, a way to mimic humanity.

Where is the line between, in effect, acting and real, genuine feeling?

User avatar
Vincent Joe
 
Posts: 3370
Joined: Wed Sep 26, 2007 1:13 pm

Post » Sat Nov 28, 2015 8:39 am

Why would it bother to mimic being upset by oppression?

You can apply the exact same question to other people.

User avatar
Elisha KIng
 
Posts: 3285
Joined: Sat Aug 18, 2007 12:18 am

Post » Sat Nov 28, 2015 8:20 am

Any minute now, Pete will step in and say "200 posts". Although in reality this is only an approximation :bonk:

User avatar
stacy hamilton
 
Posts: 3354
Joined: Fri Aug 25, 2006 10:03 am

Post » Sat Nov 28, 2015 2:35 am

Just because a human can be classified as "alive" does not give it free rein to do what it likes. The value of life, to me, is determined by its actions, not its genetics. A human that destroys is not worthy of being saved over an android that builds and sustains. It's human, sure, but its life is worthless in my eyes if it cannot be humane.

A bear certainly does know what a warning shot is. There's many a tale of people warding off polar bears in the Arctic with warning shots and loud noises. Sure, some animals won't respond, and defending yourself is natural, but if you wander into Yao Guai territory you should expect to be killed. Ethically, you should have left it alone. If it encroaches on you and ignores the warnings, then you'll have to kill it.

The rats may be in your home, but there are options to relocate them rather than simply kill them. The options are there, whether the majority chooses to use them or not. Since we're talking about ethics, human nature cannot play a role. Ethics is about being better than your prehistoric programming. The ethical solution will always involve sustaining life unless it is absolutely impossible to do otherwise, regardless of the hardships it takes.

User avatar
Vera Maslar
 
Posts: 3468
Joined: Wed Sep 27, 2006 2:32 pm

Post » Fri Nov 27, 2015 9:50 pm

I voted it's not unethical. I see them as machines.

User avatar
Vicky Keeler
 
Posts: 3427
Joined: Wed Aug 23, 2006 3:03 am

Post » Sat Nov 28, 2015 1:00 am

It is an open forum; anyone is welcome to explain the difference.

As to your question, I do not know. It would depend on the context of its programming.

Let's say it is an android and is used as a spy. It would need to show certain degrees of human emotion to fit in and not arouse suspicion whilst conducting whatever its objective is.

There are a variety of reasons why an android may need to show compassion, sadness, anger, joy, etc. Human emotions make sense in the context of its programming.

User avatar
BethanyRhain
 
Posts: 3434
Joined: Wed Oct 11, 2006 9:50 am

Post » Sat Nov 28, 2015 3:46 am

So why don't you?

Naturally, but why would it bother to mimic these emotions in the context of wanting freedom? An android created for cleaning or agriculture might have been programmed to show dismay if someone spilled a drink or its crops failed, but why would it choose to use that capacity for expression to clearly express a desire for freedom? That's not part of its initial programming. Just by wanting, it is no longer just a machine.

User avatar
Erika Ellsworth
 
Posts: 3333
Joined: Sat Jan 06, 2007 5:52 am

Post » Sat Nov 28, 2015 3:44 am

Where's the option to select: ""Humans" who cannot respect other intelligences should not be counted as humans, or at least as non-faulty humans, and should not have the same rights as real non-faulty humans"? :P

User avatar
Claire Vaux
 
Posts: 3485
Joined: Sun Aug 06, 2006 6:56 am

Post » Fri Nov 27, 2015 7:20 pm

Just by wanting, it becomes potentially dangerous. It's not bound by its initial programming - that's terrifying.

If top scientists are scared of artificial intelligence, then I'm scared of artificial intelligence - especially when the AI in question can want.

I'm all for "bug fixing" to remove whatever caused the AI to transcend its programming and to ensure that future models lack this malfunction. Though I wonder: is it unethical to create Robotic Servants that lack sentience when you had the ability to grant it? Do we have an obligation to create life if possible? Is it unethical to make mindless husks to do work for us?

User avatar
James Baldwin
 
Posts: 3366
Joined: Tue Jun 05, 2007 11:11 am

Post » Sat Nov 28, 2015 1:10 am

Post limit.

User avatar
Gaelle Courant
 
Posts: 3465
Joined: Fri Apr 06, 2007 11:06 pm


Return to Fallout 4