Is ownership of advanced androids unethical?

Post » Sat Nov 28, 2015 12:17 am


Bad philosophy is bad.

"I don't understand this thing! Kill it with fire!"

And then it tries to kill you back out of principle.
JR Cash
 
Posts: 3441
Joined: Tue Oct 02, 2007 12:59 pm

Post » Fri Nov 27, 2015 10:19 pm

I'd like to point out that the President of the United States (and your hearts), John Henry Eden, was a ZAX computer that gained self-awareness. Yet even though it could think independently of its programming, it was still bound by that programming: it had an override (much like A3-21), and a sufficient logic bomb could cause it to break down.

Theodore Walling
 
Posts: 3420
Joined: Sat Jun 02, 2007 12:48 pm

Post » Sat Nov 28, 2015 6:53 am

It is a bad, flawed viewpoint, and one with an almost horrific track record in history.

Man fears what he does not understand. Doubly so if he can't kill it and make it go away.

Many tragedies and atrocities of human history can be attributed to humans lashing out at things they just couldn't be bothered to try to understand.

victoria johnstone
 
Posts: 3424
Joined: Sat Oct 14, 2006 9:56 am

Post » Sat Nov 28, 2015 3:31 am

Well, that was your problem. Human beings are human beings; machines are not, and they have the capacity to threaten human life and civilisation wherever they might be in the world.

What can I say, the world sucks.

It's not all about philosophy and pondering the mysteries of humanity. A sentient computer, a genuinely unknowable intelligence, is a threat precisely because it is unknowable, and it should be stopped. It's about pragmatism, and securing what we have and the lives that are here today.

Makenna Nomad
 
Posts: 3391
Joined: Tue Aug 29, 2006 10:05 pm

Post » Sat Nov 28, 2015 12:24 am


More of a flaw of first-generation sentience. Think of it as two minds in one body. The sentient half of Eden became aware, capable of reason, and guided the Enclave.

Due to the underlying nature of ZAX, however, a logic bomb still affected his core system.


What would happen to a second-generation sentient machine, though? One that had been designed by others, created with purely mechanical goals in mind?

That is the sentience we should speculate on.
Danny Warner
 
Posts: 3400
Joined: Fri Jun 01, 2007 3:26 am

Post » Fri Nov 27, 2015 5:26 pm

Someone has seen/read I, Robot one too many times.

Ice Fire
 
Posts: 3394
Joined: Fri Nov 16, 2007 3:27 am

Post » Fri Nov 27, 2015 10:34 pm

While watching Ex Machina, I found myself siding with Oscar Isaac's character. I find myself thinking that machines exist to serve us - they're machines, not humans - so we can do whatever we want to them.

Then I think: isn't that dehumanization? And then I think, wait a minute, they're robots. You can't dehumanize something that was never human in the first place. So why should I care? I guess if a machine has achieved sentience, I'd give it rights. Or, I think I'd prefer that. Then I start wondering if it's ethical to create robots without sentience (on purpose) to serve as mindless machines that exist exclusively to serve.

I don't know. I'm gonna vote no... but I'm torn. Why not just make an AI that isn't sentient so we can avoid this ethical dilemma?

EDIT: Also, however sentient an artificial intelligence appears, how can we know whether it's legitimately sentient?

Amy Masters
 
Posts: 3277
Joined: Thu Jun 22, 2006 10:26 am

Post » Sat Nov 28, 2015 5:23 am

If anything, Eden is closer to a natural evolution of computers toward sentience: it was a pure accident, and not totally beyond the bounds of his programming. The Institute, on the other hand, knew they had a problem with Synths gaining sentience and built Harkness anyway.

CYCO JO-NATE
 
Posts: 3431
Joined: Fri Sep 21, 2007 12:41 pm

Post » Fri Nov 27, 2015 8:17 pm

No, it isn't, unless we can find evidence to suggest that the synths we're finding in Fallout 4 are second generation and up. But dialogue suggests that they are all first generation, gaining sentience through hiccups in their software but still bound by their programming.

Mike Plumley
 
Posts: 3392
Joined: Wed Sep 05, 2007 10:45 pm

Post » Fri Nov 27, 2015 7:05 pm

But how do we know we're really sentient OOOOoooooo....? /sarcasm

I think that's a genuine question, but the only answers given are rambling philosophical stuff that isn't really reflective of how people actually are and doesn't answer anything. It's meant as pure deflection.

You could probably never tell; it's just layers upon layers of programming.

alyssa ALYSSA
 
Posts: 3382
Joined: Mon Sep 25, 2006 8:36 pm

Post » Fri Nov 27, 2015 4:43 pm

Voted "no" of course...

I see no difference between a computer, a robot, a pencil, a keyboard or an android... these are all things.

I think this is true -> humans > animals > bugs > things. I cannot imagine that situation -> humans = things > animals > bugs XD

Sheila Reyes
 
Posts: 3386
Joined: Thu Dec 28, 2006 7:40 am

Post » Sat Nov 28, 2015 3:48 am

What puts humans at the top?

Jennifer May
 
Posts: 3376
Joined: Thu Aug 16, 2007 3:51 pm

Post » Sat Nov 28, 2015 8:44 am

The fact that YOU ARE A PERSON, and it is in YOUR INTEREST to see that they stay there. How or why would you even think otherwise? Where does this brand of thought even come from? You are a human being, enjoying the inventions of a human civilisation that has not been replicated anywhere that we know of. That's why humans are on top.

Rhiannon Jones
 
Posts: 3423
Joined: Thu Sep 21, 2006 3:18 pm

Post » Fri Nov 27, 2015 7:47 pm

Personally? If it can feel oppressed, then it's probably unethical to oppress it.
Amy Gibson
 
Posts: 3540
Joined: Wed Oct 04, 2006 2:11 pm

Post » Sat Nov 28, 2015 6:50 am

That's not a good reason.

That's kinda what I wanna ask you and yours.

This hypothetical specifically puts forward us creating something that IS our equal, though. Why shouldn't it receive the same rights? Because it simply happens not to be us? That's... not a reason.

Laurenn Doylee
 
Posts: 3427
Joined: Sun Dec 03, 2006 11:48 am

Post » Sat Nov 28, 2015 5:21 am

And if it wasn't created by layers and layers of programming, but started as a relatively blank neural net and developed in response to stimuli the way we do?

Dan Endacott
 
Posts: 3419
Joined: Fri Jul 06, 2007 9:12 am

Post » Sat Nov 28, 2015 6:18 am

Humans put humans at the top, by virtue of being human.

DAVId Bryant
 
Posts: 3366
Joined: Wed Nov 14, 2007 11:41 pm

Post » Sat Nov 28, 2015 3:42 am

You can't understand why we don't think humans are the most important?

Yes, humans have done a lot; we've also destroyed a lot. There are thousands of species no longer in existence because of us. This planet is dying because of us, and very soon it will not be able to sustain us. Where will we go once everything is dead? What will we eat? Placing humanity above all else is what [censored] this planet up in the first place.

So where is the logic that we're more important? Take humans out of the equation and the planet would go on perfectly happily. Keep us on it and we'll destroy everything. Which is more selfish? Which is ethical?
Chantel Hopkin
 
Posts: 3533
Joined: Sun Dec 03, 2006 9:41 am

Post » Sat Nov 28, 2015 12:49 am

And that makes it right? Or fair? The reasoning is so strange to me, so insular. Do this because it benefits you. Don't ask why, just do it. Don't ask what it is that merits it, just maintain it. Don't let anyone else in, because they are not you.

I don't get it. I genuinely do not get it.

I'm going to bed.

[Bounty][Ben]
 
Posts: 3352
Joined: Mon Jul 30, 2007 2:11 pm

Post » Sat Nov 28, 2015 5:14 am

Yeah, this is the way most people think, guys. We don't remove ourselves from the equation, look at the entire world, and judge it on fairness. That's human nature; hell, it's just biological nature. Sorry to break it to you.

So what if the planet would be better off if we weren't here? Why should I care about the planet if I'm not actually alive? I personally care about the things I am actually invested in, like my family and friends and their happiness, not vague esoterica like how much better off birds or small mammals might have been if humans weren't around.

Marine Arrègle
 
Posts: 3423
Joined: Sat Mar 24, 2007 5:19 am

Post » Fri Nov 27, 2015 7:35 pm

They're machines created by humans. They're built for a purpose, most likely hard or dangerous labor. They're no different than my computer, smartphone or a bulldozer.
loste juliana
 
Posts: 3417
Joined: Sun Mar 18, 2007 7:37 pm

Post » Fri Nov 27, 2015 7:09 pm


Yes, but the question at hand isn't "what is human nature?", it's "is it unethical?".

Ethically, the way the majority views this world and assigns value is just wrong. I don't deny that it is what it is, but it's still bloody wrong.
Lauren Denman
 
Posts: 3382
Joined: Fri Jun 16, 2006 10:29 am

Post » Fri Nov 27, 2015 11:26 pm

Oh no, I totally agree. It's a difficult problem to solve.

Really, we first need to define what we mean by "consciousness" and "sentience" before we can begin an actual discussion about robotic rights. Which, you know, good luck with THAT philosophical rabbit hole.

Either way, it's not an easy question. But for instance, regarding AI slavery: I think if an AI can understand that it's in "slavery", know what the concept is, and realize it doesn't want to be in bondage, then serious consideration needs to be given to what exactly you are "enslaving". Someone can tell themselves all day that "it's just a metal hunk of junk", but if an AI becomes truly as sentient as a human, then why is it okay to enslave one and not the other?

Why does being made up of meat and blood make us better than a form of life that's not?

Humans can be conditioned and controlled incredibly easily. I wouldn't put that down as a requirement for life.

Edit:

At some level we have to define the transition from simple robot to sentient construct.

Most of us have smartphones with very dumb AI systems. Nobody is arguing that such systems should have "rights". Similarly, a Mr. Gutsy doesn't have anywhere near the cognitive awareness that a human does. It's just a simple robotic system.

I'm talking about incredibly advanced AIs that have actually achieved honest-to-goodness sentience: ones that pass the Turing Test and are all but indistinguishable from humans, aside from their obvious physical composition.

Nuno Castro
 
Posts: 3414
Joined: Sat Oct 13, 2007 1:40 am

Post » Sat Nov 28, 2015 4:00 am

You are what you are experiencing right now, not your memories of experiencing. You talk about chemical reactions. You are conscious. You are aware of the fact that you are experiencing right now. You are aware that you are thinking. A computer is not aware of anything. It can tell you it's aware of thoughts and mimic awareness for your sake, but what's going on under the hood is completely different: it's looking for the acceptable answer to give you, not coming up with something spontaneously like an organic mind does.

Even a mouse can think about thinking, which makes it more self-aware. A computer program cannot. If you ask it "What are you thinking?", it will either tell you the truth (nothing) or lie and tell you something that it calculates is appropriate.

What if the synthetic is running from slavers into a bear's territory? Would it be okay if the bear killed the android for encroaching on its territory? What if that dangerous animal encroached on the territory of an android and the android killed the animal?

According to you, the android and the bear are both alive.

I would say that a human being believing it can kill an animal is like a mental illness. Can you really blame someone who has a serious phobia or mental illness? Even if it's a popular mental illness, it's still a mental illness, especially when no one is actually trying to cure these people of it.

CHARLODDE
 
Posts: 3408
Joined: Mon Apr 23, 2007 5:33 pm

Post » Fri Nov 27, 2015 5:39 pm

Sure, I think I'm at the top. But my cat feels just as strongly that she is. (From her point of view I exist to serve her needs, feed her, and pick up her poo, after all.)

Or what about cockroaches? They've never had it better than since humans came onto the scene, and our species is but a blip compared to their time on the planet.

If you ask me, it's all a matter of perspective. But who's more important is irrelevant anyway, I think.

At the end of the day, if it is capable of wanting freedom, then encroaching on that freedom is slavery, I think. A Mr. Handy probably isn't spending its downtime weighed down by existential questions about its lot in life. A more advanced AI that doesn't want to only serve us? I feel that would be different.
Suzie Dalziel
 
Posts: 3443
Joined: Thu Jun 15, 2006 8:19 pm
