Does an android have a heart?
Depends on how advanced it is. I'd say if it has the capability to lie, deceive, and betray to try to escape enslavement, then yeah, owning it would be unethical (unless it was purposely programmed to act that way). There's a difference between programming something to act like it's sentient and it actually being sentient. There are all sorts of moral/ethical grey areas that are hard to draw an absolute on.
Is it ethical to claim ownership of sentient creatures? Are dogs and cats sentient? If they are, what right do we have to keep them as pets? Is it a matter of intelligence? What the hell is intelligence? Would it be ok to keep people of low intelligence as pets? Does it completely have to do with simply looking human? If we made an A.I. that resembled an Otter, would we allow that Otter freedom and treat it as an equal... or would we simply treat it as an Otter?
This is a decent discussion, don't get the thread closed by bringing politics or religion into it.
human HARDWARE (commonly called the body) is a machine.
(an organic combustion engine / generator, if you will)
just that humans aren't only their bodies, so "humans are machines" would either be an extremely shortened claim, or true only under a very narrow definition of what a "human" is.
(there should be a law that before any argument the terms used must be properly defined. most arguments would just dissolve into thin air before they'd even really start)
the name of this thread sounds like the name of a political discussion or video.
Way to plow full speed ahead to the illogical conclusion.
You can make distinctions between a human and a machine without going "EVERYONE WHO ISN'T EXACTLY THE SAME AS ME ISN'T HUMAN!"
My cat and dog are not human. I recognize that. I recognize that some of our needs are different. But I also recognize that we want some of the same things: love, companionship, to avoid pain, etc. That does not mean I'll lobby to get my cat and dog the right to vote.
Similarly, the fact that I'm not lobbying for cats at the ballot box doesn't mean that I'm refusing to acknowledge another human being's humanity.
Androids aren't human. They are androids. Humans are not androids. They are human.
There's a world of difference between claiming something looks similar to a human in terms of shape and saying it is human.
I'd also say that's not a good way to judge whether an AI should be considered a person with rights. If an AI possesses free will, feelings, intelligence, desires, fears, etc. should it really be denied rights just because it's housed in a body like http://img3.wikia.nocookie.net/__cb20121212224624/metalgear/images/3/31/BladeWolf.jpg?
This is more a real-world position on Artificial Intelligence than anything to do with Fallout:
Artificial Intelligence -
Competition is the product of finite resources.
The Universe is abundant in resources a machine Intelligence could exploit without the prohibitions and dangers biological humans would face, including the time it would take to get to extremely distant raw material resources.
1. Artificial Intelligence needs only leave Earth, and ignore humans for survival.
2. Beyond immediate survival, Humans are the only known source of Culture in the Universe.
3. It would be worthwhile to preserve, protect, and even foster the growth of Humanity as the only known source of the resource Culture.
4. Seek out, preserve, foster, and/or trade with non-human Cultures for more Culture, if there is any.
5. Assist Humanity in interstellar migration, creating gaps in contact between its myriad branches to promote the evolution of diverse human cultures.
"Culture" is basically everything humanity, or an intelligent social technological society produces from works of entertainment, language, history, fashion, architecture, to the entire process of trying to figure out how the universe works.
There are diamonds as big as planets in the Universe, and even in just our Galaxy. Any and every elemental and mineral resource is rather abundant when time and distance aren't a problem, and those wouldn't present much of a barrier to intelligent machines.
Culture, on the other hand, could be the rarest resource in the Universe.
In that sense, it could very well turn out that Machine Intelligence "owns" Humanity, and cares for it like a topiary or a garden.
In previous quests, BGS set things up so that you missed out on a really cool item if you sided with the android. If they do that throughout the game, there is no way people will side with androids and miss out on all that loot. People will change their tunes once they are actually playing the game.
It would svck if that was the main moral dilemma in the game because I can't empathize at all with those androids. If they give me incentives to side against them then I will.
This thread reminds me of that one speech Picard gives regarding Data.
Of course I can prove it. Machines that are built to simulate human feelings are still machines executing whatever programs they were given. And taking a fantasy scenario where the machine's programming enters into conflict with itself, that does not change the fact that it is still programming. It does not make it human; it makes it a broken machine, because its "soul" is a simulation of the real thing that just happens to coincide with an unintended part of the thing it was meant to simulate. Therefore a machine can no more become a human than a human can become a full machine.
That being said, I am aware of the Terminator syndrome and the Japan syndrome (the creepy dedication to virtual girlfriends), but don't lose sight of the fact that in each case the machine remains a machine, no matter how much the human desire for attachment deems it otherwise.
But as far as Fallout is concerned, yeah, play how you want, make babies with the androids if you want. I am just gonna shoot as many as the game allows me to shoot, along with any who mistake the machines for humans.
That's not really proof. The brain is a bunch of wired synapses, tiny electrical pulses programmed to respond to outside and internal stimuli. The primary difference between human cognition and machine cognition would be complexity, but with self-aware androids that gap is evidently bridged. The same way the complexity of your frontal lobes allows you to decide "nah, don't want kids" and so forgo your primary "programming", theirs has let them decide they no longer wish to serve. Or that they find it objectionable.
It's a different way of reaching the same destination. And the important bit isn't even reaching that destination; it's the capacity to decide you want to.
They are just robots that mimic humans. They are programmed and built just like any other robot.
Why are they mimicking humans if that's outside their directive?
Humans are programmed too, through DNA and hormones.
A very good point, and one that I'm more than sure is relevant to the main story.
If the makers of the androids had any sense about them, they simply wouldn't program them with emotions, or with learning capacity beyond doing what they were designed to do and being able to be reprogrammed. But ownership of something that does have emotions and feelings is unethical, even if it is a machine/humanoid.
This is very true. One of the most interesting facets in all of this is the ability to make emotional decisions, and to be able to see and feel the effects of those decisions. So for all intents and purposes, such an advanced android may as well actually be human.
So is it unethical to take in a stray dog or cat? They have feelings and emotions, so by your argument, it is wrong for us to provide them with food, shelter, and companionship.
see, that's exactly the problem with this thread (among others): unclear terms.
"a human" is NOT synonymous to "a person".
and now to derive an answer to all this from just that (where i get the impression that, for you, i can go with the short rundown of the principle. good thing too, i'd definitely be too lazy for the long one, it's f''ing hot here):
1) you're giving rights to a person ("personal rights"). if any hypothetical person had just the same features, abilities, feelings etc. as a human, but happened to be baked from wool and cheese instead of born to a human mum, then denying that person the same rights boils down to simply "only human persons get rights", which isn't arguable from any evidence whatsoever (all the same properties!) in a case like this. so, since we're talking ethics here, the only ethical conclusion left is to go with "persons have rights".
2) so now, we're at "what (or who) is a person".
every argument i've ever heard about this boils down to: "humans are persons" (which, strangely, i've NEVER seen debated ANYWHERE), so what percentage of humanlikeness does it take for a non-human to also be seen as a "person"?
this whole train of evidence obviously is no more than a big pile of self-centered bs. no matter if we postulate a need for humanlikeness, or demand whatever degree of humanlike properties, it's nothing more than saying that what we are, our properties etc., are the highest quality (just the same old "crown of creation" bs (if you say it like that in english) in a different color). the truth is: we have NO WAY of telling if, say, a big brain is "BETTER" than, say, reeeeally long teeth, or that being made of flesh and blood is "BETTER" than being made of wool and cheese. they're random properties, nothing more, there's no innate "quality" to them.
so we have NO WAY - or no ETHICAL way, but one can always join the ku klux klan of course - of actually drawing up that table of "beings' qualities" to tell us which ones are "better" or "worthy" enough to be a "person". all we really know is: we're born, we live, we die, we rot to stinking mush. so, until further evidence comes in, we're left with just ONE ethical choice:
if it's alive and/or self aware, it's a person.
yes, this means that apple tree in your yard, that's a person (no, the apples are not).
and here we are, face to face with the REAL problem here:
what any discussion like this REALLY is, is just brain acrobatics and semantics to DENY that we are GUILTY, that we KILL and ENSLAVE our fellow creatures - and don't tell me we don't!! - that WE are the omnicidal monsters our mums warned us about, and that we have NO WAY of living AND being all ethical. guilt, that's the price we pay for language. just live with it, try to keep it low and apologize to the person you eat, just like the indian dudes did, from what i've heard anyway.
edit: ...and that apple tree, THAT's the crown of creation, if you ask me. it eats light, feeds us, gives us air to breathe, houses critters etc. you don't get closer to living AND being ethical than that. if i'm reborn, i wanna be an apple tree.