These guys nailed it.
I'd sit down and have a discussion with my car, if it convinced me that it WANTED to roam the streets and it wasn't just in some fancy binary loop retreading the same streets over and over I WOULD let it go.
I'd have no problem using machines that didn't have the capacity to yearn for freedom as laborers, but it seems unethical to use an android that is capable of wanting to be free imo
Skynet wanted to preserve its own life too, just saying.
I wonder if this is how the debate over giving women the vote went?
"Women should get to vote too!"
"No."
"Why not?"
"They're not men."
"But-"
"THEY'RE NOT MEN!"
"I-"
"Not men!"
I actually should have used emancipation here. That'd be much more on point.
It can simulate being self-aware. It can know what it should want for itself.
Maybe, but an advanced android may simply act and look more human without ever being a sentient AI. The definition of an android is a robot or synthetic organism designed to look and act like a human, especially one with a flesh-like body; nothing in that definition requires AI. There is a world of difference between a machine designed to mimic humans and a sentient machine intelligence.
Is Harkness any more sentient than President Eden because of his artificial form?
No and their appearance is utterly beside the point.
There's no real way to know that, so I'd give it the benefit of the doubt.
Ever hear of solipsism? There's no real way to disprove it, afaik.
The brain does not run on programmes; that framing deliberately tries to draw a parallel with machines. The human senses are not programmes, they are direct biological reactions to actual stimuli.
Perhaps I should enunciate further: I really don't care if a machine is capable of an illusion of self-awareness. It's a machine, I am a human, ipso facto I consider humans more important. It's really as simple as that; a machine is built to serve people. If it starts gaining dangerous levels of intelligence, it should be corrected to protect other people, who are more important, the same way a dangerous animal is destroyed even if it was only acting on its natural instinct or whatever. It's not about science or some of the genuine misanthropy that other people have posted. I am on team human, and whatever is against that I don't want around; a machine of unknown intelligence or capabilities operating around people is one of those things.
For full disclosure I don't even like the idea of machine labour replacing humans in the first place but that's not the topic at hand.
And that was just as stupid an argument as saying robots are mechanical, thus they have no rights.
[censored]s sake, let's not derail the topic.
Lol, to be human is not necessarily about being orderly or good. You are experiencing life. You know what you are experiencing and you can assume you know what I'm experiencing based on our similar DNA. You can't know that a robot is experiencing anything at all. Maybe the robot gets some 0's and 1's that tell it to put on a show for you. It's not capable of real consciousness. It can have a sophisticated learning program, but none of that would allow it to actually feel anything real. It's simulating wants for the sake of an audience.
Lol, and all these people literally die because you don't have enough people to build a fence to keep out a pack of Deathclaws, lol.
You aren't necessarily your senses. You feel and experience, but that doesn't define you. You only know what you experience. You can guess that someone with similar DNA can experience something similar, but you can't possibly know what a "sentient" computer is experiencing. Even if they are experiencing "life" you can't relate to what it is they are experiencing because it's probably something completely different from what you are experiencing.
Some people can't even relate to other living alien races, so I don't see how most people could empathize with something that's not even organic.
This guy just gets it, man. At some point we have to come to terms with the fact that no matter how much they think, want, and crave to be human, they just aren't. If they cannot accept this and become dangerous to all or some humans, they must be dealt with however is best. I'm really surprised; I feel like half of the android supporters don't feel as strongly for the Ghouls as they do for the androids, yet Ghouls were actually human once.
You could spend your entire life convinced you were human, but if you're not actually an organic being, yousa robot, baby!
But your question only addresses androids, when the majority of sentient machines in the fallout universe are not androids.
Star Wars has been touching on this subject for years, and many times there have been moments of droids lobbying for their own rights. Heuristic processors, when left to their own devices, can develop quirks, habits, even personalities. One of the major flaws of the R5-series astromech was that it permanently imprinted the first quirk or habit it learned, and that imprint could not be erased from the processor, making the line vastly unpopular with many.
But on topic... Some will say they are machines, they are property. And this is true, to an extent. I've traveled with enough robots and synthetics in my gaming career to look at them as... buddies. My time with ED-E alone, among others, has given me a sort of... sympathetic attitude towards them. I tend to treat them as comrades rather than servants.
So is ownership unethical? No, no I would say it's not. It's how you treat them, ultimately, that brings up the issue of ethics.
Yeah, pretty much. I am a human, part of human society even. I care about its safety and propagation. Machines are created to serve us. Now, one can philosophise till the cows come home about the nature of sentience, but I consider myself a more practical person. If something is a possible threat to human society, such as a sentient machine with intelligence and capabilities we cannot understand (especially in today's connected world), then yes, I think it's a risk. Our safety is more important than musing on the nature of the universe; people's potential safety comes before thinking that doesn't accomplish anything.
Not to sound like I'm baiting, but this argument I'm seeing reminds me of how settlers discussed the "Indian problem" during the expansion period.
Fun fact: Native Americans were not considered a "people" of the United States until the mid-to-late 1800s.