Which of course is so different. Humanity can't possibly find a way to exploit that as well, can it?
Point remains. If we keep holding to this humanity-first philosophy, our species will fail.
That's like saying that a scientist isn't "pulling his own weight" because he isn't a carpenter. Those androids were invented and built by someone who had scientific knowledge.
I doubt it will be anywhere near 50/50 with the general public unless there is an android origin story available or there are certain perks you get for siding with the Railroad.
If your choice is loot vs. morality like it was in The Replicated Man quest, then it's a no-brainer for me. That would be sad, but I can only assume that they will do what's necessary to make the Railroad a decent choice.
Being a good inhabitant of Earth has nothing to do with being alive. Being a human is just about experiencing. If we die off, then so be it. Maybe an android could be a better steward of the planet. That doesn't mean it's alive.
You've sidestepped my question a little bit here. The bear doesn't know what a warning shot is. In the game a Yao Guai is certainly not going to just run off. Why should it? It's the apex predator. You are the prey. It's the natural order of things.
Yes, in the case of a bear it's just about educating people. They can stay out of the bear's territory, but what about rats and roaches in New York? How long can you last staring at one of those sewer rats in a studio apartment before pressuring the landlord to get an exterminator? That is the mental illness I am talking about. You are supposed to be able to be unbothered by something like that, lol.
Yeah, we've got so much other stuff that can wipe us out that I would bet on A.I. actually solving these problems long before they become a problem themselves.
I voted "it depends".
Without succumbing to banality, what is a moral action?
For me, I take a somewhat hedonistic and consequentialist approach: if the agent in question can experience differing states of mental well-being via palpable sensations such as pain and pleasure, then the morality of an action is predicated on its ability to abate or catalyze those sensations.
As John Stuart Mill elucidates: "Utility, or the Greatest Happiness Principle, holds that actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness."
If these androids are sentient and able to experience a gamut of pains and pleasures, one must consider actions that fall within this gamut to be moral or immoral acts, such as subjecting an android to thralldom.
Even for primitive and generally repellent life forms like spiders, few if any would say that putting your cigarette out on a living spider is moral.
And that is a life form with no real cognizance and only fettered perception, never mind an android that can ponder the anguish and turpitude of slavery.
I know people are going to say "androids are just hunks of metal, computers on legs so to speak", but we have to remember that all of our emotions, our pleasures and pains, are ultimately (without going into the quantum level) just neurological impulses fired from our brain to various electrophysiological constructs like the nervous system.
Ultimately, what people experience, feel, do and think comes down to little electrical impulses and flashes of light, much like a man-made circuit.
Yeah, you spend a million dollars on parts and the thing wants to run off and pick daisies. It's defective.
Lol, those things could be worth 100,000 in bottle caps alone. I doubt you would blink an eye at selling one of those things for scraps.
Maybe we're asking the wrong question... What if all that makes up a human life, the fundamental basics of human consciousness, is as simple in execution as a computer program?
What if a human brain could be wiped of this 'OS' and replaced with an android OS setup... and no phone jokes here... In the Fallout universe we have seen robots with synthetic silicon brains and with real brains, and we have had another character have his brain scooped out like an ice-cream sundae in an auto-doc, and that only pissed him off. What is it to be human and without a brain?
Wow.
I don't quite know what to say after watching that!
*mind blown
That video got me pretty emotional the first time I watched it.
There's no way I can't empathize with something that acts like it's alive and has thoughts and emotions. How empathetic we are towards the androids in FO4 depends on how well they're characterized in the game. In real life, however, I think most of us, when presented with something that looks like a person and is begging and crying for its life, would be compassionate. Yes, even those who post "meh, it's just a toaster on legs." Those who would have no sympathy in such a case are probably the same ones who have no sympathy for people in real life.
Wow, that was amazing ... and made me feel a little bit sad.
I tend to lean toward 'yes' even before watching the video.
I cannot vote. There is no "Kill them all and I don't care who sorts them out" option.
Electrical current goes through organic tissue, not a computer chip. You experience, therefore you are. A computer can only simulate experience, no matter how sophisticated it is. It can't think; it can only calculate. The girl in the video was only acting that way for his benefit. There was no consciousness there.
It's called being logical. He basically sent out a defective product that will eventually end up growing bored and killing the customer.
I also think it's weird that people have some sort of inherent idea that being organic automatically makes us more 'worthy' or superior.
If that kind of advanced AI is just "a toaster on legs" then we're also just sacks of red goo.
I first watched that on one of the Twilight Zone marathons they have every so often.
I thought it was pretty poignant as well.
Electrical currents go through everything*, pretty much all the time.
Do you realize that neurons are effectively binary? They either fire or they don't.
A flip-flop stores a single binary state; string eight of them together and you have a byte.
It is rather feasible that, with the proper technology (which would entail not only the ability to scan human brains in the utmost detail, down to the neuron level, but also miniaturization techniques that let us recreate that structure in electronics at the same level), a human brain could be duplicated with electronics.
Whether or not that supports any notions of sentience or sapience is beside the point.
It is possible to build a brain. Just not plausible, currently. (There's a toy sketch of the binary-neuron idea at the end of this post.)
Good thing FO has SCIENCE! on its side.
To anyone suggesting that it wouldn't be actual life: what if the android was kicked online by the bio-electrical impulses of something that was alive? That's all that life really is. Passing a torch, or rather a spark, into the fetus, until its own systems are online and able to do for themselves.
*Some insulators are really good at stopping current, though that electricity, unless given a ground, just clings to them.
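To make the binary-neuron analogy above concrete, here's a minimal sketch in Python (purely illustrative; the weights and threshold are made-up example values, not anything from real neuroscience or from the game) of a McCulloch-Pitts-style threshold unit: it either fires or it doesn't, the same all-or-nothing behavior the flip-flop comparison is pointing at.

```python
# Minimal McCulloch-Pitts-style threshold "neuron" (purely illustrative).
# Inputs and output are binary (0/1); the weights and threshold below are
# made-up example values, not taken from any real neuron model.

def fires(inputs, weights, threshold):
    """Return 1 if the weighted sum of the binary inputs reaches the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A tiny AND-like unit: it only fires when both inputs are active.
weights = [1.0, 1.0]
threshold = 2.0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", fires([a, b], weights, threshold))
# Prints 1 only for (1, 1): an all-or-nothing response, like a logic gate.
```

Real neurons are far messier than this (spike timing, chemistry, analog effects), so treat it as a cartoon of the analogy rather than a claim about biology.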
I have to say, until she started panicking I was on board with the video, but something about her performance and the animation just doesn't gel enough, and it takes me out of the piece.
If it wasn't for the cropping, I'd be so happy with this channel. Unfortunately, it completely destroys the framing, which undercuts the narrative.
Why does the name Isaac Asimov suddenly come to mind, or Blade Runner, when I am reading this thread?
True for me; if we assume it's gonna be standard Beth protocol with neatly defined factions, I am going to be hardcore Railroad. Nothing in this thread has convinced me it's not wrong to enslave something that is functionally human, regardless of whether they actually are. Especially since that seems to be the fundamental, underlying reason: they just aren't us. They might be exactly like us, but they're not, so that makes it okay. You'd think like the entirety of human history would help not repeat this kind of thinking, but nooooooo...
Well this thread is definitely interesting...
I am a big fan of the WH40k universe, and I think that the way humanity interacts with other sentient beings in that fictional universe is far more likely than the overall "peace, prosperity and wealth for everyone" attitude of Star Trek.
Maybe my opinion is gonna be unpopular or will be viewed as inhumane, but imo it is like this:
If the Android is not sentient, then there is no point in giving it "rights". No matter what you do, it is limited to its programming and doesn't even have the ability to make decisions on its own, and thus can't make use of its rights.
If the Android actually IS sentient, then it is not a construct anymore, but its own form of life - sentient life - in its own right.
Where is the inhumane part in that? Remember 40k....
Given the tools for the job, and let's just assume they are intelligent enough to procure them, an Android can reproduce almost instantly. Unlike (most) biological life, it doesn't even need a mate of the same species.
As a machine, its body is no more than an interchangeable tool; it can be optimized for any given task and is rather easily repaired compared to ours, making it physically vastly superior. Especially in a world like Fallout.
Mental superiority is hard to measure. I assume an android would lack in the creative department, but it would probably have a lot better access to data and more "computing power".
Another sentient lifeform that is superior and produces fully able members of its species almost instantly (while we take about 16 years for that, kids gotta grow and learn too after all)?
The risk that the other lifeform might decide that resources are wasted on us or declare US the cattle or anything else of that kind is a given since they are sentient and can make their own decisions.
And given their advantages we wouldn't stand much of a chance.
Do not take that risk - destroy them. Every single one of them; remember that a single one could repopulate its entire "species" with itself as the template.
Maybe mourn the loss of what could have been if it went well, but never regret that you decided to "play it safe".
So no, I don't see much of a problem with owning an android.
I actually do think that the question in the OP will prove to be very important in-game.
Why do the compounds the electrical impulses are acting on make a difference? What's the difference between a human mind calculating and an advanced technological brain calculating? Organic tissue operates almost identically to the binary systems of computers; to differentiate one from the other is completely illogical if the outputs are the same.
If you met a sentient, intelligent spacefaring species, but it wasn't based on carbon and therefore didn't use the same compounds in its structure as Earth-based life, would it still be acceptable to enslave it? Be wary of it, yes. Defend yourself if it attacks you? Also acceptable. But to immediately dismiss it as a lesser being because of its non-carbon-based chemical reactions is just plain dumb.
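To put the "if the outputs are the same" point in concrete terms, here's a toy Python sketch (my own illustration, with made-up function names, not anything from the game or from neuroscience): the same logical operation implemented two completely different ways, once with plain boolean logic and once with weighted sums and thresholds. Judged only by inputs and outputs, you can't tell which one did the work.

```python
# Toy illustration of substrate independence: two very different internals,
# identical external behavior. Everything here is made up for the example.

def xor_logic(a: bool, b: bool) -> bool:
    """XOR built from plain boolean operations."""
    return (a or b) and not (a and b)

def xor_threshold(a: bool, b: bool) -> bool:
    """The same XOR built from weighted sums and thresholds (a tiny 'circuit')."""
    x, y = int(a), int(b)
    h1 = 1 if (x + y) >= 1 else 0   # fires if at least one input is on
    h2 = 1 if (x + y) >= 2 else 0   # fires only if both inputs are on
    return (h1 - h2) >= 1           # on, unless both inputs were on

# From the outside the two are indistinguishable: same inputs, same outputs.
for a in (False, True):
    for b in (False, True):
        assert xor_logic(a, b) == xor_threshold(a, b)
        print(a, b, "->", xor_logic(a, b))
```

If the behavior is identical, what's doing the computing underneath is an implementation detail; that's all the substrate-independence argument here amounts to.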