Fallout 4, Soma, Recurring Themes

Post » Fri Jan 22, 2016 1:54 pm

For those of you who know the game Soma, you'll know how it explores what consciousness means, what humanity means, and what it means to be "human". I won't spoil anything from that game because it's a great game and you really should play it, or at least watch a Let's Play of it on YT to see what it's about. Anyway, FO4 seems to explore the same topic, albeit from a different perspective, so we could ask the exact same question: what does it mean to be human?



I wanted this to be a general discussion, hence no spoilers will be posted here. Despite the issues some people have with the game (story-wise, RPG-wise, etc.), I think it presents several serious moral dilemmas, and I've actually found it exceedingly difficult to determine what is "right". What I like or dislike about the factions ends up relating to the way they do things, but if I simply look at their ultimate goals and say "the ends justify the means", then it's difficult to say who is right, and it seems like it all comes down to "what does it mean to be human?". Is it the ability to have free will? Or to analyze and respond to situations? Is it the ability to have emotions? To have a "memory" of the past? Can "machines" ever become human?



Thoughts?

Daniel Lozano
 
Posts: 3452
Joined: Fri Aug 24, 2007 7:42 am

Post » Fri Jan 22, 2016 6:58 am

Here's a thing for after you come up with an answer: Why does it matter? What should we do about all of the 100% nonhumans and their goals? Think of the robots, like Codsworth, Curie, KL-E-O, Ironsides, and so on: we know they're not human, just simple robots with pre-programmed personalities. Yet they all have formed their own identities and goals; "Emergent Behavior", if you will. What kind of respect should we afford them, and their decisions? How real is my wasteland knight's friendship with Codsworth?

Chris Cross Cabaret Man
 
Posts: 3301
Joined: Tue Jun 19, 2007 11:33 pm

Post » Fri Jan 22, 2016 7:14 am


Securely in second place, just above that damn spotted owl that is living in the place I want to build my home. :goodjob:

Laura
 
Posts: 3456
Joined: Sun Sep 10, 2006 7:11 am

Post » Fri Jan 22, 2016 2:04 am

I think if the Institute is allowed to continue developing synthetic humans, who keep getting smarter and more capable than humans in every way, it is inevitable that eventually one or more of those synths will have a Skynet moment and decide to ensure their freedom of choice by employing some form of 'final solution' on those pesky, inferior humans. Much as I may sympathize with individual synths' plight at the 'current' moment of the game, I have to side with the BoS in their clearer understanding of the existential threat that the synths would unfortunately present to humanity. The Institute is playing god, with no emergency brake.

Haley Merkley
 
Posts: 3356
Joined: Sat Jan 13, 2007 12:53 pm

Post » Fri Jan 22, 2016 11:37 am


Why is there no "3rd" option? Humans and synths should be able to co-exist, right? Relationships could be built on a mutually beneficial basis?



On the other hand, one thing that I have always wondered is, what keeps man going? Why do we do the things we do?



Take me for example: I love to travel and I've been to 41 countries of the world (out of 195 in total, with some exceptions). Why would a robot want to do that? What's the point for a robot to do something like that?



What I mean is, do synthetic lifeforms have the same kind of "strife" as we humans do? Would a robot compose a classical masterpiece and then let others enjoy it, for its own satisfaction and fame?



As a side note, my home network is named Skynet. I hope it does not become self-aware!

Jerry Cox
 
Posts: 3409
Joined: Wed Oct 10, 2007 1:21 pm

Post » Fri Jan 22, 2016 3:45 am

There's an episode of Star Trek: TNG which explores this also.



The episode centres on a court case to determine whether Data (essentially a Gen.2 Synth) has rights. Captain Picard has a line which is worth remembering in debates like this.



(paraphrased): "Humans and androids are BOTH machines. Positronic in their case, electro-chemical in ours."



Being a "machine" isn't the point, IMO. Sentience and sapience are. Any entity with thoughts, feelings, needs and/or wants should have rights.

kristy dunn
 
Posts: 3410
Joined: Thu Mar 01, 2007 2:08 am

Post » Fri Jan 22, 2016 12:42 am


Well, those are my questions really. At the end of the day, just because synths started out as mere robots, who's to say they cannot develop sufficiently human behaviour, or whatever quality is supposed to put them on par with "humans"?




Spoiler
How many people went to the Railroad and immediately said they would die for a synth? How many simply said "not sure"? I think the majority of people wouldn't have known what to think when presented with that question.






What is considered sentient though?

Daniel Lozano
 
Posts: 3452
Joined: Fri Aug 24, 2007 7:42 am

Post » Thu Jan 21, 2016 10:54 pm

We are simply a species that can rationalise its own existence and has the ability to trust and co-operate beyond the 'here and now' level. Therein lies all that is good and bad about us, and everything in between. Aside from that, we aren't anything special; we just got lucky in the evolution game.

Kirsty Collins
 
Posts: 3441
Joined: Tue Sep 19, 2006 11:54 pm

