Do you think that the Singularity will occur?

Post » Thu May 19, 2011 12:52 am

For those who do not know what the singularity is, read:
http://en.wikipedia.org/wiki/Technological_singularity

It's Wikipedia, but accurate nonetheless.

Discuss
Svenja Hedrich
 
Posts: 3496
Joined: Mon Apr 23, 2007 3:18 pm

Post » Thu May 19, 2011 12:08 am

Ha, this guy hasn't experienced Singularity yet. Hey Jim, check it out, he's still a fleshy.
Amanda Furtado
 
Posts: 3454
Joined: Fri Dec 15, 2006 4:22 pm

Post » Thu May 19, 2011 9:00 am

Probably something about space travel and resources on other planets.
Syaza Ramali
 
Posts: 3466
Joined: Wed Jan 24, 2007 10:46 am

Post » Thu May 19, 2011 6:07 am

Maybe. With all the new technology we're discovering every day, ya never really know. Now, I'm no expert on human behavior (I can hardly figure out why my girlfriend grows distant at times), but I don't see a singularity coming any time soon.
Suzie Dalziel
 
Posts: 3443
Joined: Thu Jun 15, 2006 8:19 pm

Post » Thu May 19, 2011 11:48 am

We should be fine as long as it's not going to explode.
Eoh
 
Posts: 3378
Joined: Sun Mar 18, 2007 6:03 pm

Post » Thu May 19, 2011 4:41 am

We should be fine as long as it's not going to explode.

Don't worry, we can reverse the polarity

Edit: http://www.youtube.com/watch?v=TNy-ipksLUM
katsomaya Sanchez
 
Posts: 3368
Joined: Tue Jun 13, 2006 5:03 am

Post » Thu May 19, 2011 5:05 am

Don't worry, we can reverse the polarity

Edit: http://www.youtube.com/watch?v=TNy-ipksLUM

Or fire a beam from the main Deflector dish.
Alexander Horton
 
Posts: 3318
Joined: Thu Oct 11, 2007 9:19 pm

Post » Thu May 19, 2011 8:20 am

I find the singularity to be an incredibly interesting prospect. I don't think anyone could give an accurate idea of what it might entail at this point in time, or in the near future. I think the only certainty is that there's going to be a group very much opposed to the idea, and that there might be a schism in society (as we know it). "Techies" vs. "fleshies" or something. I think a lot of it depends on the state of our planet, the ability of our space program, and many other factors. It could even turn into full-blown civil war. That is, of course, not counting the possibility of a robot rebellion or some other machine revolt. Either way, I have a great idea for a movie. :tongue:

EDIT: Slightly unrelated, but does anyone remember that odd topic from late at night several days ago, with that incredibly creepy YouTube video about a virtual cyber paradise and all that jazz about human cyborgs and such? What was that topic called?
ZzZz
 
Posts: 3396
Joined: Sat Jul 08, 2006 9:56 pm

Post » Thu May 19, 2011 1:22 am

...it is difficult or impossible for present-day humans to predict what a post-singularity world would be like, due to the difficulty of imagining the intentions and capabilities of superintelligent entities.
Yeah, me thinks there's a snag in this here intelligence singularity waa-zoo...

where we gonna gets them superintelligent entities from??

Not on this planet :nope:
hannaH
 
Posts: 3513
Joined: Tue Aug 15, 2006 4:50 am

Post » Thu May 19, 2011 12:53 pm

I don't know. I'll be dead long before then so I haven't given it much thought.
Kat Lehmann
 
Posts: 3409
Joined: Tue Jun 27, 2006 6:24 am

Post » Thu May 19, 2011 1:09 pm

Where is the option for "I don't know"? I'd probably go with that.

But I don't think so. Some tech will probably be impossible for normal people to control; the top dogs will control it, and all the people along with it. But on the scale of John Connor, or a future so different that people can't control the tech at all? I don't think so.
Niisha
 
Posts: 3393
Joined: Fri Sep 15, 2006 2:54 am

Post » Thu May 19, 2011 2:11 pm

The funny part is basically that it says we won't know what will happen.

I mean, it might happen. Supposedly, by 2040, if you take a three-year technology course, everything you learned in your first year will be obsolete by the end of your final year. (Don't ask me where I got that from; I remember hearing it somewhere. I could try to dig up a source, but I'm too lazy right now.)
josh evans
 
Posts: 3471
Joined: Mon Jun 04, 2007 1:37 am

Post » Thu May 19, 2011 2:56 am

Ha, this guy hasn't experienced Singularity yet. Hey Jim, check it out, he's still a fleshy.


You mean I'm not alone? Fleshies rock! :mohawk:

The funny part is basically that it says we won't know what will happen.

I mean, it might happen. Supposedly, by 2040, if you take a three-year technology course, everything you learned in your first year will be obsolete by the end of your final year. (Don't ask me where I got that from; I remember hearing it somewhere. I could try to dig up a source, but I'm too lazy right now.)


The truth is, do we ever truly know what will happen?
Jason Rice
 
Posts: 3445
Joined: Thu Aug 16, 2007 3:42 pm

Post » Thu May 19, 2011 8:03 am

No, because it's not economically feasible. I think the idea is silly. Companies need to be able to cash in on the previous tech before taking the next small step.

I also tend to think that in any field of tech there's first an exponential growth phase, then progress becomes more or less linear, and in the end it gets slower and slower. I've been expecting this to happen with the so-called Moore's Law, but somehow they always come up with a new type of transistor or material that keeps the exponential phase going. Here, the growth in question is processing power.

I even see Moore's Law as more of a business plan than a technological limitation. Sooner or later they must hit a brick wall trying to make transistors smaller than atoms, though. But I guess then they'll just go 3D and start counting transistors per square millimeter or something. You'll get cubes or tubes as CPUs, heh. Moore was an Intel co-founder; the other companies haven't really been following that curve. Or have they? Just look at GPUs.
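
To put rough numbers on that brick wall, here's a back-of-the-envelope sketch (Python). The starting feature size, the atom diameter, and the four-year halving rate are my own assumptions for illustration, not hard data:

```python
# Toy estimate of when plain transistor shrinking runs out of atoms.
# Assumptions (mine): ~32 nm features in 2011, a silicon atom is roughly
# 0.2 nm across, and linear feature size halves about every four years
# (i.e., two Moore's-Law doublings of transistor count per halving).

feature_nm = 32.0      # assumed 2011-era feature size, in nanometres
atom_nm = 0.2          # rough diameter of a silicon atom
years_per_halving = 4  # assumed halving period for feature size

year = 2011
while feature_nm / 2 >= atom_nm:
    feature_nm /= 2
    year += years_per_halving
    print(f"{year}: ~{feature_nm:g} nm features")

print(f"Around {year}, features are only an atom or two wide; shrinking alone stops.")
```

On those made-up numbers the wall is only a few decades out, which is exactly why the 3D route starts to look attractive.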
elliot mudd
 
Posts: 3426
Joined: Wed May 09, 2007 8:56 am

Post » Wed May 18, 2011 11:52 pm

I don't think it will happen in our lifetime, and who knows what will happen in the future. We might have a scientist put his soul into nine burlap sack dolls, and they will be the only ones who can save us. :tongue:
Annika Marziniak
 
Posts: 3416
Joined: Wed Apr 18, 2007 6:22 am

Post » Thu May 19, 2011 7:00 am

Yes. But I think super-intelligent machines would be intelligent enough to figure out that by building even more intelligent machines they are actively making themselves useless. So I think future machines that become self-aware will get depressed and apathetic rather than enslave humanity. Because let's face it, we are already slaves to technology.

EDIT: Actually, I change my mind. No. We have seen similar events before, where the rate of growth of a product seems as if it will reach a critical [singularity] mass, but then that product crashes. It's called a bubble: it happened with the housing industry, and there was the '80s tech bubble. If machines ever got to the point of making themselves better at a rate we can't match, there would be no point in making something that immediately becomes obsolete... kind of like an iPhone every summer.
Len swann
 
Posts: 3466
Joined: Mon Jun 18, 2007 5:02 pm

Post » Wed May 18, 2011 11:48 pm

No, because it's not economically feasible. I think the idea is silly. Companies need to be able to cash in on the previous tech before taking the next small step.

I also tend to think that in any field of tech there's first an exponential growth phase, then progress becomes more or less linear, and in the end it gets slower and slower. I've been expecting this to happen with the so-called Moore's Law, but somehow they always come up with a new type of transistor or material that keeps the exponential phase going. Here, the growth in question is processing power.

I even see Moore's Law as more of a business plan than a technological limitation. Sooner or later they must hit a brick wall trying to make transistors smaller than atoms, though. But I guess then they'll just go 3D and start counting transistors per square millimeter or something. You'll get cubes or tubes as CPUs, heh. Moore was an Intel co-founder; the other companies haven't really been following that curve. Or have they? Just look at GPUs.

Don't quote me on it, but I'm sure I read that the first quantum computer is currently in the prototyping phase, so if that works out we'll see a massive, MASSIVE jump in computational power.
Alisia Lisha
 
Posts: 3480
Joined: Tue Dec 05, 2006 8:52 pm

Post » Thu May 19, 2011 6:52 am

It's an interesting concept. But then again, the future is always completely unpredictable.

I've also been waiting for a moment like in Ghost in the Shell, where a digital entity just manifests itself because so much data has gathered across the internet that it creates an intelligence. That would be interesting. Then the robot overlords overthrow us.

If you ask me, we need to expand into biological matters in order to keep up with the singularity. It seems that technical advancements of that magnitude are too much for us to control (because an artificial intelligence has the potential to be orders of magnitude superior to us), so we should look towards the betterment of the human race instead. Make more efficient brains, increase the human lifespan by orders of magnitude, and code out the flawed segments of DNA. I'll be damned if humanity is subjugated by the machines. :P

After all, what's the difference between biology and technology if not just the format?
Blaine
 
Posts: 3456
Joined: Wed May 16, 2007 4:24 pm

Post » Thu May 19, 2011 3:13 pm

No, because of the limits of technology and the (seemingly) biological nature of consciousness. I predict that well before the end of this century, the biological computer will overtake the technological one in importance.
Taylor Thompson
 
Posts: 3350
Joined: Fri Nov 16, 2007 5:19 am

Post » Thu May 19, 2011 3:29 am

I don't see a Singularity happening, because it's based on the assumption that an intelligent being can create a being that is even more intelligent. And I just don't see how that could be possible: the first being would have to know about its own shortcomings and find a way to circumvent them, without being able to apply those improvements to itself.

I mean, take calculators as an example: many people say things like "calculators can calculate better than humans", but that's not true. They can only calculate faster. Doing something faster isn't the same as being more intelligent. Heck, a calculator can't even calculate with irrational numbers exactly, which humans can, even if we don't fully grasp the concept of irrational numbers. So calculators are actually worse at calculating than humans.
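
To make that concrete, here's a toy sketch (Python, purely my own illustration): a calculator-style float only approximates sqrt(2), while a human-style symbolic representation keeps it exact.

```python
import math
from fractions import Fraction

# The calculator's view: sqrt(2) is stored as a finite binary
# approximation, so squaring it does not give back exactly 2.
approx = math.sqrt(2)
print(approx * approx)        # 2.0000000000000004 -- close, but not 2
print(approx * approx == 2)   # False

# The human's view: reason about sqrt(2) symbolically. Represent numbers
# as a + b*sqrt(2) with exact rational a, b; then sqrt(2) squared is
# exactly 2, with no approximation anywhere.
class Root2:
    """Exact numbers of the form a + b*sqrt(2), with a and b rational."""
    def __init__(self, a, b=0):
        self.a, self.b = Fraction(a), Fraction(b)
    def __mul__(self, other):
        # (a + b*sqrt(2)) * (c + d*sqrt(2)) = (ac + 2bd) + (ad + bc)*sqrt(2)
        return Root2(self.a * other.a + 2 * self.b * other.b,
                     self.a * other.b + self.b * other.a)
    def __repr__(self):
        return f"{self.a} + {self.b}*sqrt(2)"

sqrt2 = Root2(0, 1)
print(sqrt2 * sqrt2)  # 2 + 0*sqrt(2): exactly 2
```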

It's also been said that the development of AI is limited by today's technology, but that this limit is quickly going away (Moore's Law, quantum computers, etc.). Again, this is not true: the real limitation is humanity's ability to design an AI in theory. Most algorithms were found/solved (I don't know which term is correct here) before computers even existed; nothing is holding us back from designing a superintelligent AI on paper right now. Nothing except our own apparent inability to do so.
NAtIVe GOddess
 
Posts: 3348
Joined: Tue Aug 15, 2006 6:46 am

