PC requirements for CRYSIS 2

Post » Wed Jan 07, 2009 3:45 am

@ bigboulder
You can even buy it without having a PC.
@ Reapling
I'm afraid your RAM will be the problem.

Now seriously: too bad that at the moment there isn't anything really new on the market. Still HDMI (where is DisplayPort, or something newer?), still no OLED screens. Where are the 10-bit color screens? Where is DirectX 12 (since DirectX 11 looks to me like it just corrects what they did wrong in DirectX 10)? And where are the graphics cards that can do real-time ray tracing? (The newest can do it at about one frame per second; I'm not sure how many bounces the rays take in that case.) The processors aren't doing much either. When will we see 16-core 5 GHz processors?

If there were some nice new hardware, the release of Crysis 2 would be a good moment to upgrade.
User avatar
herrade
 
Posts: 3469
Joined: Thu Apr 05, 2007 1:09 pm

Post » Wed Jan 07, 2009 11:44 am

[quote]@ bigboulder
You can even buy it without having a PC.
@ Reapling
I'm afraid your RAM will be the problem.

Now seriously: too bad that at the moment there isn't anything really new on the market. Still HDMI (where is DisplayPort, or something newer?), still no OLED screens. Where are the 10-bit color screens? Where is DirectX 12 (since DirectX 11 looks to me like it just corrects what they did wrong in DirectX 10)? And where are the graphics cards that can do real-time ray tracing? (The newest can do it at about one frame per second; I'm not sure how many bounces the rays take in that case.) The processors aren't doing much either. When will we see 16-core 5 GHz processors?

If there were some nice new hardware, the release of Crysis 2 would be a good moment to upgrade.[/quote]
4 GB is more than enough for 12-man games xD What doesn't seem to get through your head is that the majority of us don't have $3000 to spend, and the ones who do should use that money more wisely. The post above is pathetic; think before you speak.
User avatar
Ebony Lawson
 
Posts: 3504
Joined: Fri Feb 16, 2007 11:00 am

Post » Wed Jan 07, 2009 8:06 am

[quote]@ bigboulder
You can even buy it without having a PC.
@ Reapling
I'm afraid your RAM will be the problem.

Now seriously: too bad that at the moment there isn't anything really new on the market. Still HDMI (where is DisplayPort, or something newer?), still no OLED screens. Where are the 10-bit color screens? Where is DirectX 12 (since DirectX 11 looks to me like it just corrects what they did wrong in DirectX 10)? And where are the graphics cards that can do real-time ray tracing? (The newest can do it at about one frame per second; I'm not sure how many bounces the rays take in that case.) The processors aren't doing much either. When will we see 16-core 5 GHz processors?

If there were some nice new hardware, the release of Crysis 2 would be a good moment to upgrade.[/quote]

You would think this guy went in a time machine, saw the future, came back, and is now demanding 16-core processors with 25-inch OLED screens and full-out ray tracing...

-_-

Please think of reality before you speak. Much of the technology you are demanding is at least 5 to 6 years away.
User avatar
Tha King o Geekz
 
Posts: 3556
Joined: Mon May 07, 2007 9:14 pm

Post » Wed Jan 07, 2009 3:14 pm

Well, you could buy two 6-core CPUs, two 4 GB GPU cards, a 30-inch 2560x1600 monitor, 48 GB of DDR3 RAM at 1900 MHz, and a 1 TB SSD. That would be $8000-$10000 at the least. It would be cool to have a PC like that, but I can't pay that much and would not do it if I could.

I think my PC should run it well.

CPU: i7 2.66 GHz (I did not buy a better fan, so no OC.)
RAM: 6 GB of DDR3 1600 MHz.
GPU: ATI 4870 (It was the best card for under $200.)

That was my 2 cents.
User avatar
Mizz.Jayy
 
Posts: 3483
Joined: Sat Mar 03, 2007 5:56 pm

Post » Wed Jan 07, 2009 1:52 pm

I think you guys are going too high... I just bought two GTS 250s with 1 GB each. I also bought 4 GB of DDR2 6400 (800 MHz), and I already have an Intel quad core. I think that is more than enough to play Crysis 2. Even with one of those cards you can play it at a decent resolution, just not full.
User avatar
Sophie Louise Edge
 
Posts: 3461
Joined: Sat Oct 21, 2006 7:09 pm

Post » Wed Jan 07, 2009 9:45 am

SLI will give you micro lags.

And I don't think we are going too high in specs.
User avatar
Irmacuba
 
Posts: 3531
Joined: Sat Mar 31, 2007 2:54 am

Post » Wed Jan 07, 2009 4:55 pm

I really am interested in knowing these specs as well. I had a Q9450 OC'd to 3.5 GHz with 2x 8800 GTS. I upgraded to a new SilverStone Raven case, so I am ready to support 2x 470s OC'd when this game comes out. I have one OC'd to 850/1700/2100 and am going to pick up the second when the game ships. The thing is, now that I have the first one I don't see any real reason for the second, as no games struggle with it. I know on some games I am getting close to a bottleneck, like DiRT 2 at around 80 fps, but that's just because DX11's HBAO is so CPU-intensive and hence wasteful: you can't even SLI two 400-series cards on an i7 without a bottleneck in those games.

I really am hoping Crysis 2 is revolutionary enough and GPU-intensive like the last one, so that I can actually use and benefit from a second GPU with my CPU, since more GPU-intensive games actually cause less CPU bottleneck than less GPU-intensive games. Remember three years ago, when the original Crysis came out and the only cards out there were the 8000 series, of which you needed something like the top two in SLI to run Crysis at 30 fps? Do we need the same for this one, or since this game is also coming to consoles, is it going to be less revolutionary? I really am hoping for it to be the first game to max out 2x GTX 480s or something, and to drop the stupid DX11 HBAO, because right now if it used it, it would in my opinion look like a decent mockery of BFBC2's graphics, which would be hugely disappointing to me.

Who remembers the revolution the original Crysis brought? We need that again, to push gaming back up and bring back PC necessity from those poor console systems that are holding back the improvement of gaming. Most people still say Crysis has the best graphics of all games, and it's been nearly four years. We can't let this game be anything other than revolutionary in graphics, or in my opinion it's a failure, as gameplay was never its main strength.

So, final question: what will max quality be like and require, and do you think my system with 2x 470s will be able to max the game out? (I am hoping for at least a yes, but barely would be nicer.)
User avatar
Christine
 
Posts: 3442
Joined: Thu Dec 14, 2006 12:52 am

Post » Wed Jan 07, 2009 7:47 am

Oh yeah, and the reason I think DX11 HBAO would be such a failure is that CPUs have not been revolutionary; they have fallen far off the Moore's-law pace. I bought my Q9450 two years ago, and an i7 at the same price I paid for my Q9450 isn't even 2x faster, while Moore's law (as I understood it) says speed should double every 1.5 years. My CPU has barely even dropped in price. So what is DirectX thinking, moving more work to the CPUs? It's the GPUs that are revolutionary; move all the shading and work over to the GPU already and get some new **** out there. I can't tell the difference in almost all DX11 games between DX11 and DX10 anyway, except that my fps drops about 20 frames due to the CPU intensity. Especially in DiRT 2, which is a shame, since they wasted so much time integrating DX11 tech into that game. We really do need a GPU revolution, and I know SM3.0 has the capabilities for developers to use, so I am hoping Crysis 2 will do this. Anyone have any insight?
User avatar
Queen of Spades
 
Posts: 3383
Joined: Fri Dec 08, 2006 12:06 pm

Post » Wed Jan 07, 2009 1:04 pm

Oh yeah, and SLI gives you micro lag only when you have a crappy motherboard that doesn't support your SLI bandwidth well enough. So bigboulder, you should really consider SLI, or your computer will be toast within the year no matter what you've got, unless you don't think you want to play future games.
User avatar
phil walsh
 
Posts: 3317
Joined: Wed May 16, 2007 8:46 pm

Post » Wed Jan 07, 2009 5:10 am

[quote]Oh yeah, and the reason I think DX11 HBAO would be such a failure is that CPUs have not been revolutionary; they have fallen far off the Moore's-law pace. I bought my Q9450 two years ago, and an i7 at the same price I paid for my Q9450 isn't even 2x faster, while Moore's law (as I understood it) says speed should double every 1.5 years. My CPU has barely even dropped in price. So what is DirectX thinking, moving more work to the CPUs? It's the GPUs that are revolutionary; move all the shading and work over to the GPU already and get some new **** out there. I can't tell the difference in almost all DX11 games between DX11 and DX10 anyway, except that my fps drops about 20 frames due to the CPU intensity.[/quote]
Moore's law is actually the doubling of transistors in the same die space every two years; the processing speed doesn't have to double with it. It's hard to make processors twice as fast, while it's easy to make graphics cards twice as fast: just add more cores to graphics cards and they get faster. Doing the same for processors doesn't really work unless the programs running can actually use the extra cores (which is hard to implement compared to graphics cards).
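The doubling schedule described above is easy to sketch numerically. A minimal illustration (the starting count and time spans are made-up round numbers, not real chip data):

```python
# Project transistor counts under Moore's law: the count doubles once
# per doubling period (commonly stated as 2 years, sometimes 18 months).

def transistors_after(start_count, years, doubling_period=2.0):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# With a 2-year period, 4 years means two doublings (4x the start):
print(transistors_after(100_000_000, 4))                       # 400000000.0
# The 18-month variant compounds faster over the same 4 years:
print(transistors_after(100_000_000, 4, doubling_period=1.5))  # ~635 million
```

As the post says, this projects transistor count only; nothing here implies performance doubles on the same schedule.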

If anyone is planning on building a computer for this, I would wait until a month before the game is expected to be released to go out and buy one. The next processor generation from Intel and (maybe) AMD will be out, along with the next graphics generation (the 6xxx series) from AMD (no, that is not a misspelling; AMD has killed the ATI moniker after all these years).

@Reapling, I could build a Core i7 rig for ~$1200 (without monitor). I would go for an AMD or Core i5 rig to play games, though; they're much cheaper for the same fps in games.
User avatar
Max Van Morrison
 
Posts: 3503
Joined: Sat Jul 07, 2007 4:48 pm

Post » Wed Jan 07, 2009 2:48 pm

I'm actually interested to see if the next gen processors will actually be able to play this game without a discrete graphics card, since they will have integrated graphics on the processor itself. Not expecting much, just wondering if it would be playable.
User avatar
Czar Kahchi
 
Posts: 3306
Joined: Mon Jul 30, 2007 11:56 am

Post » Wed Jan 07, 2009 2:23 pm

Highly unlikely. Non-discrete will still suck at gaming for many years to come.
User avatar
Alessandra Botham
 
Posts: 3440
Joined: Mon Nov 13, 2006 6:27 pm

Post » Wed Jan 07, 2009 5:59 pm

One thing: Moore's law is not "double every 1.5 years"; you're confusing the 2-year period with the doubling itself. Secondly, while the transistor-count version is true, it still doesn't hold up. My computer is two years old, and my old Core 2 Quad has 820 million transistors, while the Core i7 950 and all the top i7s have only 731 million transistors, yet they perform much better. Keep in mind I am being kind of friendly here, because Intel really has fallen behind, or technically has not moved ahead at all, in Moore's-law terms. This is a plateau like no other in Intel's history. Technically speaking, in Moore's-law terms the AMD X6 has the most transistors, at 904 million, and that is still only two years after the Core 2 Quad's release. The only improvement in these processors is architecture, allowing them to handle floating-point calculations differently. Now, with those nice architectural improvements, i7s at the same price do perform more like 60% faster than a good Core 2 Quad, so I was comparing performance rather than transistors to be fair, since transistor count really doesn't track performance like it used to. But even at that, Intel is still way too far behind Moore's law to be considered respectable. CPUs are constrained in that regard, since they are limited in size, unlike GPUs, which are still expanding; but that doesn't change the fact that they have improved less than ever these past two, going on three, years, at least until AMD's APUs come out.

Now, for shinanigan: you actually ask a cool question about the future of discrete GPUs. I have no real facts to back this, only theory, but AMD has been talking about their APU technology; the first will come out next year to compete with Intel's next-gen Westmere architecture, and the purpose of the APUs is to integrate the CPU and GPU into one chip. The point is to revolutionize the bandwidth between the CPU and the GPU. Everyone understands the difference between a hard drive and memory, right? Well, think of this more as the difference between memory speed and CPU cache speed. It will improve GPU performance a lot, and might well give the first non-discrete graphics a chance to hold up against discrete cards. This has been tried and failed before, because at the time our transistor technology for CPUs was continuously improving. But now that CPUs aren't improving so much, the next step, instead of always adding more cores to the CPU or the GPU, where they become unbalanced or too separated to take full advantage of each other's speed, is to shrink the CPU rather than improve it, and then improve GPU bandwidth, which would be much more efficient than just adding more cores to the GPU over already-old PCI Express technology. Since this tech has been talked about for over 5 years and held back, I first thought they were having difficulties; but after seeing Intel fail again and again to keep up with Moore's law, I think it's more of a marketing situation. What would be the benefit for AMD, always the cheaper manufacturer, in monopolizing the CPU market? Competition is beneficial, especially when they can sell old architecture for 4 years that they didn't originally even plan to need for profit.
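The transistor counts quoted above can be checked against a strict doubling-every-two-years reading of Moore's law in a few lines. A sketch using the poster's own figures, which I have not verified against actual die specs:

```python
# Compare the transistor counts quoted above against a strict
# "double every two years" reading of Moore's law.
# Figures are the poster's, not verified die specifications.

core2_quad = 820_000_000   # ~2 years before the i7 950, per the post
i7_950     = 731_000_000
amd_x6     = 904_000_000

expected = core2_quad * 2  # one doubling over the two-year gap
print(expected)                      # 1640000000
print(round(i7_950 / expected, 2))   # 0.45: well under the projection
print(round(amd_x6 / expected, 2))   # 0.55
```

On these numbers, even the highest count reaches only about half the naive projection, which is the plateau the post is describing (performance gains came from architecture instead).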
User avatar
lucile davignon
 
Posts: 3375
Joined: Thu Mar 22, 2007 10:40 pm

Post » Wed Jan 07, 2009 2:38 pm

Whoops, sorry: shinanigan didn't ask the question, kraden did. Sorry.
User avatar
Dawn Farrell
 
Posts: 3522
Joined: Thu Aug 23, 2007 9:02 am

Post » Wed Jan 07, 2009 6:25 pm

Just an FYI: Moore's law states that TRANSISTOR count doubles every 2 years, not that PERFORMANCE doubles every 2 years. Also, it's not limited to CPUs exclusively; it covers all types of ICs.

http://en.wikipedia.org/wiki/Moore%27s_law

End of the day, it's just an observation, not a law.
User avatar
Ross Zombie
 
Posts: 3328
Joined: Wed Jul 11, 2007 5:40 pm

Post » Wed Jan 07, 2009 6:23 pm

Okay, thanks. I see that I got confused with others who predicted every 18 months, while you're right that Moore said every 2 years. Still, whether or not it covers all types of ICs, why is it that CPUs have now been unable to keep up with Moore's law, when in the past it looks like they always were able to?
User avatar
Beat freak
 
Posts: 3403
Joined: Thu Dec 14, 2006 6:04 am

Post » Wed Jan 07, 2009 11:57 am

Because, as I said, it's one guy's observation of a trend. It's not cast in stone, and it's pretty phenomenal that it's been so accurate thus far.

Semi-on topic: You don't believe in fortune tellers, do you?
User avatar
Amy Melissa
 
Posts: 3390
Joined: Fri Jun 23, 2006 2:35 pm

Post » Wed Jan 07, 2009 4:49 am

[quote]Because, as I said, it's one guy's observation of a trend. It's not cast in stone, and it's pretty phenomenal that it's been so accurate thus far.

Semi-on topic: You don't believe in fortune tellers, do you?[/quote]

Fortune tellers are all wackos. :D

And yes, Moore's law isn't really a law; it's more like a prediction of how quickly chip technology would progress. Now, instead of doubling transistors, it's more about architecture and the number of cores, to divvy up the workload as effectively as possible.
User avatar
courtnay
 
Posts: 3412
Joined: Sun Nov 05, 2006 8:49 pm

Post » Wed Jan 07, 2009 11:05 am

Are you trying to say that all fortune tellers are fakes? Blasphemy! :P

Too bad superscalar processors don't increase performance all that much. It would be nice to have one huge processor that could process everything at once (like Intel's Hyper-Threading tech) while still being faster for single-threaded tasks.

[quote]Okay, thanks. I see that I got confused with others who predicted every 18 months, while you're right that Moore said every 2 years. Still, whether or not it covers all types of ICs, why is it that CPUs have now been unable to keep up with Moore's law, when in the past it looks like they always were able to?[/quote]
I want to see you double the transistor count of a processor, while making it use those transistors to increase performance, every two years :)
They're only human beings, after all. They are intelligent, but it's very hard to find a way to increase performance while pinned against a thermal and economic wall: they can't add too many transistors, or the money they make per processor goes down, plus the extra heat from more transistors forces lower clock speeds. If only there were a way to switch from x86 in processors to something more parallel, like GPGPU.
User avatar
victoria johnstone
 
Posts: 3424
Joined: Sat Oct 14, 2006 9:56 am

Post » Wed Jan 07, 2009 3:07 am

Mhh... speculating is so much fun...
User avatar
Je suis
 
Posts: 3350
Joined: Sat Mar 17, 2007 7:44 pm

Post » Wed Jan 07, 2009 11:39 am

Actually, if you wanted to count hyperthreading, IPC increase, and overclocking increase, I would say the Core i7 is a sizeable increase over the speed of the Core 2 generation:
25% + 20% + 20% for a 65% increase.
Of course, hyperthreading doesn't get used much, since you need more threads to actually use the extra threads HT gives.
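The 25% + 20% + 20% figure above adds the three gains linearly; if the improvements were independent, they would compound multiplicatively instead. A quick sketch of the difference (the percentages are the poster's rough estimates, not benchmarks):

```python
# Additive vs. compounded combination of the three estimated gains
# (IPC, hyperthreading, overclocking headroom) from the post above.

gains = [0.25, 0.20, 0.20]   # poster's rough estimates

additive = sum(gains)        # simple sum: 0.65 -> "65% faster"

compounded = 1.0
for g in gains:
    compounded *= 1.0 + g    # 1.25 * 1.20 * 1.20 = 1.80
compounded -= 1.0            # -> 0.80, i.e. "80% faster"

print(f"additive:   {additive:.0%}")     # additive:   65%
print(f"compounded: {compounded:.0%}")   # compounded: 80%
```

Either way it is a back-of-the-envelope number; real gains depend on whether a workload can use the extra threads at all, as the post notes.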
User avatar
Alexandra Louise Taylor
 
Posts: 3449
Joined: Mon Aug 07, 2006 1:48 pm

Post » Wed Jan 07, 2009 9:06 am

I agree, kraden, 65% is a sizeable increase, but architecturally it's still too similar to the Core 2 Quad if you ask me; the only real main improvements were hyperthreading, QPI, and SSE4. Other than that, its core design is very similar to the Core 2 Quad series. Look at the 2nd gen of Westmere; that's a huge architecture jump. I know the i7 is a good 65% better, at least for games that support 4-core hyperthreading and with a graphics card that can make full use of it, but it's not even comparable to how much of a jump the Core 2 was from the Pentium D; that was almost a 300% jump in performance, maybe even more. What's most important is that I still believe the GPUs, with their 3 billion transistors, deserve to do all the shading, and I am really against the ambient occlusion technology, as it really is too inefficient; processors can be used for a much better purpose than shading.
User avatar
TIhIsmc L Griot
 
Posts: 3405
Joined: Fri Aug 03, 2007 6:59 pm

Post » Wed Jan 07, 2009 9:30 am

[quote]SLI will give you micro lags.

And I don't think we are going too high in specs.[/quote]
[quote]You flagged this comment. The gamesas team will review it as soon as possible.[/quote]


bigboulder, I didn't want to flag your comment; I didn't do it on purpose, it was just a lag in my hand.

SLI will give you lag only if the motherboard doesn't support the GPU or the video card's memory. This happens when the GPU is doing its job faster than the CPU, which is highly improbable in the configuration I mentioned before. And yes, one guy mentioned very high specs, even things that don't exist...
User avatar
Stephani Silva
 
Posts: 3372
Joined: Wed Jan 17, 2007 10:11 pm

Post » Wed Jan 07, 2009 6:35 pm

I have:
Core 2 Quad Q9550
4 GB OCZ Gold RAM (dual-channel)
GTX 285 1 GB XFX
Windows 7 64-bit
How many FPS will I get?
User avatar
Adam Kriner
 
Posts: 3448
Joined: Mon Aug 06, 2007 2:30 am

Post » Wed Jan 07, 2009 5:05 pm

I think you are fine... maybe with another GTX 285 you would be perfectly fine...
User avatar
Raymond J. Ramirez
 
Posts: 3390
Joined: Sun Oct 14, 2007 8:28 am
