I've heard some things about Nvidia and AMD...

Post » Wed Dec 02, 2015 5:46 am

I can confirm that a 980 Ti has no issue running single-GPU PhysX in Arkham City, Origins, Knight, Black Flag, or Madness Returns. Haven't tried The Witcher 3 yet.

Some of the GameWorks features are basically future-proofing: there may be no single GPU at a game's release that can max it out at 60fps with all features turned on.

Timara White
 
Posts: 3464
Joined: Mon Aug 27, 2007 7:39 am

Post » Wed Dec 02, 2015 6:13 pm

I can also confirm the 980 Ti is on a whole other tier compared to a 970, even an EVGA FTW 970. Your experience is with a $650-or-more GPU.

Some of those games I mentioned came out in 2011 or earlier.
Sarah Edmunds
 
Posts: 3461
Joined: Sat Jul 08, 2006 8:03 pm

Post » Wed Dec 02, 2015 5:58 pm

Not sure which article you're referring to - it may be one I've already seen. I did see http://arstechnica.co.uk/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/ which showed that in one particular game Nvidia saw little benefit in DX12 over DX11, but as they pointed out, that could have been because:

  1. It was just one game (and we have no information on how well the developers optimised that game to the particular Nvidia card the article authors were using);
  2. Nvidia architecture and drivers may be highly optimised for the more linear implementation of DX11, and so see less benefit from the switch to DX12.

Also, as AMD had already done work optimising their drivers for Mantle, I can imagine they were further along in getting optimised drivers for DX12 than Nvidia were.

Certainly in that article the Nvidia card only saw a reduction in framerate for DX12 in one test, where the article authors had downgraded the CPU by disabling two cores and hyperthreading - and, frankly, if you're going to mess around with a CPU like that I'd be surprised if you didn't get a few peculiar results.

I'd still maintain that, if a developer wants to get the best out of any GPU architecture, then having as thin an API as possible is the way to go (Mantle or DX12 for AMD, DX12 for Nvidia). However, getting closer to the hardware does mean the game developer has to do more of the optimisation themselves - and not all developers will be equally good at that.

I agree, no reason to get worried about what graphics hardware we have, whatever middleware or extra features Bethesda use for each brand. As a cross-platform game, I reckon Bethesda will have only minor bells and whistles varying from one GPU type to another.

Kayleigh Williams
 
Posts: 3397
Joined: Wed Aug 23, 2006 10:41 am

Post » Wed Dec 02, 2015 8:53 am

Yeah, that's why I said some of the Gameworks features are future-proofing. They are the icing and cherry on top when you come back to those games a few years down the road with a more powerful set-up. Many games famously (or infamously) can't be run on Ultra settings until years after release.

jason worrell
 
Posts: 3345
Joined: Sat May 19, 2007 12:26 am

Post » Wed Dec 02, 2015 3:28 am

It's not so much that the Nvidia drivers lack DX12 support; it's that the GPUs themselves lack hardware async compute, which is emulated in the drivers instead. As recent benchmarks have shown, proper async compute support delivers the biggest performance gains in DX12, and that's where GCN shines.
James Shaw
 
Posts: 3399
Joined: Sun Jul 08, 2007 11:23 pm

Post » Wed Dec 02, 2015 4:21 am

Havok has been the physics library for all previous Bethesda titles, so I doubt PhysX is used. That would be bad news, since I love those crazy explosions and giant smashes. Shadows were bad in Skyrim, so that would be my guess. Also, all the Skyrim models had the painted-doll hair look. It would be nice to see those issues fixed, as long as Nvidia doesn't require some kind of Maxwell-only tessellation that gimps it on AMD and older Nvidia cards.

Lifee Mccaslin
 
Posts: 3369
Joined: Fri Jun 01, 2007 1:03 am

Post » Wed Dec 02, 2015 11:29 am

Still going to upgrade to an R9 390. I'm more familiar with ATI and their models. I haven't had a GeForce since 2002 and I'm not going to figure out their odd model system.

Charlotte X
 
Posts: 3318
Joined: Thu Dec 07, 2006 2:53 am

Post » Wed Dec 02, 2015 8:14 am

I've used several AMD GPUs and several Nvidia GPUs, and every Nvidia GPU fried within 3 years while the AMD GPUs lasted 4+ years. Also, considering that Nvidia hasn't really cared about the price/performance segment for many years now, we end up with them pricing 128-bit cards at $250. They just look at AMD's performance and throttle their own lineup accordingly. At least Intel doesn't do things like that; you get your money's worth, maximum performance at a reasonable price. So I'm sticking with AMD for now, but apparently they aren't working closely with the developers I care about, like Creative Assembly - they failed big time with Attila TW this year. If they fail again with FO4, I might have to switch to Nvidia next time.

Jani Eayon
 
Posts: 3435
Joined: Sun Mar 25, 2007 12:19 pm

Post » Wed Dec 02, 2015 9:26 am

WCCFtech isn't exactly the most reputable site around. Take what you read there with a grain of salt, especially stuff such as this.

And if FO4 were going to be a GameWorks game, I think we would have heard official word about it by now.

Stephanie Nieves
 
Posts: 3407
Joined: Mon Apr 02, 2007 10:52 pm

Post » Wed Dec 02, 2015 9:38 am

Look at Piper's hair in https://youtu.be/kizzTSyvoLQ?t=25s, obviously hair physics are in Fallout 4 to some extent. PhysX support seems to be a real possibility as well.

Epul Kedah
 
Posts: 3545
Joined: Tue Oct 09, 2007 3:35 am

Post » Wed Dec 02, 2015 9:30 am

I haven't seen anything confirming the Gameworks rumors, or giving a better idea of how screwed AMD card owners are going to be. Still, even the rumors are enough to make me nervous, and I'm impatiently delaying ordering the GPU for the rig I'm building specifically to be able to play F4.

rolanda h
 
Posts: 3314
Joined: Tue Mar 27, 2007 9:09 pm

Post » Wed Dec 02, 2015 11:18 am

Why delay? I imagine you already have an idea of whether you really want an AMD or an Nvidia GPU. The same article that claims Fallout 4 is using GameWorks also said that Bethesda worked with AMD to do optimizations.

You can look up a list of which GameWorks features are likely to be in the game and decide whether those extra bells and whistles would be worth it for you. Either an AMD or an Nvidia card should run the rest of the game on Ultra settings if it's powerful enough.

Natalie Harvey
 
Posts: 3433
Joined: Fri Aug 18, 2006 12:15 pm

Post » Wed Dec 02, 2015 2:45 pm

My original part hunting had me doing a lot of comparisons, and I'd settled on the Radeon R9 380 as being generally a slightly better cost-to-performance ratio card than the GTX 960. However, if it turns out that F4 is going to look 20% better on Nvidia cards and/or AMD cards are going to run 20% worse with the settings on Ultra, then that value calculation goes out the window.
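For what it's worth, that value calculation can be sketched in a few lines. All prices and framerates below are made-up placeholder numbers for illustration, not real benchmarks:

```python
# Rough cost-per-fps comparison. Every number here is a hypothetical
# placeholder, not a real price or benchmark result.

def cost_per_fps(price_usd, avg_fps):
    """Dollars paid per average frame-per-second delivered."""
    return price_usd / avg_fps

cards = {
    "R9 380 (hypothetical)": (190, 52),   # (price in USD, avg fps)
    "GTX 960 (hypothetical)": (200, 50),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_fps(price, fps):.2f} per fps")

# If one card takes a 20% fps hit in a particular game, its
# effective value shifts accordingly:
penalized = cost_per_fps(190, 52 * 0.8)
print(f"R9 380 with a 20% fps penalty: ${penalized:.2f} per fps")
```

A card that wins on sticker-price value can easily lose once a vendor-specific performance penalty is factored in, which is exactly why the rumors matter to the purchase decision.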

Maya Maya
 
Posts: 3511
Joined: Wed Jul 05, 2006 7:35 pm

Post » Wed Dec 02, 2015 8:44 am

Yeah, I was split between the 960 and the 380 as well, and I decided to give the game a try first, because I'd feel terribly angry if AMD performed as badly as it did in Attila Total War.

Austin England
 
Posts: 3528
Joined: Thu Oct 11, 2007 7:16 pm

Post » Wed Dec 02, 2015 4:16 am

If Bethesda has done their due diligence, there shouldn't be much of a performance difference between AMD and Nvidia.

We can already tell they aren't using Hairworks. So that leaves a few possible Gameworks options open:

1) Tessellation

2) TXAA and HBAO+ (Most likely in my opinion)

3) Volumetric lights (Very possible)

4) Turbulence (Used for rolling smoke, water, etc.)

5) PhysX (Unlikely - though possible. Most Nvidia users wouldn't be able to turn this on.)

The ambient occlusion and anti-aliasing can be closely matched with AMD options. AMD cards are traditionally not as strong at tessellation.

Nuno Castro
 
Posts: 3414
Joined: Sat Oct 13, 2007 1:40 am

Post » Wed Dec 02, 2015 3:18 pm

You should just check reviews of the GPU you're interested in, and then decide. Fallout 4 is just another Bethesda game; it's not the end of the world.

Am I wrong??? :wink:

> how screwed AMD card owners are going to be. <

Why do you think that way???

Even if it's true, not all GameWorks features are locked out on AMD GPUs - and even so, you can just play without those.

There will be plenty of game details you can tweak within the options UI, so there is nothing you should worry about.

And I am sure there will be patches to come (after release), both from the Beth devs and from ATI as well. :wink:

If it's a fast GPU - like the R9 390X, for example - that's a mammoth GPU. You have nothing to be afraid of.

Also, you may or may not know this, but with the DirectX 12 API the majority of ATI GPUs will receive a huge performance boost.

And yes, I don't own an ATI GPU, but that doesn't mean anything... :wink: (some things are just as they are, and some not)

Tracey Duncan
 
Posts: 3299
Joined: Wed Apr 18, 2007 9:32 am

Post » Wed Dec 02, 2015 5:03 am

If others are anything like me, this Fallout game will last for 5-6 *years*, whereas other games you may end up shelving after a month. The Construction Kit really gives this game extra legs, so buying a card that works well with it matters. Hell, I bet a fair number of us are still playing New Vegas/Fallout 3 (or both, with TTW).

Jeff Turner
 
Posts: 3458
Joined: Tue Sep 04, 2007 5:35 pm

Post » Wed Dec 02, 2015 2:20 pm

Yeah, when I buy a new PC, I decide based on Total War and the TES/FO series. If those run well, I'm set - I can still play them 5-10 years later, unlike most games. A machine capable of playing the latest TW well is powerful enough to play anything else anyway. And Unreal Engine is built and optimized to work well with both companies' hardware, so I never have any problem with the overwhelming majority of other games.

~Amy~
 
Posts: 3478
Joined: Sat Aug 12, 2006 5:38 am

Post » Wed Dec 02, 2015 7:12 am

It should be pointed out that Mad Max is a GameWorks game and is one of the best-performing games in recent memory. The mere inclusion of GameWorks does not mean that a game will perform poorly.

Though, some specific GameWorks settings may cause issues, even on Nvidia hardware. Take HairWorks in TW3, for example. This will put a dent in your FPS, even if you're using a high end Nvidia card. Some others, though, are perfectly fine, even on AMD hardware. For example, I'm using HBAO+ with my 290X in TW3 w/ a mix of high and ultra settings and get a pretty damn solid 60 FPS in 1080p.

Tania Bunic
 
Posts: 3392
Joined: Sun Jun 18, 2006 9:26 am

Post » Wed Dec 02, 2015 4:58 am

http://wccftech.com/fallout-4-nvidia-gameworks/

Ron
 
Posts: 3408
Joined: Tue Jan 16, 2007 4:34 am

Post » Wed Dec 02, 2015 2:19 pm

Yep, Mad Max ran just fine on my GTX 460 and AMD 6300 on mostly medium settings.

Isaiah Burdeau
 
Posts: 3431
Joined: Mon Nov 26, 2007 9:58 am

Post » Wed Dec 02, 2015 5:35 pm

Just read this - I guess I'm good to go with the R9 390X. I searched YouTube for Witcher 3 videos to see what difference turning HairWorks on/off makes.

Here's the link for the GTX 980 (when he turns HairWorks off he gets around 50-60 fps):
https://www.youtube.com/watch?v=UUTsy3OeiwQ

And this is the R9 390X, also with HairWorks off, at around 40-50 fps:
https://www.youtube.com/watch?v=eUecIcrfPl0

To me it doesn't look like much of a difference. Nvidia GameWorks is just for people who want more lag (lol, I'm just kidding).

Sorry, my English is not good.

Smokey
 
Posts: 3378
Joined: Mon May 07, 2007 11:35 pm

Post » Wed Dec 02, 2015 2:38 am

This is my first time actually building a computer that's attempting to play anything at Ultra. I haven't had a machine that's even capable of running current-gen games at anything higher than bare minimum in years, so I'm not particularly up on all the ins and outs of how the tech actually shakes out in real performance instead of just comparing a bunch of benchmarks and spec sheets. So I suppose I'm a bit more susceptible to buying into the whole "Nvidia cards or bust" in regards to Gameworks (or in general, really).

Amy Gibson
 
Posts: 3540
Joined: Wed Oct 04, 2006 2:11 pm

Post » Wed Dec 02, 2015 5:08 pm

There have always been differences in performance between AMD and Nvidia from game to game, with some developers choosing to work with one or the other to provide optimizations for their respective hardware.

I don't mind that... optimizing I'm good with, but actively sabotaging performance for the competitor is crossing the line (if that is the case).

Jeffrey Lawson
 
Posts: 3485
Joined: Tue Oct 16, 2007 5:36 pm
