PC gaming becoming too expensive

Post » Mon Nov 30, 2015 3:18 am

I just would like to show you guys how we are being ripped off by big companies such as NVIDIA and ATI.

I just bought the new Assassin's Creed game, and what I saw just blew my mind.

[url=http://imgur.com/FxgrQpV]http://imgur.com/FxgrQpV[/url]

So basically, to run the game smoothly at 1920x1080 with the highest graphics you need an 8GB graphics card.

My GPU should in fact show double what it shows there; I don't know why it only reports 2GB, but nvm.

You can get some ATI cards with 8GB for around $350, or if you wish, fork out a whole $1,000 for a Titan X with 12GB.

I wouldn't mind spending some money on a TITAN X, but the card will become useless within the next 2 years, as more resource-demanding games come out.

At the same time, for a quarter of the TITAN X's price you can get an Xbox One or PS4: not just a component but an ENTIRE machine able to handle the newest Assassin's Creed and the games of the next couple of years.

The bottom line is, both ATI and NVIDIA have become greedy, bloodthirsty !@#$%^-s that have probably ALREADY developed a 128GB graphics card, but are willing to milk PC users to the limit. I see no reason whatsoever why GPUs have to be so damn expensive. In most cases people want them for gaming anyway. The greed is legendary.
What are your thoughts, guys?
User avatar
Sylvia Luciani
 
Posts: 3380
Joined: Sun Feb 11, 2007 2:31 am

Post » Mon Nov 30, 2015 12:15 am

Except it's not just NVIDIA and ATI. It's all computer companies, console companies, component makers, etc. Look how they were putting out new consoles every year or two, forcing you to buy another and another to "upgrade" to the new games. The game designers build games based on the highest tech available and what is coming, as computer companies put out more and better hardware every 6 months. The only way to stop it? Don't buy it. Don't upgrade to play it. And force these companies to slow down again.

User avatar
Lifee Mccaslin
 
Posts: 3369
Joined: Fri Jun 01, 2007 1:03 am

Post » Sun Nov 29, 2015 10:02 pm

My thoughts are that you don't need to run at maximum possible ingame settings...

I run with the settings my hardware can manage. That hardware was selected based on what I wanted and could afford to spend. Maybe when I upgrade, I'll revisit the game and then play at higher settings. :)

User avatar
Kayla Bee
 
Posts: 3349
Joined: Fri Aug 24, 2007 5:34 pm

Post » Mon Nov 30, 2015 8:25 am

It's only expensive if you want to play at the highest resolution. If not, then a $100 card will outperform the PS4 and XB1 right now, assuming you already have an OK processor. Cutting-edge video cards have always been expensive at release. This is not new. Back when 3DFX was around, their Voodoo card was around $300 at release, and we are talking nearly 20 years ago. About 10 years ago, a Quadro was around $1k as well.

A frugal gamer never buys cutting edge. You always get the mainstream model (the best price-to-performance model) because the difference in performance is not worth the cost, and in two years the mid-range model will blow away the cutting-edge model anyway. I actually wait about 3-4 years per upgrade and have not really missed the high-end graphics. The games are fine on medium and just as enjoyable. If you have money to burn, go for it and buy whatever you like. Otherwise, save that money and invest in some stocks or something.

User avatar
Charleigh Anderson
 
Posts: 3398
Joined: Fri Feb 02, 2007 5:17 am

Post » Sun Nov 29, 2015 11:22 pm

Unfortunately, this is how it has always been, except that when consoles really got popular, it helped keep video game requirements in check, so we could feasibly build around the 5-to-7-year life span of a console. Which is advantageous for a PC gamer: go in a little more on parts, knowing the game may look better than it does for his console brother.

User avatar
Aman Bhattal
 
Posts: 3424
Joined: Sun Dec 17, 2006 12:01 am

Post » Sun Nov 29, 2015 9:36 pm

I don't know.. How often do people need to upgrade consoles? Every 2-3 years? I've been upgrading graphics cards every 3-5 years. I think in the long run owning a Gaming PC may be cheaper than buying consoles.

User avatar
Betsy Humpledink
 
Posts: 3443
Joined: Wed Jun 28, 2006 11:56 am

Post » Mon Nov 30, 2015 2:51 am

wth does this topic have to do with Fallout 4? Oo

User avatar
Charlotte Lloyd-Jones
 
Posts: 3345
Joined: Fri Jun 30, 2006 4:53 pm

Post » Mon Nov 30, 2015 10:00 am

Yeah, this should probably be in: http://www.gamesas.com/forum/18-community-discussion/

User avatar
Cartoon
 
Posts: 3350
Joined: Mon Jun 25, 2007 4:31 pm

Post » Mon Nov 30, 2015 1:43 am

Eh, what? More like every 5-8-10 years, when a new gen launches. The point of consoles is that the hardware remains the same.

(buying another console of the same gen - i.e, an X1 when you already have a PS4 - isn't upgrading.)

And this topic is an interesting contrast to the threads that kept popping up during the PS3/X360 gen... "AAAA! Consoles are holding back PC games, haven't had to upgrade/push my hardware! This is terrible!" Of course, these are two different groups of people.... the ones who push the envelope with $500+ GPUs, watercooling, overclocking, etc, want games to get ever tougher to run well; the people with budgets who make more average computers are happy to have game reqs not climb too quickly. :tongue:

(personally, I'm more in the second group. Most expensive GPU I've ever bought was $175. And I was thrilled that the PS3 gen let me run games for years & years on an ATI HD2600. But then, I've never expected to run games at Ultra, let alone Ultra/60fps :shrug:)

User avatar
Juan Cerda
 
Posts: 3426
Joined: Thu Jul 12, 2007 8:49 pm

Post » Mon Nov 30, 2015 6:46 am

You have to upgrade your graphics card around every 3-5 years, depending on how powerful a card you bought.

Let's say you upgraded your PC to run 360-era games like Oblivion. Skyrim is the same generation, but it will be far more optimized on consoles, while the PC version gets additional features.

We are still early in the current generation, so the same will happen again; now add that 4K screens are starting to become common.

User avatar
Music Show
 
Posts: 3512
Joined: Sun Sep 09, 2007 10:53 am

Post » Sun Nov 29, 2015 11:49 pm

2-3 years? No, more like 5-6... minimum. And that's only if you buy the new console at release when your previous console purchase was about 14-20 months after its release.

Some have thought console gens have been getting shorter, when actually, if you look at the past 4 generations for Sony, they've been about the same length as or longer than their predecessors.

PS1 to PS2 - about 4.5 years.

PS2 to PS3 - about 6.5 years. (could have been longer, but Microsoft rushed out the 360 and forced Sony's hand)

PS3 to PS4 - *almost* 7 years, 2 days short to be precise (and this time, Sony was the one that forced the hand of Microsoft)

Now rumors are that this generation will be shorter than the last. However, I've heard that rumor now for about 15 years. So take it with a grain of salt.

For example, I did not buy a PS3 or a PS4 at launch. I bought my PS3 in Feb of 2008 and my PS4 in Nov of 2014. Both were new consoles, but not new in terms of that console's launch date. 6.5 years between purchases (and the PS4 purchase wasn't even my idea; it was my wife collaborating with my parents and her parents to go in on one big Birth-Mas gift... my birthday is Dec 15th).


I used to be in the first group in the '90s and early 2000s... then I moved to the second group, and by 2010 I was pretty much wrapping up my time on PC. Since I left competitive gaming a couple of years ago, I've been wanting to do a rig again and build it to try to keep it future-proof for several years. Console is just so damn easy, though.

User avatar
Benito Martinez
 
Posts: 3470
Joined: Thu Aug 30, 2007 6:33 am

Post » Mon Nov 30, 2015 7:46 am

Eh, upgrading consoles takes however long it takes for the new console to come out, which in my case was 4 playstation consoles in 17 years.

Also, OP, you mention how expensive a card it takes to run the game on max settings, but then you bring up consoles? Seriously? Console versions often run at barely medium graphical settings, and usually at 30 fps, and not even consistently. So no, you don't need the latest Titan X card; just tone the graphics options down, and you have the same thing. ;)

User avatar
Paul Rice
 
Posts: 3430
Joined: Thu Jun 14, 2007 11:51 am

Post » Mon Nov 30, 2015 8:31 am

I don't see your point. They only evolve as fast as they get funding to hire researchers, and that funding comes from you, the consumer. Oh yeah, let's just all not buy stuff and let technology go stale. NO, the answer is raising minimum thresholds on earnings and higher thresholds on high-earning jobs to help everyone afford the tech and develop it faster.

Don't contemplate stagnating the industry that is almost solely responsible for the fastest evolution of us as a species. If you want a cause, go stop people spending their money at chain coffee shops and fast food. Cooking is good for the soul, and it doesn't starve your wallet.

User avatar
ILy- Forver
 
Posts: 3459
Joined: Sun Feb 04, 2007 3:18 am

Post » Sun Nov 29, 2015 8:01 pm

No offense meant but your post is slanted and inaccurate in so many ways.

My most recent video gaming laptop has Intel Integrated graphics and an overpowered CPU. I just turn all the settings down to minimum and it's good enough for any game that isn't written like completely shoddy inefficient crap.

I've had a bad experience with video cards in general, particularly nVidia ones. In the last 8 years, I've had one nVidia desktop card burn out, and several laptop nVidia chips burn out (which meant that I had to send my laptop in for a new logic board!!!).


You shouldn't be using Assassin's Creed as a GPU performance baseline on a PC. Ubi$oft (and the Assassin's Creed franchise in particular) is notorious for treating PC gamers like crap, locking framerates down to 30 fps, and doing shoddy console ports to PC with insane and unreasonable system spec requirements. There's nothing wrong with your computer; Ubisoft just fails hard.

You're getting ripped off there. You can buy a good gaming desktop for well under $1000.

That's the "AAA" video game industry's fault, not yours or ATI's. I usually just put all the video settings on minimal and use an underpowered computer and it works fine.

If you absolutely insist on Assassin's Creed, then just get a console. Ubi$oft hates PC gamers anyway.

128GB graphics card? Lolwhut? I didn't even know those existed. Also they aren't being "greedy" - almost all of the time, a midrange video card is good enough.

tl;dr: you're ripping yourself off on the latest video cards and using a nonsensical baseline for GPU performance.
User avatar
Riky Carrasco
 
Posts: 3429
Joined: Tue Nov 06, 2007 12:17 am

Post » Mon Nov 30, 2015 10:00 am

It sounds to me like they're trying to "future-proof" the game by giving it some high-end graphics settings for equipment not yet in common use. Is there something wrong with the way the game looks at a lower setting?

I don't understand the annoyance. If the game is playable and looks good, why worry about whether somebody else can run it with higher graphics settings?

Or if it's really important to you, bite the bullet and buy the new card. (I wouldn't. And I speak as someone who just bought a new graphics card, replacing one that was five years old, and that really wasn't doing the job.)

User avatar
latrina
 
Posts: 3440
Joined: Mon Aug 20, 2007 4:31 pm

Post » Mon Nov 30, 2015 11:28 am

I've been PC gaming for 20 years and console gaming for about the same. I've gotten by with playing games on ultra a few years after they are released, so my mid-range machine could play them the way they were meant to be played. I've never been forced, not once, to buy a fancy video card.

What's this about having to buy a console every 2 to 3 years? I've never had to do that. I always get a modern console, and it lasts a lot longer than two years. And that's with any of the companies.
User avatar
Joey Bel
 
Posts: 3487
Joined: Sun Jan 07, 2007 9:44 am

Post » Sun Nov 29, 2015 11:14 pm

It sounds like what you are really saying is that at this stage of consumer hardware progress, running games at the highest possible settings and 60fps is more demanding than it was a couple years ago. I'd agree, but it's partly because a few years ago, most games were designed to run on 2005 hardware.

If you already need a decent computer for school or work projects, it's not much more expensive to get something with a mid-range graphics card that can handle new games reasonably well (not 60fps maximum settings). And if you don't already need a laptop or desktop, of course, it will be much cheaper to buy a console.

User avatar
P PoLlo
 
Posts: 3408
Joined: Wed Oct 31, 2007 10:05 am

Post » Sun Nov 29, 2015 9:22 pm

Logic board? So you had a MacBook? (Apple calls their motherboards/mainboards logic boards.) Did it happen to have an Nvidia 7000 series integrated GPU? Those were so notorious for problems that the incident actually got a name, "bumpgate", due to the GPUs swelling like a bump. Apple, HP, and some other companies sued Nvidia over the bumpgate incident.

User avatar
Mélida Brunet
 
Posts: 3440
Joined: Thu Mar 29, 2007 2:45 am

Post » Mon Nov 30, 2015 3:36 am

PCs really are not that expensive, unless you want to play the latest PC versions of video games with all of their graphical settings turned all the way up to Ultra.

An $800 (USD) PC can play most PC versions of video games at Medium to High graphical settings.

Then we've got sales where PC versions of video games end up being sold below $60 (USD) on gog.com and on Steam, usually about 3 months or so after they go on sale.

User avatar
Causon-Chambers
 
Posts: 3503
Joined: Sun Oct 15, 2006 11:47 pm

Post » Mon Nov 30, 2015 12:46 am

PC gaming has always been on the expensive side vs. consoles, but it is pretty comparable nowadays. High-end PC gaming of yesteryear was magnitudes more expensive than consoles at the time (the early '90s for sure).

User avatar
Nice one
 
Posts: 3473
Joined: Thu Jun 21, 2007 5:30 am

Post » Sun Nov 29, 2015 9:44 pm

Yes, and it's a mistake to just compare the cost of the hardware.

My GoG.com library stands at over 200 titles now. Many of those games, if I bought them for console, would have been near full price, while I almost always buy a couple of years (at least) after release, and when they are on sale. I typically pay in the neighborhood of $2-$5 US for a game that originally sold 10x to 15x higher.

User avatar
D LOpez
 
Posts: 3434
Joined: Sat Aug 25, 2007 12:30 pm

Post » Mon Nov 30, 2015 9:11 am

exactly what Ballowers said

Give me two weeks to look for sales and rebates, and I can build a rig that will play most current titles for not far off from the number Ballowers gave... and you don't need to constantly upgrade it either.

I built my cousin's current rig for him back in late 2010; IIRC the parts I told him to order set him back roughly $900-$950 after rebates. He was able to play most newer titles up until Witcher 3 launched. The only upgrades he had me do over that time were throwing in another 2GB of RAM and replacing his 1TB HDD with a 2TB one. Sure, he may not have been able to play everything on "Super-realism-Ultra-High", but that in no way took away from his enjoyment of games.


You don't always need settings to be at the highest they can possibly be. A game is still extremely enjoyable at just high or medium settings. If the amount of enjoyment you get from a game comes down to how shiny the protagonist's hair is, and not the writing or gameplay, well, that's just sad to be honest.

User avatar
Lauren Dale
 
Posts: 3491
Joined: Tue Jul 04, 2006 8:57 am

Post » Mon Nov 30, 2015 12:30 am

The PC version of Fallout: New Vegas sold for $50 (USD) in 2010, and I managed to purchase it from Steam in 2011 during the December Steam Holiday Sale for $5 (USD).

So the PC version of Fallout: New Vegas sold for $5 (USD) just one year later. If you factor in not upgrading your PC for 5, maybe 6 years, PC versions of video games selling way below $60 (USD), and not having to pay for online play like on Xbox 360, PlayStation 4 (PS4), and Xbox One, PCs really are not expensive at all.

User avatar
NO suckers In Here
 
Posts: 3449
Joined: Thu Jul 13, 2006 2:05 am

Post » Sun Nov 29, 2015 10:50 pm

very true :)


jw though, Ballowers, where are you grabbing your numbers? It's not that I don't believe you (because I do), I just thought New Vegas was $60 for some reason :P

User avatar
Lucie H
 
Posts: 3276
Joined: Tue Mar 13, 2007 11:46 pm

Post » Sun Nov 29, 2015 10:30 pm

Hmm. I purchased the physical boxed PC version of Fallout: New Vegas in 2010 from GameStop, but if you remember, I said many times I was upset that the back of the box said Steam is required. So I never opened it, I never returned it for a full refund, and I've kept it in the closed box since then.

I need to get into storage and look in the box to see how much the physical boxed version of Fallout: New Vegas cost in 2010. It still has the price sticker on it.

User avatar
Nicola
 
Posts: 3365
Joined: Wed Jul 19, 2006 7:57 am

Next

Return to Fallout 4