» Sun Nov 16, 2014 7:05 am
Agreed. Often the fanbase has no clue what they want, or the developer screws up the fanbase's request.
I know one online game where there was a certain mission that could be set up so everyone would get incredible amounts of rewards for almost no effort. This caused a lot of controversy on that game's forums, with half the players demanding the rewards be nerfed and the other half demanding that everything stay as it was. The result? The developers did nothing to that particular mission, but instead they nerfed all missions of its type, and then started nerfing several non-overpowered abilities of some character classes.
The next day: people piled on to that game's forums (again) and demanded that the nerfing be rolled back. Most of it was.
Another good example: Might and Magic 8 (MM8) and the last expansion of Heroes of Might and Magic 3 (HoMM3). Might and Magic 7 had two endings: a "good" ending that resulted in building a gate to another planet, and an "evil" ending that resulted in swarms of undead with sci-fi technology taking over large parts of the world. The developer team decided to make the "evil" ending canon (the ending MM8 would be based on), and planned for the last expansion of HoMM3 (a strategy spinoff with its lore tied to the main series) to include a sci-fi undead faction.
The result? The fans revolted, and the developers were forced to toss their ideas for the HoMM3 expansion and MM8 into the garbage bin. Instead of something original, the last HoMM3 expansion featured a bland elemental-themed Conflux castle type, and MM8's storyline was likewise reduced to "the mad elementals are causing natural disasters and you have to stop them". In short? The fans didn't want an original storyline, so they got something bland and ended up running the series into the ground. MM8 had mediocre sales, and its dev team was removed from the Might and Magic franchise. If I remember properly, MM9 was built by a largely new team that had never shipped a video game before, while the publisher (3DO) was sliding toward the bankruptcy it filed for not long after MM9's release. As a result, MM9 had so many glitches it was effectively unplayable. Once 3DO went under, the Might and Magic rights were bought up by Ubisoft.
------------------------
As for my opinion: What would I like to see from modern developers/publishers in terms of customer support?
Definitely better testing. It seems like the current industry standard is to release a broken product and then patch it later.
Does anyone remember the early days of Skyrim, right after launch, when a completely broken patch reportedly made dragons fly backwards and caused a myriad of other problems? To add insult to injury, that patch was flagged as mandatory, which meant Skyrim wouldn't launch until you installed it. As a result, players were stuck with a broken game until the next patch came out.
One of my friends is a big World of Warcraft fan and pre-ordered the Warlords of Draenor expansion that launched just a few days ago. He's been telling me that for 2 days after the expansion launched, the servers were having serious lag and stability issues, and they had to be configured for a lower maximum number of players. This caused login queues and much frustration.
But it isn't limited to "AAA" games either. There are plenty of indie developers that sell their games while they are in beta or even alpha (and there are plenty of people dumb enough to pay for a game before it's properly finished).
IMO the main reason for the lousy quality of most games at launch is that modern technology has made it very easy to get away with sloppy programming practices.
In the 1990s:
-CPU power, RAM, and disk space were very limited. This meant you either learned to optimize (see the little C sketch after this list - some developers went as far as writing critical functions in assembly), accepted a lower framerate, or shipped a product that ran like crap (if it ran at all). If you took a lot of today's programmers and sent them back in a time machine to the 1990s, my guess is that a good number of them wouldn't make it in video game programming.
-Most people did not have dedicated 3D video cards. Some games even shipped with both a hardware-accelerated renderer and a software (CPU-only) renderer you could pick in the options. Writing a decent software renderer takes real skill.
-Good computers were expensive. Not optimizing = losing out on players who couldn't afford a top-of-the-line computer.
-Internet access wasn't widespread, and most home internet connections were 56K dialup. This meant that you absolutely had to test your product thoroughly... because if your product had serious problems, you had to mail out patches on floppy disks or CDs.
-There was no Steam, Origin, or other online content delivery/management system. Software was packaged as a product and bound to its physical media (instead of the "software-as-a-service" model) - for example, most games in the late 1990s shipped on CDs. If you bought a game and it turned out to be crap, you could uninstall it and resell the CD. That put extra pressure on quality - developers that made a lousy product would quickly find their prices undercut by the secondhand market.
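To make the optimization point concrete, here's a toy sketch in C (completely made up by me, not taken from any real game): the classic trick of comparing squared distances instead of calling sqrt() for every range check. On a 1990s CPU, one square root per object per frame added up fast.

/* Toy example (mine, not from any real game): range checks the slow way
   and the fast way. On a 1990s CPU, calling sqrt() for every object on
   every frame was a real cost. */
#include <math.h>

typedef struct { float x, y; } Vec2;

/* Readable but wasteful: one sqrt() per check. */
int in_range_naive(Vec2 a, Vec2 b, float range)
{
    float dx = a.x - b.x;
    float dy = a.y - b.y;
    return sqrtf(dx * dx + dy * dy) <= range;
}

/* Same result with no sqrt(): compare squared distances instead. */
int in_range_fast(Vec2 a, Vec2 b, float range)
{
    float dx = a.x - b.x;
    float dy = a.y - b.y;
    return dx * dx + dy * dy <= range * range;
}

On today's hardware you'd never notice the difference for a few hundred objects, which is exactly the problem: nobody has to care anymore.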
These days:
-CPU power, RAM, and disk space keep increasing. Intel CPUs just keep getting better, 16 GB of RAM is easily affordable for a personal computer, and you can easily get terabyte disks for laptops and desktops. Solid-state storage is another (more expensive) alternative, and a 512 GB drive now goes for a reasonable price. The result? No need to optimize: programmers are expensive and hardware is cheap, so just write everything inefficiently (see the snippet at the end of this post) and the end-user's computer should be able to handle it.
-Fast Internet access is almost ubiquitous in developed/Western nations. While it's a very nice convenience to have, it also means that developers can make a shoddy product riddled with glitches and then fix it later.
-A lot of games these days are bound to online service accounts. Often, purchases are non-refundable, so there is no developer/publisher accountability for a low-quality game.
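And to illustrate what I meant above by "just write everything inefficiently", here's another made-up C sketch (again my own toy example, not anyone's actual engine code) showing the kind of per-frame wastefulness that modern hardware happily forgives:

/* Another made-up example: sloppy vs. frugal per-frame scratch memory. */
#include <stdlib.h>
#include <string.h>

#define SCRATCH_BYTES (1 << 20)  /* 1 MB of per-frame scratch space */

/* Lazy version: hits the allocator every single frame. It "works",
   it's just wasteful. */
void update_frame_lazy(void)
{
    unsigned char *scratch = malloc(SCRATCH_BYTES);
    if (!scratch)
        return;
    memset(scratch, 0, SCRATCH_BYTES);
    /* ... per-frame work using scratch ... */
    free(scratch);
}

/* Frugal version: allocate once, reuse forever - the way you'd have
   done it when memory and cycles were scarce. */
static unsigned char g_scratch[SCRATCH_BYTES];

void update_frame_frugal(void)
{
    memset(g_scratch, 0, sizeof g_scratch);
    /* ... per-frame work using g_scratch ... */
}

The lazy version hammers the allocator every frame and nobody notices, because the machine is fast enough to hide it. That, in a nutshell, is why optimization has gone out of fashion.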