Issue #2 - Quality and Graphics

A new generation of gamers is coming, and their focus is shifting yet again. When did this start? Where will it end? With the widespread use of the Internet to “review” games, anyone out there can publish their opinion. And often that opinion is not only biased, but also focused on entirely the wrong things.
State-of-the-art graphics have become a requirement for any game on the market, and long-established games are expected to upgrade their graphics engines to match. But what are we really talking about here? Graphics have become a measure of quality, but can a game’s quality really be based on its graphics?
What do we define as “quality” when speaking of games?
“Quality”, first and foremost, is a term we use to describe whether or not something is “good”. But what exactly does that encompass? And how do we determine what is good and what is not?
For the most part, quality is a matter of taste. Our brain weighs impressions and experience to tell us whether or not we “like” something. This means, first of all, that every single person out there has their own definition of quality and their own impressions. With that said, we need to accept one thing: quality cannot be defined universally. We can come to a general consensus. We can have the majority decide on an average quality “score”. But we cannot objectively declare the quality of anything.
Nonetheless there appears to be such a general consensus, based on what the loudest group says. A 15-year-old YouTube reviewer with a graphics-focused definition of quality and a mass of subscribers will always have more impact than a games veteran on a small review site who still believes that mechanics “make a game”.
By this point it should be obvious: many people out there don’t come to their own conclusions about the quality of games, they merely adopt the one they have been given. And that is the opinion of the loudest group.
Why the hate on “graphics”? They are an important factor nowadays.
They shouldn’t be the main factor, though. And that is exactly what we see “nowadays”: Games getting judged based on their graphics. Sure, eye candy is fine, and always a good addition, but it should not be the core.
To make this point fully clear, we need to take a small detour and first discuss tech demos. New technology, such as graphics hardware or a new engine, is usually presented alongside a basic program that shows off a large share of the potential this technology holds. Such a program is a tech demo.
Current games do exactly the same, though. They make full use of state-of-the-art technology to cater to the aforementioned appetite for “eye candy”. At this point one has to ask whether such a game is still a game, or merely a tech demo. Why do we need the most recent technology to run a game that could easily run on dated technology?
Why do we produce games that rely on technology which isn’t established yet, especially in PC gaming? Consoles work with a simple system: generations. Games for the PlayStation 3 are sold to a customer base that already owns the required hardware, even on launch day. That is because there is only one setup per console, apart from minor revisions. PC gaming, on the other hand, has a wide variety of setups within the same “generation”. Heck, one can even emulate Windows on most other systems.
And still, most PC versions aim at a higher graphics tier than their console counterparts, while the share of customers whose hardware actually falls between the lowest and highest settings is far, far smaller than on a console.
This, my dear friends and fellows, is not how our industry should operate, and our definition of quality, shaped by the “next generation of gamers”, is as responsible for this as our own naïveté in heeding their call.
