I was talking with a friend a while back who said he'd played a game with updated graphics but was disappointed by its other qualities. Specifically, since the developers had put so much effort into making the game look good, he expected the gameplay to be fun as well. That got me thinking. Long-time readers of this blog will know that I rank creativity and polish in gameplay and design much higher than graphics. But could it be that developers are actually shooting themselves in the foot by focusing so intently on making their games look shiny?
Let’s back up a second. Graphics certainly have their role to play, and on their own, better graphics do produce a positive effect. They quickly convey the environment and action of a situation to the player, and they contribute (if not in the most important way) to the immersion of the experience. But by making a game that looks fantastic, developers are saying to some extent, whether they mean to or not, “We have this amazing experience that we want to show you in the most detailed way!” If the experience doesn’t hold up to that built-in hype, the graphics become a fresh coat of paint on an outdated car, and the whole thing can feel more like a letdown than a decent game with the bonus of nice visuals.
That explains part of the appeal of independent games (the best ones, at least). By and large, they’re not all that concerned with having the greatest graphics (they often lack the resources for AAA-level art anyway), and instead focus on delivering a compelling and/or unique gameplay experience.
I’d love to get some discussion going about this topic. Do you think developers are actually hurting themselves by prioritizing shiny graphics over better gameplay mechanics and design? Do you think better graphics are always a plus? Leave a comment!
Lag has been around since the earliest days of online gaming, and there’s not much that can be done to remove it entirely. It’s simply a fact of life that, with current technology, it takes time to send data from one place to another, so someone is always going to be a little behind. In the past, lag was just accepted and played through, but more recent technology attempts to mitigate the problem with prediction and other forms of compensation. This lag compensation has some nice effects, letting players move smoothly and without nasty skipping. Unfortunately, these techniques are not without their price.
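To make the idea concrete, here’s a minimal sketch (in Python, with invented numbers and function names — this isn’t any particular engine’s code) of one common smoothing technique, entity interpolation: the client deliberately renders other players a fraction of a second in the past, so it always has two known position snapshots to blend between. Movement looks smooth, but everyone you see is slightly behind where they really are.

```python
import bisect

def interpolate_position(snapshots, render_time):
    """Interpolated x-position of a remote player at render_time.

    snapshots: list of (timestamp, x) pairs sorted by timestamp, as
    received from the server.  render_time is the current time minus a
    small interpolation delay, so it normally falls between two
    snapshots we already have.
    """
    times = [t for t, _ in snapshots]
    i = bisect.bisect_right(times, render_time)
    if i == 0:
        return snapshots[0][1]      # before the first snapshot
    if i == len(snapshots):
        return snapshots[-1][1]     # no newer snapshot yet: hold position
    (t0, x0), (t1, x1) = snapshots[i - 1], snapshots[i]
    alpha = (render_time - t0) / (t1 - t0)
    return x0 + alpha * (x1 - x0)

# Snapshots arrive every 50 ms; we render 75 ms into the buffered past.
snaps = [(0.00, 0.0), (0.05, 1.0), (0.10, 2.0), (0.15, 3.0)]
print(interpolate_position(snaps, 0.075))  # roughly 1.5, halfway between snapshots
```

The price is visible right in the sketch: the rendered position is always a little stale, which is exactly the delay that sets up the scenario described next.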
While the immediate effect is that everyone’s movement appears smooth, the lag still exists. As a result, weird things seem to happen, and the player doesn’t even realize that lag is the cause. The classic example is the dreaded “killed around the corner” scenario, typically seen in shooters. The player runs around a corner to take cover from enemy fire and dies anyway, after he was apparently out of harm’s way. Little does he know that he was hit before he made it to safety; he just didn’t find out until a little more seemingly smooth game time had passed. These inexplicable disruptions of the normal game flow can become genuinely frustrating, precisely because their root cause is hidden by the prediction and compensation.
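The “killed around the corner” effect falls straight out of one widely used server-side technique, often called lag compensation or rewinding: when a shot arrives, the server turns back the clock and checks the hit against where the target was when the shooter fired, not where the target is now. A hedged Python sketch (invented numbers and names, one spatial dimension for simplicity):

```python
def rewind_hit_check(history, shooter_latency, shot_time, shot_x, hit_radius=0.5):
    """Check a shot against the target's REWOUND position.

    history: list of (timestamp, x) positions the server recorded for
    the target.  The server evaluates the shot at
    shot_time - shooter_latency: the moment the shooter actually saw
    the target on his own screen.
    """
    lagged_time = shot_time - shooter_latency
    # Use the recorded position closest to the lagged timestamp.
    past_t, past_x = min(history, key=lambda s: abs(s[0] - lagged_time))
    return abs(past_x - shot_x) <= hit_radius

# The target sprints from the open (x=0.0) to behind a corner (x=5.0).
history = [(0.00, 0.0), (0.05, 1.0), (0.10, 3.0), (0.15, 5.0)]
# At t=0.15 the target is safely behind cover at x=5.0, but a shooter
# with 100 ms of latency fires at x=1.0 -- where the target was at t=0.05.
print(rewind_hit_check(history, shooter_latency=0.10, shot_time=0.15, shot_x=1.0))
# -> True: the rewound check registers a hit, "killed around the corner".
```

The design choice here is deliberate: rewinding makes aiming feel fair for the shooter (he hits what he sees), and the cost is shifted onto the target, who occasionally dies after reaching cover.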
I suppose what we can learn from this is that, in lower-latency situations, anti-lag is a useful technique for smoothing out the jitters. Unfortunately, as lag grows more severe, these same techniques can make its effects more aggravating, not less. Perhaps an alternative technique could be employed in higher-lag situations? Maybe games should put heavier focus on matching people with nearby players? Or perhaps this is just another argument for dedicated servers? I’d like to get your take and start some discussion on the problem and its potential solutions. Let us know what you think in the comments!
Or did work become a game? While World of Warcraft is not the only game guilty of this, it is high among the reasons I stopped playing that particular game. A significant portion of my game time eventually became dominated by repetitive actions that were not enjoyable in themselves but promised a contribution to future activities. Why not just make the whole game fun and fresh? I understand the design reasons behind these systems, such as sustaining an economy or producing new usable items and weapons. Still, farming cloth or metal, or grinding, does not make for a compelling gameplay experience. There must be a more mentally rewarding way to provide the player with these resources. The word “grinding” now holds a negative connotation for me, because it represents playing through the same portion of the game time and time again. Sure, a game should have some replay value, but that’s a separate issue.
I should mention that this shows up in other genres too, though it’s most common in role-playing games. RPGs and grinding are the obvious example of the phenomenon, but the genre isn’t bound to that fate. Mass Effect 2 is an excellent example of a role-playing game that avoids this behavior almost entirely: yes, you can scan planets for resources, but it doesn’t take much time, and you don’t really have to if you don’t want to.
As always, I’m interested to hear different viewpoints on this topic, and your comments are most welcome.