2015-02-23


Amid the gentle drizzle of livetweets coming out of Middlesbrough Uni's recent Animex festival, my attention was hooked by a presentation from Ustwo's Ken Wong and Peter Pashley titled "Rethinking Game Design for the Post-Gamer Era".

The Post-Gamer Era! It's a concept that's been nagging me for some time in a vague, elusive way, but seeing it phrased like that somehow snapped a lot of things into focus. I set about trying to find out more about the talk, and perhaps the most useful thing I uncovered was this video of Ken Wong's talk at DICE 2014, titled "Games Without Gamers":

If you don't want to watch the video, here's my takeaway: Game designers, who are usually also keen game players, have a bias towards designing games based on what they expect as players. But there is a much larger cohort of potential customers out there who have no interest in traditional games (emphasis on "traditional") and if you want to tap into that market then you must abandon your preconceptions of what players want. Drastically reduce the complexity of your controls and the difficulty of your gameplay, while emphasising narrative and aesthetics; focus on creating experiences, rather than challenges. This is the philosophy behind Monument Valley, and it led to commercial success without sacrificing artistic value.

Videogames are a collaborative experience co-authored by designers and players, working in partnership despite never meeting each other. But designers often assume that players' enjoyment flows exclusively from the interactive play itself, and try to drag the experience out by jamming in more content or making the game more difficult. The truth of the matter is that nobody cares how long and hard it is - what's important is that you help your partner reach a satisfying conclusion.



Two thoughts came to mind after listening to that talk.

Firstly: He's right. With the launch of the NES, videogames entered the living room in the manner of a harpoon entering a whale. But now, more than 30 years later, and in spite of the mainstream cultural acceptance of games in general, the majority of people are still put off by the wilful masochism of traditional videogames. There's a huge amount of commercial and cultural potential in exploring alternative game concepts. This is not news to industry observers.

But along with the celebration of acceptance and diversity, it does also create a wrinkle of frustration for some of us who grew up with traditional games: As more and more generations of people grow up surrounded by games, shouldn't the market for 'games for gamers' become stronger and more stable? Instead, we see games like Vanquish (a modern masterpiece of traditional game philosophy) slumping in the charts and struggling to hold publishers' attention, while games like Tiny Tower (without passing judgement, a challenge-free invest-and-express experience) threaten to drown their developers beneath a tidal wave of money.

Oh good Tetsuya Mizuguchi's new game is a mobile Match Three 'Em Up https://t.co/3LBA9Y753I

— Lewie Procter (@LewieP) February 20, 2015

Sometimes, seeing the industrial-scale shift towards 'games for non-gamers' feels like a kind of gentrification - scrubbing up the most marketable areas of game design for the benefit of outsiders with no appreciation of its history or culture. I think this is one of the reasons why certain members of the community react so violently to the idea that games should be more inclusive.

Although of course, this is bullshit. Instead of gentrification, a more appropriate metaphor would be that of pulling down derelict capsule hotels to build family homes - things will change, but only in the sense that they'll become more humane. Besides which, the "history and culture" of the games industry is (by and large) that of entrepreneurs making money, and the casual revolution is a continuation of this; the eagerness of developers to cater to non-gamers these days is due to democratic capitalist forces, not an MS Paint conspiracy.

Well, it's not unusual for people to misattribute democratic evolution to an intelligent designer when the outcomes contradict their personal expectations (and vice-versa).



Secondly: What ARE the best games to introduce non-gamers to traditional gaming? This is a question recently addressed by The Guardian's Keith Stuart, but skimming down his list I can't help but feel he's missed the point a little - it's a fine tasting menu of great games, sure, but I would never recommend a game like Civ 5 to someone who's never played a game before. Baby steps, man! Baby steps.

(Normally I'd make a list of my own suggestions at this point, but that's a whole other article in itself. I'll try and come up with something in the next few months, and hopefully I'll remember to come back and edit the link in here.)

My cousin came over and said "You need to teach me to play video games!" So obviously I loaded her into Alien Isolation in Oculus Rift.

— Liz England (@lizardengland) February 20, 2015

One of the biggest barriers for people coming into games from the 'outside' is that they don't understand what they are looking at, or how to control it. It's really a question of literacy and, like learning to read a book, it's something that can only be developed over time. It's difficult to think of great, traditional games that don't require at least a basic game literacy. So far in my life, I've managed to teach my parents to play Dr. Mario and Animal Crossing, and that's pretty much their whole repertoire. How much training and preparation would it take before a non-gamer could engage with a complex game like Metal Gear Solid 2? Even self-declared fans of the medium struggle to understand it.

The frustration of gamers seeing their hobby 'eroded' by casual gaming rests in part on the assumption that younger generations would grow up playing the kind of games we grew up with, and would develop the same literacy and appreciation that we have, if only they were fed a diet of brutally difficult boss fights and unforgiving save systems. But is this really rational? It would be impractical for gamers everywhere to force their kids to play through a reading list of games in chronological order just for the sake of teaching historical context, but if we assume that kids are going to grow up playing contemporary games, how can anyone be surprised when they develop different tastes?

If videogames have a worthwhile history and culture beyond mere capitalist consumerism - and I'd like to think they do - what are we doing to preserve that? What does preservation look like in this scenario, and how often do we see it? For one thing, I think it would mean more than just keeping old games playable via emulator.

Remember when Rez HD came out? Those were good times.

— PurpleChair (@Manpuncher) February 8, 2015

I tweeted that while sitting in a park one afternoon, gazing mournfully out to sea and thinking about the recent last-ever Chet & Jon's podcast (in which the hosts gushed about how the original Rez changed their perception of what games are).

Xbox Live Arcade used to be really good. As a public platform for indie games like Castle Crashers, Braid, Space Giraffe, and Super Meat Boy, and as a low-cost avenue for updated remakes of niche games like Ikaruga, OutRun 2, and the aforetweeted Rez, XBLA used to be a huge selling point for the 360. But it didn't start that way, and it didn't end that way either. When people reminisce about XBLA, they're usually talking about a specific period around 2008-2010. And that isn't just nostalgia talking - lest we forget, some of those games have since been removed from the service for one reason or another. If you weren't there, you weren't there, maaaan.

When the 360 first launched, XBLA releases were haphazard - not just in terms of quality (Geometry Wars being a notable early tentpole release that sold a lot of people on the Live Arcade concept) but in terms of release scheduling. Things changed over time. Once it was proven that you could make a lot of money with an XBLA hit, more developers (and publishers) wanted to use the platform. After a while, Microsoft were able to release a new game (or two) every week, and checking out the trial version of the latest game became a regular weekly event for many people.

But as more and more people wanted to release games through Live Arcade, the pressure began to flow the other way - instead of struggling to fill a weekly release schedule, Microsoft now had more games than they knew what to do with. During my first trip to GDC in 2011, I heard a lot of people talking about the problem of release slots on XBLA; basically, that the release schedule for XBLA was locked down months ahead of time, and (as per my understanding) big publishers like EA and Ubisoft could reserve multiple slots in advance and shuffle games between their slots as required, in a way that smaller developers (who only had one slot, for their one game) could not.

I think the subject even comes up in Indie Game: The Movie, when Team Meat's contact at Microsoft tells them that if they miss their agreed deadline then they'll have to wait months for another slot. Aside from whatever else you might say about that film, I think it's worth watching if you want to revisit the 'Golden Age' of XBLA from a developer's point of view.

But that was five years ago. Today, those indie developers squeezed out of XBLA's release schedule have been aggressively courted by Sony, while Microsoft are talking about the exciting potential of apps on the Xbox One. Apps! Harking back to the strangely game-lite console reveal two years ago, they're now adopting Apple's highly successful marketing language to sell general-purpose software (I can't be the only person who thinks "App" is more reminiscent of "Apple" than "Application", right?).

I'm not an Xbox One owner, but is anyone really looking for this in a games console?

Well, yes, probably! Not to nit-pick about consoles-vs-handhelds, but the most popular gaming devices by far these days are mobile phones and tablets. That's where most of the money is, and it's certainly where most of the users are, and under these crude terms it's having a huge effect on popular gaming culture - that's why Namco keep putting out terrible Free to Play versions of their most popular games, for example. Free to Play has become such a ubiquitous business model on the App Store that Apple recently introduced a special category for games where you simply buy the game and then play it, with no funny business. What a novel concept!

These economic shifts reverberate back up the production process to development studio culture. I read a piece by Tadhg Kelly recently which put this into perspective. It starts by discussing the recent drama surrounding Peter Molyneux, but it was the second half that really resonated with me:

"Despite having many years experience working in the industry, [Greg Wondra] was finding it impossible to secure a new post. He noted that many studios seemed to want cheap design specialists, like game designers who knew endless runner games inside out and were willing to work for less than $60k. That the market for game design had become more of a market for those who could develop content for existing games, for game economists, or people who essentially had technical skills but could also have an idea from time to time – and no family."

Speaking as one of those young game economists with no family, I know what he means, and it troubles me too. System analysis is part of my job, and when I look at the processes at work within the games industry, I'm left with doubts about my long-term career prospects. How long will it be before I find myself on the scrapheap, replaced by an unpaid data analyst intern? I think it's still some way away, but I'm already preparing fallback plans - contingency careers - for when the time comes.

The industry has changed. The industry is always changing - the world is always changing - but I think we turned a corner somewhere between Wii Sports and the App Store. Games have gone mainstream, but while we often think about that in terms of the kinds of games being played (Talk amongst yourselves: Which is more culturally significant - Dark Souls or Candy Crush Saga?), I don't see as much discussion about how it affects the industry's business culture, or the gamer community's relationship to wider society.

Ten years ago everyone was talking about how games magazines were being involuntarily euthanised by online blogs that provide content 'for free'. This process is still underway; it still saddens, but no longer surprises, anyone when publishers respond to shrinking profits by closing down magazines.

Today, we instead talk about whether written websites are being killed off by YouTubers, Twitch streamers and other video content. At this point in time, smart outlets are investing more and more in improving their video capabilities - consider Eurogamer's significant investment in their video team. Even here at Midnight Resistance, insulated though we are from market forces, we've all been buying new hardware and dabbling in video production ever since we launched the site.

That said, I'm not terribly interested in watching videos. I prefer to read. If the strength of the video format is that you can see the game in action, its weakness is that you can't control its pace as easily as you can with text - skimming lighter sections and then slowly chewing over complex analysis, as it suits you. But also, there are very few streamers and YouTubers catering to turtleneck-wearing pseuds like myself. There's no Newsnight Review for games. It says a lot when one of the most interesting videos I've seen was that time PewDiePie got to the end of The Last Of Us and just shut up for a moment, briefly knocked out of character by a sincere emotional reaction to the game. No jokes: I think his silence testifies to the quality of the game's writing. They should have made it a box quote.

Before I move on, this feels like an appropriate place to highlight a few alternative blogs that (in my opinion) do provide some good analysis of games, one way or another:

Critical Distance

Sufficiently Human

Unwinnable

Cane and Rinse (podcast)

Memory Insufficient

ZEAL

Hardcore Gaming 101

Another Castle (podcast; dead now, but I do love it)

The transition from print publishing to blogging to vlogging has been accompanied by distinct qualitative shifts, which are mostly driven by the changing economic context.

While established print publishers have (historically) had enough money on hand to pay for sub-editors, researchers, and legal teams to deploy against former employees, blogs are usually much smaller, scrappier affairs. The larger, more successful sites might command similar resources, particularly if they're part of a larger media network, but your typical independent, mid-card site gets by with a small team of writers for whom it is rarely a full-time job; selling your review copies to try and claw back some money for the time you've spent writing is part of the culture (but don't tell the publishers!). As for the new wave of vloggers, what they lack in resources they make up for in enthusiasm - it's what viewers want to see, and it's what keeps you going when your subscriber count (and hence, ad revenue) is low.

I am genuinely thrilled to see enthusiastic gamers talking publicly about games they love. My issue is that (for me, at least) enthusiasm alone is no substitute for critical rigour - I think there's a parallel to be drawn between the rise of casual games and the rise of what could be described as casual journalism. Clearly plenty of people want to see that sort of thing, and that's okay too! But as it becomes the dominant form of games media, I wonder whether audiences are being sold short. We don't even have a solid standard of capital-J Journalism for games to refer back to - Simon Parkin's piece on how FPS games are funding the arms industry comes to mind, but stories of this calibre are rare.

The main issue is money, of course - broke-ass consumers are drawn towards free media over expensive print magazines, and the subsequent lack of funding for media producers makes it harder to produce great work. Patreon feels like a shot in the arm, in this respect - I like being able to ringfence a certain amount of money each month to help bring free-to-access work into the world - but it doesn't feel like a 'solution'. Patreon is a for-profit business selling a service that soothes, but perpetuates, the ailments created by our over-reliance on service providers like Patreon.

You may have noticed a common theme emerging from all this.

- Games are trending towards being free and casual, because it's more profitable
- Game designers are being replaced by analysts, because they're more profitable
- Publishers squeeze indie developers out of 'free' distribution channels, once they prove profitable
- Games journalism is shifting towards enthusiast vlogging, because it's cheaper

In all these different dimensions - design, production, distribution, and media coverage - the general trend sees traditional monetary transactions disappearing from the system in favour of microtransaction-supported services, aggregated ad revenue, sponsorship, non-monetary labour compensation, and so on. Nobody pays for anything, and nobody is paid, except through privately-owned parallel channels that are run for profit. All of these models benefit established bodies that have the resources to take advantage of them, while pricing out small independents who can't (indie devs whose $1 iOS game won't sell in a market full of 'free' alternatives, journalists who don't make enough money to live on, etc).

To be clear: I hate it when people say "money is the root of all evil". People are the root of all evil; money is just a hydraulic fluid that transmits the force of their will. Economists call it a 'liquid asset' - it has low friction, it seeps into cracks, it spills and spreads and gets everywhere. Blaming money itself for the things people use money for is a common but fundamental error.

But, as I say, money gets everywhere. I think it's fair to suggest that one of the reasons traditional games thrived in their weird little petri dish is because the outside world thought they were silly and/or unprofitable - videogames were a sort of cultural Galapagos, cut off from the outside world (well, except for all the films they cribbed ideas from). But the social and technological changes of the last decade have burst the dam. It has been proven that there's a lot of money to be made from games, and increasingly sophisticated means are being used to tap that potential.

Consider the App Store. It's never been an open platform, but it used to be considered quite democratic once you got inside, with its socially sourced user reviews and so on. But it didn't take long for companies to monetise its social features, and these days it's quite easy to buy a few thousand downloads or five-star reviews to artificially bump up your chart ranking... if you've got the money. Here's a photo that emerged recently of a review farm in China, to give you an idea of what it looks like:

This is how App Store ratings work. Welcome to the reality. pic.twitter.com/0MyHmTeqwE

— simonpang (@simonpang) February 2, 2015

It's impossible not to place this pan-industry drive towards free-to-access, sponsored content within the wider economic context of the global financial crash. Midnight Resistance isn't really the place for a critique of neoliberal economic theory, but to cut a long story short there are parallels between these changes and the rise of zero-hours contracts, deregulation, rising inequality, and so on. I think it's very significant (and appropriate) that David Goldfarb recently described AAA developers as "the 1%". The economy, especially online, has centred itself on people providing free labour while corporate platform-holders reap the rewards - as Cory Doctorow wrote, "We're all sharecroppers in Google's fields".

Like it or not, this is the reality we live in. A world in which games are part of mainstream culture is already flowering all around us, and it looks like your dad playing a Flappy Bird clone on the bog. You're a fool if you think the appropriate response is to moan about how 'filthy casuals' don't have an adequate sense of appreciation for Mega Man; the effect of all the extra money being sucked up to the top end of the industry is that the full, terrifying force of globalised capitalism is slowly, blindly groping around like Polyphemus in search of games that people - capital-P People - want to play. There's really a great deal of investment and innovation going on - and not just in the casual space - it's just that sadly none of it is going into a Vanquish sequel.

The question on my mind is "What's coming next?"

If we are living in the early days of the Post-Gamer Era, what will the Post-Post-Gamer Era look like? Will the current generation of developers, who grew up in the '90s and spit out endless thinly-veiled tributes to their childhoods, eventually trigger a traditional game revival? Will casual audiences come to demand more sophistication from their games? Will they eventually stop playing again altogether, turning mainstream games into a weird fad that gripped society in the days before the machine war?

How will things change once the industry makes inroads into BRIC countries? Will we still be able to play games online after the ecologically-triggered Ragnarök that we are quite obviously hurtling towards? Will there always be people out there who believe that videogames are magically unaffected by the contexts in which they are made? How long will it be before increasing wealth inequality leads to virtual item farming becoming a widespread, socially accepted job in the West, and how many games will become even more grind-heavy as a result? Which will launch first: Shenmue 3, or a last-gasp nuclear strike from a collapsing totalitarian regime?

Monster Hunter 4 Ultimate is really good.
