I’ve begun a new series of “Let’s Study” videos on horror games, just in time for Halloween. This first episode explores the historical roots of the survival horror genre, making it a new manifestation of this lesson plan.
Over the summer, I was working on a peer-reviewed video essay that’s quite thematically dense. As a result, this video feels a little bit shaggy to me: loose, casual, searching for a central raison d’être. I constantly had to remind myself that this is for general audiences, and not every audiovisual argument needs to be an airtight assemblage of well-researched examples.
The unqualified good news? This video is a massive improvement on the previous blog post version of this lesson plan. The future videos in this series will be a mix of original material and “enhanced remakes” of previous lesson plans.
Transcript below the fold, as usual.
The question of whether videogames should attempt to tell stories was all the rage in game studies in the late 90s and early 2000s. You’re less likely to encounter the issue in academia today (unless said academics are writing think pieces at The Atlantic). But it is still very much an ongoing debate in game development: it still isn’t too difficult to find opinionated developers launching screeds against linearity, against the single-player campaign, and against games’ subservience to the logic of cinematic storytelling.
As is so often the case in such conversations, there is a temptation to jump directly to a categorical assessment, leaping over qualitative assessment entirely. The categorical question “should games tell stories?” is a good way to start a rousing bar fight of a debate. Alternatively, the qualitative question “do games, as we know them, have a history of telling stories well?” will most likely lead to the reasoned response, “no.” This, in turn, will possibly lead to further avenues of polite and potentially incisive inquiry, such as “why do you suppose that is?” and “are there any ways that we could chart new types of storytelling that might be more compatible with games’ basic features?”
I’m going to take the polite and careful qualitative route, not really because I prefer it (I enjoy a rousing debate as much as anyone else), but because I actually think it’s necessary to set the groundwork before making any larger categorical claims.
2017 marks the year of animator David OReilly’s return to the medium of videogames, following up on his strange and serene digital-art-toy-screensaver-thing Mountain (2014). His new game, Everything, released on PS4 on March 21st, and releases on Windows, Mac and Linux this Friday.
The game’s title, Everything, is also the game’s premise: It is a game about everything. Specifically, it is a game in which players can be everything, switching at will from trees to koalas to rocks to quarks and back. I haven’t had a chance to sit down with it yet—I suspect I’ll make time for it once it’s out for PC—but I did want to take the advent of its multi-platform release as an opportunity to muse on this premise’s history in gaming.
Everything may be the first game that explicitly promises to allow us to be everything, but games have previously offered the ability for us to step into the role of quite a lot of things, including a surprising range of inanimate objects. “The child plays at being not only a shopkeeper or teacher,” wrote Walter Benjamin, “but also a windmill and a train.”[i] Games have proved to be a continuing outlet for this childhood animist fantasy—why, in just a couple weeks’ time, we’re going to be able to play as a coffee mug!
Join me, won’t you, in a breezy tour of some of the stranger things games have let us be.