Since their birth as a science-fair curiosity at Brookhaven National Laboratory in the late 1950s, video games have moved inexorably towards higher and more central cultural ground, much like film did in the first half of the 20th century.
Games were confined at first to the lowbrow carnival of the arcade, but they soon spread to the middlebrow sphere of the living room, overran that private space, and burst out and upwards into the public spheres of art and academia. With prestigious universities like NYU and USC now offering graduate-level programs in game design, and major museums like MoMA, MAD, and SFMOMA beginning to acquire games and curate game exhibitions, preserving the early history of the medium seems more urgent than ever. But what exactly does it mean to preserve a digital game?
The answer is surprisingly simple: it means, first and foremost, preserving a record of how the game was played and what it meant to its player community. Ensuring continued access to a playable version—whether through maintenance of the original hardware or through emulation—is less important, if it matters at all.
That, at least, was the provocative argument Henry Lowood made at Pressing Restart, which recently brought preservationists, teachers, academics, and curators together at the NYU Poly MAGNET center for a day of “community discussions on video game preservation.” Lowood is no contrarian whippersnapper; as a curator at the Stanford Libraries, he has been professionally involved in game preservation efforts for well over a decade.
In his talk, part of a panel on collection criteria for collecting institutions, Lowood decried what he calls the fallacy of the executable—the notion that game librarians in 2100 can sleep easy, confident they have done their job well, so long as they can brainsync their patrons with fresh working copies of Diablo III, Bejeweled, or any other canonical game. The problem with this attitude, Lowood argued, is that a game is not simply a piece of software but a historically specific site of shared experience.