Yes, movies like this week's Warcraft have been just as poorly received as 1993's Super Mario Bros.—but as video games have evolved, the possibility of a groundbreaking film adaptation has only grown more remote.
Fun fact: Of the 32 major feature films based on video games released to date, not a single one has scored more than 50 percent positive reviews on aggregator Rotten Tomatoes—which won't even deem a movie "fresh" unless it hits a threshold of 60. Not Tomb Raider, not Prince of Persia: The Sands of Time, not either of the Hitmans. Not one of the five Resident Evil films. (Though there's always a chance next year's Resident Evil: The Final Chapter will buck the trend!)
Not Final Fantasy: The Spirits Within, a movie that bears basically no resemblance to the series that spawned it, which somehow sits atop the list with a whopping 44 percent positive reviews. And not a single one of this year's three distinct attempts to break the seemingly endless chain of shitty video game adaptations: Ratchet and Clank, The Angry Birds Movie, and this week's Warcraft, a $160 million riff on the long-running fantasy franchise, which lands in theaters with all the excitement of a loud fart on an airplane.
This is not an original observation. For as long as the Hollywood video game adaptation has existed—since 1993's legendarily unwatchable Super Mario Bros.—film critics and gamers alike have been united in recognizing the sheer ineptitude of these movies. But a closer examination of both Hollywood and the video game industry reveals a slightly more complicated story. Yes, video-game movies are terrible—but over the years, I've become convinced that the reason video game adaptations are terrible has totally shifted.
The slipped cog in Hollywood's game-to-movie machine is the fundamental nature of adaptation itself. Every medium carries its own strengths and weaknesses on its way to the big screen. If you adapt a novel, for example, you lose the intimacy of the experience, the capacity for detail, and the imagination of the reader—but you gain the immediacy and emotional impact of sound, visual design, and performance. Video games carry their own strengths and weaknesses, and in the early days of the medium, the tradeoff was clear: If you turn a video game into a movie, you lose interactivity, and gain exponentially more realistic visuals and a beefed-up story.
But as the technological end of video game production has evolved—faster than any other art form in history—storytellers have grown increasingly adept at exploiting the unique qualities of an interactive medium in a way that's impossible to translate to the big screen. If I were a part of the creative team behind a big-budget Hollywood adaptation of a video game, I'd start with one question: What can I possibly add to this? And the answer, in most cases, turns out to be nothing.
Take this summer's biggest PlayStation 4 hit: Uncharted 4: A Thief's End, which hails from developer Naughty Dog, a company that openly aims to make "playable summer blockbusters." At Vulture, Lane Brown began an otherwise excellent feature on Uncharted 4 with a version of the same tired qualifier that opens any article about a video game's narrative: "You could be forgiven for wondering why anyone would care about the leaked plot details of a video game. Nobody has ever enjoyed Super Mario Bros. less for knowing in advance that Mario rescues the princess."
Yes, there was a time when games like Super Mario Bros., Double Dragon, and Mortal Kombat—which used their paper-thin narratives as justification for jumping on turtles, beating up street toughs, and ripping out ninjas' spines, respectively—were the norm, and their '90s film adaptations echoed their narrative simplicity. But this is 2016, which means it's been nearly 20 years since Hollywood-inspired games like the original PlayStation's Metal Gear Solid ushered in an era of self-consciously cinematic storytelling. Isn't it safe to assume that the average reader recognizes that video games have emerged as an art form capable of complex, multifaceted storytelling in its own right?
Today, narrative game development is bent on finding new emotional resonances that cinema is incapable of replicating. Naughty Dog's previous game, the post-apocalyptic thriller The Last of Us, originally ended with a climactic non-playable scene in which the game's protagonist commits a near-indefensible act of violence. Eventually, the creative team concluded that the moment would have even more emotional impact if the player was still in control of the protagonist; in the final version of the game, you're the one who pulls the trigger. The Last of Us has been widely lauded as an ideal choice for a big-screen blockbuster, and the game's director even wrote a draft for a proposed Hollywood adaptation. But in an era when technology has enabled some fairly staggering realism, and motion-capture allows actors to contribute fully realized performances to games, what does a film have to offer that the game hasn't already accomplished?
In practice, the ongoing evolution of the medium means we're left with roughly two categories: video games that prioritize gameplay over narrative, which makes any Hollywood adaptation an uphill battle; and video games that prioritize narrative, but rely on the medium's interactivity to push the boundaries of storytelling in a way that a film could never replicate. Why would anyone look at either category as fertile ground for a movie?
Hope springs eternal, and with Hollywood adaptations of popular game franchises like Assassin's Creed, Sly Cooper, and—yes—Naughty Dog's Uncharted on the horizon, there's always the possibility that a brilliant filmmaker will manage to make a video-game adaptation that brings something to the table that the original game never could.
But as video games continue to grow more sophisticated, the challenge only becomes greater. Even Warcraft, made with Hollywood's most cutting-edge special effects, is based on a game series that ultimately belongs to a different era. The first Warcraft, a real-time strategy game, hit shelves in 1994; World of Warcraft, the series' insanely popular (and ongoing) MMORPG, arrived a decade later. In both incarnations of the series, the story was ultimately a backdrop—a derivative fantasy playground designed primarily to let players craft their own adventures. If you ask the average World of Warcraft player to tell you their favorite memories from the game, they won't regale you with the backstories of characters like Ragnaros or the black dragon Nefarian; they'll tell you about the online clans they formed with their friends, or the challenging raid they juuuuuuust managed to pull off.
Those adventures are inherently personal. Why drive to a movie theater and plunk down money to watch a bunch of actors play CGI orcs when you can just stay home and be one?