I don’t know the exact number of Sega or Nintendo games I’ve played in my life, though the combined number is likely situated somewhere in the single digits. (Most definitely the low single digits once you rule out the games that star Italian plumbers.) That may not come as a surprise to many longtime readers; though I’ve written extensively about TV and film over the last several years, I’ve never once had cause to post a piece about the gaming world.
My lack of familiarity with the medium has some inherent positives (I’ve never gotten into any arguments over Gamergate), but it also obviously comes with some negatives (some of the games I’ve missed out on do look pretty awesome). But there is one aspect of video games I’ve long been familiar with: their film adaptations.
More specifically, their often dreadful film adaptations. Yes, whether you’re an avid gamer or simply a casual movie buff, it’s no secret that movies based on video games tend to be met with vitriol from critics and audiences alike. Often panned for generic storytelling, thin characters, and cheap special effects, the output of this genre has made it the black sheep of the action film family.
Or so it was, until recently. In the last couple of years, Hollywood has released a string of films adapted from video games, and the result has been… not bad. Though none are being hailed as masterpieces, the last five video-game adaptations to attain a theatrical release – chronologically, they are Tomb Raider, Rampage, Detective Pikachu, The Angry Birds Movie 2, and Sonic the Hedgehog – have all received decent-to-good reviews, all while maintaining solid box-office returns. As of this writing, all five films have a score above 50% on Rotten Tomatoes – a feat that no pre-2018 video-game adaptation has ever achieved.
So what happened? How have Hollywood’s video-game flicks attained this previously unseen level of pretty-goodness?
To understand, we need to examine the history of the genre, dating back to its inception in the 1990s.
Though films about video games date back to the medium’s early days (the preeminent example was 1982’s visually groundbreaking Tron; other prominent examples from that decade include WarGames and The Wizard), the first video game to score a major Hollywood deal was Super Mario Bros., granted a silver-screen adaptation in 1993. The film, which starred Bob Hoskins and John Leguizamo as the titular brothers, rode a wave of the game’s popularity and seemed poised to be a shoo-in success.
But production of the film was fraught with difficulty. The film’s directors, husband-and-wife duo Rocky Morton and Annabel Jankel, clashed with the studio over the tone and direction of the film – whether it should hew closely to the video game that inspired it, whether it should appeal more to adults or little kids. The resulting arguments led to drawn-out filming and an overblown budget; it was eventually released to critical and commercial failure.
In retrospect, the film may have been doomed by its source material, which was slim in scope and didn’t exactly translate to a big-budget feature. I’ve played enough Mario to know that it’s an episodic game, with characters – both good and evil – who primarily function as window dressing. One level of running, jumping, and Koopa-flattening leads to another, one Princess-free castle yields the next, on and on until its finish. (Later Mario games have elaborated on the conceit, but the characters have remained stagnant – Mario and Luigi as stereotypical Italians, Peach as a one-note damsel in distress.)
These sorts of characterizations and scenarios do not translate well to feature films, which require complicated characters and storytelling to compensate for the audience’s lack of direct interaction with them. In a video game, there’s less emotional investment in the people onscreen – audiences control the main character, and can always hit the reset button to try to change their outcome. Ergo, there’s less reason for video game designers to craft complex motivations that don’t directly serve the demands of gameplay. But a film comes equipped with a set story and outcome, and needs clear-cut and coherent story arcs to invest its audience in what happens onscreen. (Before anyone leaves a comment about this – yes, I’m aware that a lot of video games have developed more complex stories than those of the past. But my point about the fundamental difference between the two mediums still stands.)
But as video-game movies attempted to hew close to their progenitors, audiences were dealt one shallow, forgettable film after another. Mortal Kombat, Street Fighter, and Lara Croft: Tomb Raider generated buzz thanks to the intellectual properties they represented, but were poorly received due to their one-note characters and uninspired stories. (All three were popular enough to receive sequels, which were all somehow even worse than their predecessors.)
The Tomb Raider and Resident Evil franchises cleaned up at the box office in the early 2000s, but even many of their supporters would admit that they primarily function as turn-off-your-brain entertainment. And beyond action films with attractive female leads, video-game franchises didn’t have many avenues to travel.
Enter, then, a filmmaker named Uwe Boll. The German-born director had dabbled in original dramas during the early phases of his career, but he began gaining widespread notoriety in 2003, following the release of his film House of the Dead. An adaptation of the zombie-infested video game, House of the Dead was a cheaply produced film centering on a group of college students who stumble across the title’s unfortunate house. The film flopped at the box office and was panned by critics, but the director was just getting started.
Over the next several years, Boll released a slew of other films based on modestly popular video games, including Alone in the Dark, BloodRayne, Postal, and Far Cry. All were box-office flops and excoriated by critics, who lambasted their poor writing, inept editing, and aimless direction. Boll’s films were so hated, in fact, that he began to earn something of an anti-fandom on the Internet. The breadth of his negative publicity made him seem to monopolize the video-game movie genre.
By 2010, Boll had largely moved on from said genre, but the perception of films based on video games had by this point grown so negative that most film studios were hesitant to touch them. They had bombed so often in the past that future attempts seemed like throwing money in the trash.
Still, as action blockbusters began to overtake Hollywood – not merely during summertime, but (thanks to franchises like the new Star Wars flicks and the Marvel Cinematic Universe) year-round – it seemed only natural that video games would get more turns at the big screen. The films still stumbled (neither Need for Speed nor Assassin’s Creed received a warm reception), but they were achieving a level of financial backing they’d never attained before.
Then in 2016, Universal released Warcraft, an expensive adaptation of the popular Blizzard franchise. Like its predecessors, the film was poorly reviewed and did not perform well domestically. But worldwide, the film smashed the box office, ultimately earning over $400 million. No video-game film before (or, as of this writing, since) has hit Warcraft’s benchmark, and its success proved that films of its ilk still had a market.
And here we can begin to answer the original question: How did video-game movies become good?
Three reasons come to mind. First, the increased budgets for action films across the board have inspired studios to develop video-game adaptations that don’t look like something out of grade-school Printshop. The Pokemon in Detective Pikachu are rendered convincingly, as is the electric blue star of Sonic the Hedgehog (er… not counting the original trailer version).
Second, these films now put a bit more effort into character. Angelina Jolie’s Lara Croft was a generic action heroine with no real personality behind her weapons and wardrobe. (The sequel, Cradle of Life, tried injecting a romantic arc into the story, but it felt as forced as everything else.) The 2018 Tomb Raider, however, gives Lara (played here by Alicia Vikander) an origin and backstory, plus a characterization that makes her feel like more than a female robot.
Third, the films don’t necessarily adhere to the rules of the source material, allowing them freedom to tell more developed and unconventional stories. The first Angry Birds movie felt restricted by its need to play things safe and build up to its expected climax, contriving a story that would allow its avian leads to slingshot their way into piggy towers. The Angry Birds Movie 2, however, threw this sort of logic and conformity out the window, opting instead for pure madness and animated lunacy. The result was a film that was far funnier and more self-aware than the first – and far better as a result.
These latest video-game films don’t have any real connection to one another (apart from Pikachu and Sonic, which take similar routes in telling stories of human leads teaming up with CG comic sidekicks). But their across-the-board improvement of the genre is a promising sign that filmmakers may have at last begun to take them seriously. After spending so many years at the bottom of Hollywood’s barrel, it’s encouraging to see this genre finally begin to level up.