GoldenEye 007, Disney’s Aladdin, The Lion King, Alien Trilogy, DuckTales, Turtles in Time. These were some of the best games of the ‘90s. They’re also all licensed tie-ins for movies or TV shows. While this genre has an atrocious reputation (and justifiably so) among many gamers who grew up in the ‘00s, it’s worth noting that in the ‘80s and ‘90s, games based on movies and TV shows were not only much better, but were often considered the pinnacle of gaming. Back then, if your game had the official support of the studio, the director or just some random celebrity, that meant it was the real deal.
Even in the decade that followed, games such as The Warriors, Lego Star Wars and The Return of the King/The Two Towers were selling well and earning critical acclaim. Yet somewhere along the way we decided to stop making games based on movies, and so we found ourselves last year with a new Mummy movie, a Pirates of the Caribbean sequel and some shit about Emojis, all of which would once have had a videogame tie-in by default, coming and going without the faintest vibration of a controller. But what led to the death of this genre, and why did movie studios decide to stop investing in the games industry?
To answer these questions, we must first go back to the salad days of the hobby, back when computer games were still attempting to distinguish themselves from pen & paper RPGs. Many of the earliest tie-ins were based not on movies, but on novels. Games such as The Hobbit (1982) attempted to recreate Tolkien’s Middle-earth by allowing the player to input simple commands, such as ‘go east’ or ‘attack Gandalf’ (pro tip: don’t attack Gandalf). The Hitchhiker's Guide to the Galaxy (1985) took a more literal approach to adaptation and forwent any actual visuals in favour of a detailed, descriptive, text-based adventure. It ultimately went on to sell 400,000 copies, an extremely impressive number for the time, and along with Zork and Wishbringer it is now widely regarded as one of the most influential titles in the genre.
This isn’t to say that movie tie-ins weren’t around in the early ‘80s. In fact, the now infamous E.T. the Extra-Terrestrial (1982) was one of the earliest examples of a movie tie-in and was almost responsible for the collapse of the whole industry. While it wasn’t an entirely isolated example of a poor licensed title, the ‘80s were generally a pretty decent time for games based on movies, with studios like Ocean Software picking up many licences and doing a generally good job with them. A decent example is the first ever Batman game (1986), which used an isometric viewpoint, rather revolutionary for the time, and tasked the caped crusader with rescuing his sidekick, Robin.
But it wasn’t until the ‘90s, or as I like to call it, the decade of radical awesomeness bro, that movie tie-ins became an established norm. It was in this era that developers would make a huge song and dance about how closely they were working with the studios to achieve a greater degree of ‘authenticity’. And authenticity was important. Even as recently as the Harry Potter movie tie-ins, several of which were heavily criticised for not sticking to the films’ scripts, players have made it quite clear that if you’re going to make a game based on a movie, you’d damn well better have all of their favourite scenes in it.
It was around this time that we saw some film studios setting up their own game development divisions in order to have total creative control over their franchise tie-ins, subsequently allowing them to pocket all of the profits. The most famous studio of this era is arguably LucasArts. George Lucas’ videogame publisher had actually been around since the early ‘80s, but it wasn’t until Indiana Jones and the Last Crusade (1989) that it really hit its stride with movie-to-game conversions. It sold a whopping 250,000 copies and was the company’s most commercially successful title to date. It was helped in this regard by the fact that it was a fantastic graphical adventure game, and it laid the groundwork for some of LucasArts’ most successful point-and-click games of the ‘90s. What followed included a string of Indiana Jones games, including the awesome Fate of Atlantis (1992), and no fewer than 73 separate Star Wars games, from classics such as Star Wars: Battlefront (2004) and Knights of the Old Republic (2003) to the utterly inane Angry Birds Star Wars (2012) and the dry milking of the cow’s teat that is the Lego Star Wars series.
The ‘90s produced as many hits as misses. For every Aladdin and X-Men, we had piles of hot garbage such as the Fantasia (1991) game. But as we crept ever onwards towards the end of the decade, with every movie that came out requiring some sort of computational equivalent, the quality slipped further and further into unacceptable territory. The Crow: City of Angels (1997) was universally panned by critics as one of the worst games ever made; it even attracted the ire of The Angry Video Game Nerd, it’s that bad. Even movies that nobody really cared about, such as Bloodwings: Pumpkinhead’s Revenge (1995) and the utterly ridiculous Expect No Mercy (1995), had videogames based on them. Expect No Mercy was so bad and so silly that it is now remembered by the industry solely as the quintessential example of trying to outdo a successful title, in this case Mortal Kombat, by simply ripping it off. Although I have to give it credit for a finishing move that lets you rip out someone’s spine using a lady’s ponytail. That’s pretty badass.
By the turn of the new millennium, games had achieved a degree of narrative complexity that was beginning to rival (some might argue surpass) the movie industry. Once upon a time, input from a director, artist or celebrity might have been welcome. But by this point, they were simply people who knew nothing about making games, holding power of veto over every aspect of a game’s development. A great example of this is the ill-fated Tomb Raider: The Angel of Darkness (2003). This game effectively killed Core Design, the creators of the franchise, as it was a broken, confused mess on release. This was due, in large part, to Eidos’ insistence that it resemble the Tomb Raider movies of the time. Given that Tomb Raider is essentially a puzzle game series and not, somewhat unfortunately, 90 minutes of Angelina Jolie’s ass in tight pants, the game failed to compete with other action-adventure titles on the PS2, especially given it was up against Grand Theft Auto: Vice City (2002) and The Simpsons: Hit & Run (2003).
Around 2005, we began to see the 7th generation of consoles being released. This era was to be one of flat-screen TVs, HD video resolution and insane game budgets (Call of Duty: Modern Warfare 2 (2009) cost an incredible $250 million to produce). This was also the point at which certain consoles *cough cough* Nintendo Wii *cough cough* began shovelling out cynical, cheaply made, poorly developed franchise tie-ins for anything and everything. By 2010, gamers had endured a whole decade of almost unequivocally abysmal movie tie-ins, and they were beginning to vote with their wallets. The response was a general backing away by movie studios, allowing developers much more creative control over intellectual property. This can’t have been easy for them to do. In fact, it must have seemed almost counter-intuitive at the time, but it led to some of the best games of the last decade. Titles like Batman: Arkham Asylum (2009) and Middle-earth: Shadow of Mordor (2014) are evidence that movie studios have begun to take a much less authoritarian approach to game development.
For us gamers, this is an era of movie fans being given creative control to effectively write fan-fiction and bring it to life in an interactive environment. In recognising that game creators make better games when they’re working with their own ideas, studios have been forced to give the devs the keys to the kingdom, effectively making Batman, Mad Max or Lord of the Rings the property of a videogame studio, at least temporarily. There’s still a lot the games industry could learn from directors, scriptwriters and actors, but at least now the distinction between learning from a movie and slavishly recreating it has been made clear.