OpenAI Text Generator GPT-2 Creates Video Game Walkthrough for 'Most Tedious Game in History'

When OpenAI announced the automatic text generator GPT-2 in February of 2019, its language model had a simple objective: predict the next word. Since its release—and despite high computational barriers—programmers, tinkerers and artificial intelligence researchers have explored creative ways to use the advanced language model, developing applications for GPT-2 far beyond simple text generation. In January, AI researchers demonstrated how GPT-2 can empower video game design, though you're unlikely to want to play "the most tedious game in history."

GPT-2 was trained on text from 8 million web pages. Users give it a prompt—typically a sentence or two—and the model creates text that reads like natural English-language writing, even if the ideas contained in the words don't always relate to anything in the real world.

"You can prompt the model with whatever text you want, and it will try to guess how to complete it," Presser told Newsweek.

So when it's "trained" with texts involving video games, including walkthroughs, strategy guides and game tips, the result is walkthroughs of video games that never existed: guides to adventures no one has ever programmed.

A programmer with a lifelong interest in artificial intelligence, @me_irl—or as he's known on Twitter, "The Government Man" (he requested anonymity to keep his social media presence and professional life separate)—assembled a data set of video game information, writing a program to extract text from online game guides, then handpicking the walkthroughs to be fed to GPT-2 over four days of off-and-on work.
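The extraction step might look something like the sketch below, which pulls plain text from an HTML guide page with the requests and BeautifulSoup libraries. The URL is hypothetical, and @me_irl's actual pipeline isn't public.

```python
# A rough illustration of the extraction step, using the requests and
# BeautifulSoup libraries. The URL is a placeholder; the real sources
# were handpicked and are not described in detail.
import requests
from bs4 import BeautifulSoup

def extract_guide_text(url: str) -> str:
    """Fetch a walkthrough page and strip it down to plain text."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav"]):  # drop non-content markup
        tag.decompose()
    return soup.get_text(separator="\n")

text = extract_guide_text("https://example.com/some-game/walkthrough")
print(text[:500])
```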

"There's a certain 'golden age' of text-based video game walkthroughs, and I wanted to capture a good cross-section of different styles of games, without any one genre having too much influence," @me_irl told Newsweek in response to email questions.

"It's also important to select content that isn't too repetitive, because these text generation neural networks have a tendency to overfit, or get 'obsessed' with simple, repetitive patterns in the training text," he added, describing an early attempt that over-sampled Pokémon, until GPT-2 would only spit out endless tables of Pokémon stats.

He had better luck with DOOM.

Presser, who used @me_irl's publicly released data set to train a more powerful version of GPT-2, described one of its creations as "a walkthrough for the most tedious game in history": a dense set of instructions for something that sounds a lot like a first-person shooter.

"When the room opens, go forward. You should find a rocket launcher," the walkthrough begins. "Push the switch and a door opens. Take cover in the corner and shoot the guard. The door will close when he dies. Now jump over the gap and kill the guards. In the next area is a switch. Push it and the door will open. In the next area is a scientist. Kill him. Go back to the previous room and push the switch. Open the next door. In the next room is a scientist. Kill him."

Fictional players of this fictional game end up killing a lot of guards and scientists, but any gamer who has turned to GameFAQs in a pinch will recognize the direct approach.

[Image caption: Anybody else getting 'GoldenEye 007' flashbacks? Credit: Rare Limited]

GPT-2 even provided tips for finding enchanted items, magic rings and "Boots of Blinding Speed" (which GPT-2 likely cribbed from The Elder Scrolls III: Morrowind), in addition to inventing in-game perks like "Wildcat," which provides +10 damage to your animal companion's attacks. The language model can also be applied more narrowly, auto-generating playthroughs for popular games such as Final Fantasy VII by synthesizing existing walkthroughs.

Several versions of GPT-2 exist, each with a different number of parameters. The largest and most computationally demanding, known as 1.5B for its 1.5 billion parameters, has more built-in context and vocabulary, including knowledge of characters from a wide swath of video games, among them Overwatch, Final Fantasy VII and the popular multiplayer online battle arena game Defense of the Ancients, usually referred to simply as Dota.

[Image caption: The heroes of 'Dota 2.' Credit: Valve]

Presser tried other video game experiments, like auto-generating Dota 2 and Overwatch hero descriptions by prompting GPT-2 with existing characters' profiles. By feeding it profiles for D.Va, Doomfist, Hanzo and Junkrat, Presser was able to create surprisingly accurate predictions of the attributes of Sombra, Roadhog, Pharah and McCree.
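A plausible shape for that experiment is few-shot prompting: show the model several known hero profiles in a fixed format, then leave the target hero's entry open for the model to fill in. The sketch below is a guess at the format, with placeholder descriptions rather than actual game text.

```python
# A guess at the few-shot prompt format; the descriptions are placeholders,
# not actual game text, and the real prompts used are not public.
known_heroes = {
    "D.Va": "A mech pilot who projects a defensive matrix...",
    "Hanzo": "An archer whose arrows reveal enemies...",
    "Junkrat": "A demolitionist who lobs grenades and traps...",
}

def hero_prompt(target: str) -> str:
    examples = "\n\n".join(
        f"Hero: {name}\nDescription: {desc}"
        for name, desc in known_heroes.items()
    )
    # The model continues after "Description:", inventing the target's profile.
    return f"{examples}\n\nHero: {target}\nDescription:"

print(hero_prompt("Pharah"))
```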

In November, OpenAI—an artificial intelligence and machine learning research organization co-founded by Elon Musk, Sam Altman, Ilya Sutskever and Greg Brockman—made the 1.5B model publicly available. But the model's massive size made experimenting with it on consumer GPUs a near impossibility, constraining GPT-2 experimentation to university researchers and others with access to the necessary computation.

Designed in response to the particular needs of machine learning and artificial intelligence applications, Google TPU (Tensor Processing Unit) microchips offered one possible solution. TPUs originally powered Google Translate, Search and other services in Google's data centers—48 of them were used in the AlphaGo computer that defeated Go world champion Lee Sedol in 2016—before the tech giant began offering TPU cloud computing to companies in 2018.

But renting a "TPU pod" for cloud computing can cost millions, making them prohibitively expensive for all but large companies—organizations unlikely to try out playful experiments. So Presser developed a technique he dubbed "swarm training" to employ 80 individual TPUs on a single data set.

"In swarm training, we can run dozens or hundreds of TPUs in a loose network which swaps updates on the fly," Presser told Newsweek. "It's chaotic, but it winds up working pretty well: it's much faster than using just a few TPUs, but much cheaper than renting entire TPU pods. We're hopeful that swarm training will be very useful to other researchers."

GPT-2 has also proved adept at gaming functions beyond just generating games-related text. Presser previously collaborated with technology writer and researcher Gwern Branwen to train GPT-2 to play chess, feeding it legal chess moves in standard notation and asking it to output its own. After hours of training on which responses constitute valid moves in an ongoing chess game and which are nonsensical, the text generation engine was eventually able to complete a full game.
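One piece of such an experiment—checking whether a move string the model emits is actually legal in the current position—can be sketched with the python-chess library. The move strings below are placeholders; the actual training setup isn't detailed here.

```python
# A sketch of move validation with the python-chess library; the move
# strings are placeholders standing in for GPT-2's output.
import chess

board = chess.Board()
board.push_san("e4")  # opening move fed to the model as context

generated = "e5"  # pretend this came back from GPT-2
try:
    board.push_san(generated)  # raises ValueError if illegal or malformed
    print("legal:", generated)
except ValueError:
    print("rejected:", generated)
```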

"Now that it is so easy to get access to information, code, pretrained models and computing resources for deep learning, I (like many others) are able to dabble with it in my spare time," @me_irl told Newsweek.

While it may be years before game designers employ text-generating language models in their work, Presser said he already sees potential practical applications.

"If you prompt the model with descriptions of some spells from your tabletop campaign, the model can generate new spells," Presser said. "It's quite versatile."

For example, Dungeons & Dragons players could input spells like Fireball, including a description of its HP damage, and get back new attack spells from GPT-2 to use in tabletop roleplaying sessions.
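A hypothetical prompt for that workflow might list a couple of real spell entries in a fixed format and leave the next entry open for the model:

```python
# A hypothetical spell-generation prompt. The stats are paraphrased from
# the D&D 5e Fireball and Lightning Bolt entries for illustration.
spell_prompt = """\
Spell: Fireball
Level: 3
Damage: 8d6 fire damage in a 20-foot radius

Spell: Lightning Bolt
Level: 3
Damage: 8d6 lightning damage in a 100-foot line

Spell:"""
# Feed spell_prompt to GPT-2 (see the generation sketch earlier in this
# article) and it will invent a new spell entry in the same format.
print(spell_prompt)
```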

"I think there's an opportunity to build new indie games using GPT-2," Presser said. "Imagine making a mod for Skyrim that uses GPT-2 to generate new quests. You'd have infinite replayability. It'd be like AI Dungeon 2 in 3D."

This article has been updated with additional information on the video game data set, provided by @me_irl.
