Using an AI Neural Network to Generate New “MacGyver” Plotlines, and the Future of Creative Works | Insights

Developing new creative ideas can be a challenge. Even in legal practice, where our writing is frankly less creative than in other fields, we have all stared at that blinking cursor while we try to imagine a killer opening paragraph to catch the reader’s attention. What if we had a brainstorming colleague who suggested 10 ideas that we could react to? What if that colleague was artificial intelligence (AI)?

This post explores the rapidly evolving space of generating predictive text, but starts with a practical example: using AI to generate new plotlines for one of my favorite late ’80s/early ’90s TV shows, “MacGyver.”

New “MacGyver” Plotlines

“MacGyver” likely needs no introduction (but here is the introduction to Season 6). The show, where the protagonist gets in and out of jams with science and engineering, is deeply rooted in our social consciousness. It was the key to Bart Simpson uncovering Sideshow Bob’s plot to kill Aunt Selma, it was parodied on SNL – which was spun off into a parody movie – and it was even rebooted in 2016. We all know the show, even if we have never watched an episode.

I loved “MacGyver” when I was a kid. The protagonist gets in and out of jams with science and engineering? I’m in. I was revisiting some of the old plotlines a few weeks ago, and they were wilder than I remembered. Below is a sampling of real “MacGyver” plotlines:

  • “While out mapping the Alaskan wilderness, MacGyver finds a ship with its crew missing. A Russian girl onboard tells him they were frightened off by a creature the locals call Bigfoot.” Season 3, episode 4
  • “MacGyver and new colleague Nikki Carpenter go undercover as a married couple to look for a downed experimental plane. But the Soviets are looking for it, too – using a psychic!” Season 3, episode 6
  • “Frantic with worry about his gravely ill Grandpa Harry, an attack from a man who’s stolen an ancient Egyptian relic leaves MacGyver in a coma, where he experiences a vision where departed family members teach him the value of life.” Season 5, episode 21
  • “MacGyver and Thornton travel to Romania to help with analysis of Ceausescu’s secret files. However, MacGyver is captured by a soldier loyal to the deceased dictator and must choose whether to save the man’s life when he is bitten by a snake.” Season 6, episode 2
  • “While out of town to help with a new geothermal power plant, MacGyver is witness to a UFO sighting, complete with strange crop circles in a farmer’s field. He decides to investigate, and he uncovers a scam with the help of his new friends.” Season 6, episode 10
  • “After getting caught in a wolf trap, MacGyver finds himself being cared for by two determined ladies running an isolated bed and breakfast which a mob accountant is using to hide out with his stolen million dollars.” Season 6, episode 18
  • “Jack Dalton, with MacGyver’s help, searches for the Fountain of Youth in a land that appears to be the mythical Shangri-La, but also harbors a base to facilitate H-bomb construction.” Season 7, episode 13

It is hard to believe that someone could condense these ideas into 43 minutes of ’80s high-octane network television action, but “MacGyver” taught us that nearly anything within the bounds of physics is possible.

It occurred to me that these plotlines are so rich with material that they would make excellent training data for AI. There were also seven seasons of the original show, so I was able to find 135 plotlines for my training set. The AI application produced weird and wonderful new story concepts:

  • “Pete and Jack Dalton plan to help a Native American elder locate a sacred wolf mask but things add up to big trouble as MacGyver tries to kill a dictator”
  • “MacGyver and Pete Dalton have to help a nun and a murderer on trial to France with a bomb agent and bookmakers who want him to throw the game to save the rap of a business drug assassin”
  • “MacGyver finds a bomb expert who wants Nikki Carpenter dead for investigating smugglers who are illegally importing goods manufactured in dissident labor camps”
  • “MacGyver tries to kill the dictator, but MacGyver’s obsession soon puts his and Pete’s jobs at risk for drug smuggling at a zinc mine”
  • “MacGyver is living in the Old West, where he is forced to help a Native American elder locate a sacred wolf he claims was stolen from his father and a UFO”
  • “MacGyver is attempting to detonate a time bomb that leads to two construction teachers”

(Note: These plotlines are slightly cleaned up from the raw output to make them more legible. They are all, of course, ridiculous.)

At the least, I would like to see the one about MacGyver living in the Old West while searching for a sacred wolf mask (and a UFO is somehow involved!). It is fascinating that this absurd story concept sprang from software rather than a creative human mind.

How Were the “MacGyver” Plotlines Generated?

The application used to generate the plotlines can be found here, and it builds from work by Shivam Bansal that generates New York Times headlines based on sample historical headlines. Creating new TV plotlines from old ones is a similar exercise, so I did not have to make substantial edits to Bansal’s code. You can generate plotlines yourself by clicking “Run All” at the top of the page and checking the output at the end. Feel free to adjust its variables to modify the output and create some of your own adventures. The easiest edit would be to substitute new seed words into the text generator at the bottom of the application.

The application primarily relies on the TensorFlow machine learning library and the Keras application programming interface (API) to build a “neural network.” Neural networks are, to put it mildly, ambitious software: They try to recreate the human mind to process information. They do this by using linked data structures called (of course) “neurons” that weigh input to produce output. Because they attempt to model real intelligence, it should be no surprise that they are a key element of many AI solutions.

As a simple example close to the application above, we could code a neural network to predict which word completes a phrase. Suppose our phrase is “I pledge allegiance to the ____.” Our network would take “I pledge allegiance to the” as input, then guess what the next word would be as output. To make that guess, the network needs weights, which can be probabilities that a given output is correct. To obtain those probabilities, we might have surveyed lawyers to determine how they would complete the sentence:

  • 90 percent of the lawyers remember their grade school mornings and say “I pledge allegiance to the flag”
  • 10 percent of the lawyers remember their bar admission ceremonies and say “I pledge allegiance to the Constitution”

With this information in hand, our simple network takes “I pledge allegiance to the” as input and completes it with “flag” 90 percent of the time and “Constitution” for the remaining 10 percent.
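The weighted guess described above can be sketched in a few lines of Python. This is only an illustration of the survey example, not the application’s actual code; `random.choices` handles the weighted draw:

```python
import random

# Survey-derived weights: 90 percent of respondents completed the
# phrase with "flag", 10 percent with "Constitution".
completions = ["flag", "Constitution"]
weights = [0.9, 0.1]

def complete(prefix):
    # A one-step "network": a weighted choice of the next word.
    return prefix + " " + random.choices(completions, weights=weights)[0]

print(complete("I pledge allegiance to the"))
```

Run it enough times and roughly nine of every ten completions will be “flag,” mirroring the survey percentages.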

Like a lot of things in software, small, simple concepts like the one above scale to large, complex systems that do extraordinary things. In the “MacGyver” application, a neural network takes the existing plotlines’ text and processes it to build many interlinked neurons that “learn” with what frequency one word in the plotline tends to follow another. It goes further by recalling not just the previous word, but a few words before it as well. This is called a “recurrent neural network,” and the “MacGyver” application uses a specific type of this network called “Long Short-Term Memory.”
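The LSTM itself lives in Bansal’s Keras code, but the “a few words before it” idea comes down to how the training data is prepared. A plain-Python sketch (the function name and window size are my own, for illustration) shows how one plotline becomes input/output training pairs, where the last few words are the input and the word that followed them is the target:

```python
def make_training_pairs(plotline, window=3):
    # Turn one plotline into (previous-words, next-word) pairs.
    # A recurrent network like the application's LSTM learns from
    # exactly this kind of pair: the last `window` words are the
    # input, and the word that followed them is the target output.
    words = plotline.lower().split()
    pairs = []
    for i in range(window, len(words)):
        pairs.append((words[i - window:i], words[i]))
    return pairs

sample = "MacGyver and Jack Dalton plan to help a Native American elder"
for context, target in make_training_pairs(sample)[:3]:
    print(context, "->", target)
```

The real application feeds thousands of pairs like these to the LSTM, which learns weights instead of storing the pairs verbatim, but the shape of the learning problem is the same.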

There is no semantic reasoning at play in the application – it has no idea what MacGyver will be up to in the next episode or whether it makes any sense. The application knows only that words like “and Pete” tend to follow the word “MacGyver” some percentage of the time, then uses a seed phrase to get started. This explains why the plotlines are a mix of understandable and absurd. It also explains why some of the AI-generated plotlines include plot points from original episodes. Compare …
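Stripped of the neural network, the word-frequency chaining described above is just a lookup table. A toy sketch (using two of the real plotlines in place of the full 135-plotline training set) shows how a seed word can chain into a new “plotline” purely from which words followed which:

```python
import random
from collections import defaultdict

# A toy "training set" standing in for the 135 real plotlines.
plotlines = [
    "MacGyver and Jack Dalton plan to help a Native American elder",
    "MacGyver and Pete travel to Romania to help with analysis",
]

# Count which words follow each word across the training set.
follows = defaultdict(list)
for line in plotlines:
    words = line.lower().split()
    for prev, nxt in zip(words, words[1:]):
        follows[prev].append(nxt)

def generate(seed, length=8):
    # Start from a seed word and repeatedly pick a plausible next
    # word, just as the application chains words from a seed phrase.
    out = [seed]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(generate("macgyver"))
```

Because the table only knows local word-to-word frequencies, the output reads locally sensible but globally aimless, which is exactly the mix of understandable and absurd seen in the generated plotlines (an LSTM does better by conditioning on several previous words, but the failure mode is the same).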

“Pete and Jack Dalton plan to help a Native American elder locate a sacred wolf mask but things add up to big trouble as MacGyver tries to kill a dictator” (AI-generated plotline)

with …

“MacGyver and Jack Dalton plan to help a Native American elder locate a sacred wolf mask” (real plotline from Season 3, episode 17)

The application will also generate as many words as you ask it to, so some editorial license was necessary to decide where to stop. Below is the output from a 180-word request so you can see how the application builds chains of related phrases without end. After the first few words, the application mostly loses the plot (so to speak):

MacGyver travels west asked to get a reporter out of a Central American country but she won’t leave without getting evidence linking a general to an illegal arms dealer first first states in the discovery of his machine contract has him to the killer to the Russian army has his past of a narcotics agent movement against seeks the killer where the Bulgarian assassin who decides to investigate and he uncovers a scam with time time time honor travel the priceless drug over who is animal parker have the American visits time a investigate human series to obtain in his grandmother assassin and Colton find the father of the high mob who is timber cutting from the homeless and is unwittingly with a stolen man come and Jack may by a family where to him to his family where he and and the only one who can extinguish town the holy drug smugglers who sabotaged it there town obtains priceless death with a vengeful enemy who still seeks travel the killer visits a murder fear is family is teach madness

The application could be improved by branching out beyond the original plotlines with a larger training set (e.g., GPT-3, discussed below). That could produce some new ideas rather than remixing old ideas. I would be interested to see if the…