Forget Zombies, Writers Need to Worry About the Robot Apocalypse

Photo by Middlewick from Morgue File

What if I told you this article was written by a computer? Would you read it?

Chances are, if you read Forbes, the Tribune, or other Internet media powers (who refuse to be identified), you have already read articles written solely by computers.

Technology sometimes crosses a line, a very bad line. As humans, we have something computers don't: morals. We need to draw that line.

I understand looking for efficiencies in a workplace or even an industry. But writing is a distinctly human endeavor.

Humans observe, intuit, react with emotion, and add sarcasm and humor. These human traits make prose of all kinds worth reading and identifying with.

Can a computer duplicate this? Maybe so. But should it?

Steven Levy, writing for WIRED magazine, reports in “The Rise of the Robot Reporter” (print title) that:

  • Within 15 years, more than 90 percent of news articles will be written by computers.
  • In 20 years, there will be no area in which a computer doesn’t write stories.
  • In 5 years a computer-generated story will win the Pulitzer Prize.

These predictions come from Kristian Hammond, Chief Technology Officer and cofounder of Narrative Science, the apparent leader of this new writer-killing industry.

Hammond would disagree with me on that “writer-killing” crack. In fact, he tried to defend his company by saying, “Nobody has lost a single job because of us.”

Let’s assume for a second his assertion is true. For how long will it stay true?

Read Levy’s article. It’s quite fascinating, really. Right now, top publications use Narrative Science technology to produce articles on finance, sports, and more. More sports are being covered than ever before because of this technology—Little League games and women’s softball have benefited quite a bit.

When you read the article, you can see real uses for this sort of technology, especially in helping writers, scientists, marketers, and others research and review data for their work.

But what about authenticity, a word so often hailed as the hallmark of expert communication in the digital age?

Forbes.com does include a disclaimer about Narrative Science with all its computer-generated articles.

Narrative Science disclaimer from Forbes article

To me, including this disclaimer should be the rule—the law even. Otherwise, isn’t it fraud?

James Frey, author of A Million Little Pieces, comes to mind. He wrote and sold millions of copies of a very compelling story. Unfortunately for him, he called it a memoir, which means the story should’ve been true—at least as he recollected it.

Readers recoiled when they learned of Frey’s lie because they identified with the human emotion and struggle described in the book. They felt compassion and maybe even admiration toward Frey.

What happens when computers start writing more emotional stories? (Believe me, there will be an algorithm for that.) Isn’t presenting an emotional first-person account of an event or story fraud if it’s written by a computer?

Many questions need to be answered before we move forward with this technology. It’s about time we set a moral precedent and decide what we want for our future.

Narrative Science isn't the only company out there doing this, but remember how Hammond tried to make us believe he's not after people's jobs. Now listen to CEO Stuart Frankel's reaction when he first looked at the technology himself:

Could this system create any kind of story, using any kind of data? Could it create stories good enough that people would pay to read them? The answers were positive enough to convince him that “there was a really big, exciting potential business here.”

Does that sound like a guy who’s concerned about writers losing their jobs—their livelihood?

Are you concerned? Should you be?