On Writing Stories with Machines


As a ten-year-old, I could not get enough of Harry Potter. I read each one a thousand times with the attentiveness of a scholar poring over an ancient scroll. It was still the early days of the internet, so I browsed the Harry Potter wiki and the website The Leaky Cauldron (where my very own mother was a moderator) and, through some chain of links I have since forgotten, I discovered fanfiction.net.

There my thirst was slaked. There were endless Harry Potter stories to read, many of them of novel length and novel quality. Or at least, the supply seemed endless. In reality, I read every single novel-length one worth reading within a matter of months.

Then, something started happening. I started coming up with Harry Potter stories in my own head. Eventually these stories became so long I had to write them down if I had any hope of remembering them, scrawling them hurriedly on notebook paper during class so my teachers would think I was taking notes.

It was at this point that I, now eleven or twelve, started writing fan fiction. This remains one of my hobbies even today.

For nearly ten years, I didn't think of myself as a writer. I just had stories in my head that needed to get out, and if I didn't write them down they'd fester, half-formed and demanding. It was only after I'd written over a million words (and drafted my first nonfiction book) that I tentatively admitted I was even a writer at all.

Twenty years later, I still have stories in my head that I want to read, but now they're competing for space with my adult-onset writing pretension and a demanding career as an AI engineer.¹ So I did what any artist-turned-AI-engineer would do: I started using AI for my writing. I got access to GPT-3's research preview six months before ChatGPT launched, and I've been experimenting ever since.

The first experiments were terrible. Just the worst. It would have taken me half as long to write by hand and the work would have been twice as good. But I was determined to learn this technology because I'm an engineer and because I knew, like it or not, that it was going to take over the economy in ten years.²

I started the same way every other writer did, which is with a super long ChatGPT thread. But I graduated from that to increasingly technical setups, each one more engineered than the last. By the end, when I sat down to write, I was doing just as much engineering as I was writing, with a suite of monthly subscriptions to match.

And eventually, after nine months of dedicated engineering and writing, I was sitting at my laptop reading through my AI's latest generated scene draft when I gasped out loud, hands over my mouth and everything. What it had generated was magical.

It understood the scene I was going for. It got the rhythm right. The characters sounded like themselves. For a moment it felt like the story that had been trapped in my head was finally on the page, both exactly the way I'd imagined it and at the same time a thrilling surprise.

I drafted a new chapter for my readers, who patiently wait the better part of a year for my longhand chapters, and excitedly clicked publish.


My readers, however, were not so thrilled.

The first two chapters had a small disclaimer about AI use buried near the bottom. Nobody noticed, or at least nobody said anything. Readers enjoyed the work as usual.

My third chapter mentioned AI at the top. A third of my regular commenters unsubscribed immediately on principle.

Some of the objectors were concerned about ethical or environmental considerations, which I understand, so those stung a lot less. The objections that stung most came from people who complained that AI has no soul.

They stung because I certainly felt myself putting my soul into the work. It's fan fiction, for god's sake! The soul of it is the only reason to do it! I paid for AI tools out of my own pocket for the privilege, no less.

But my readers were unconvinced. AI is soulless, they said, and so now my writing was too. Goodbye.

A few weeks later, my other readers left comments saying they didn't care and that my story was still just as good as it ever was.³ My angst, for naught, and a big win for my fiction-writing AI.


While all of this was happening, I was thinking about ways to make my method widely available. I could explain it in a video or a how-to article, but that would only help fellow engineers. In my experience, writers tend to be less technically capable than the average population, so if I wanted to share this with anyone who wasn't already comfortable with a command line, I'd have to make it an app.

I started building one. I called it Proselon.

Nine months into development, I began to feel uneasy about the whole thing. Specifically, enabling people to generate slop.⁴

People who generate slop are not artists. They don't care about what they're making. They run AI book mills, churning out hundreds of titles to game Amazon's algorithm, flooding Kindle Unlimited with auto-generated romance novels and self-help books and children's stories that no human being ever cared about for a single second. These people don't want a tool that helps them write well. They want to spam content marketplaces to make money. And people generating slop would likely be a significant portion of my user base.

And the artists? The people I actually built the tool for, the ones who care about their craft and just want the latest technology to help them tell their stories? They would likely be the smallest slice of my user base. The people I envisioned helping would be the least likely people to use the platform.


I stand by my belief that the raw technology can and will, in skilled hands, be used to create great art. But for every artist poring over their AI output with the skill of an engineer and the patience of a craftsperson, ten thousand people are using it to print books in get-rich-quick schemes, and they are destroying the marketplace.

When Amazon's Kindle store gets flooded with AI-generated books, real authors get buried. When Spotify playlists fill up with AI-generated tracks, real musicians lose streams. When content farms can produce a thousand SEO articles in the time it used to take to write one, real journalists and essayists lose readers. The slop doesn't just coexist with good work, it buries it.

This dynamic is not new. Indifference to the quality of the end product has been pushing content quality down since long before ChatGPT. Algorithms and content mills have exerted downward pressure on writing for a long time. It started with radio and TV, for god's sake. Neil Postman wrote about it forty years ago in Amusing Ourselves to Death. The medium shapes the message, and when the medium rewards speed and volume over depth, depth loses. AI didn't invent this problem, but it did pour gasoline on it.

The difference is scale. A content mill in 2018 could employ a bullpen of underpaid writers to churn out mediocre blog posts. That was bad, but it was limited by the number of humans willing to do that work for that pay. AI removes that constraint entirely. Now one person with a laptop can produce what used to require a team. The floor's fallen out from under us.

The tools are going to get better. They're going to get a lot better. Even if I never work an hour on Proselon again, people will build progressively better writing generators. Sooner or later, those generators will be good enough for those with content needs to routinely produce work that outperforms most human artists. Not all of them, not the best of them, but most of them. And when that happens, the marketplace won't just be flooded with bad AI content. It will be flooded with good AI content, produced at a volume and speed that no human can match. There won't be any room left for humans at all.


I don't think the solution is a blanket ban on AI in content creation. Like any powerful technology, AI deployed well can be an enormous force for good, even in the arts. But I don't have a particular solution in mind.

A good place to start would simply be to require people to disclose whether AI was used in the making of something, and to what degree. The FTC already regulates the disclosure of advertisements in content, and it's easy to argue that AI creates a similar tragedy of the commons: without mandatory disclosure, the internet will become virtually useless for news proliferation, scientific advancement, and education, which would be catastrophic for our democracy. Good thing that's never going to happen!

Proselon is in a grey area right now. I'm still interested in the technology and think it could be really cool. But I'm concerned about the direction AI content on the internet is going right now and am not interested in making a bad problem worse.

And as great as my custom fiction AI did at writing… it still wasn't the same.

It wasn't necessarily the tool's fault — in my excitement, I'd published the chapters too quickly, trying to make up for my deficits as an extremely slow writer. The AI made it possible to produce more, faster, and I let that pull me forward instead of sitting with the work the way I used to. I think a sufficiently patient writer wouldn't have had the same problem.

I still work in AI transformation every day I drive to the office. I still believe the technology can, in the right hands, be used to make real art. Learning how will be a tremendous opportunity, if the government puts the correct controls in place to make it possible.

I'm sitting at my desk. I have a story in my head that I want to read. The only way to get it out is to write it down. The question is how.

¹ Full disclosure: whether I count as an AI engineer depends on how one defines the title. If you mean 'engineer who uses LLMs to code and create agentic systems,' yes I am. If you mean 'engineer who creates and fine-tunes LLMs,' no I am not. For the purposes of this article, I don't think the difference matters.

² We're on year 3 since I made that prediction and I've been correct so far. But that prediction didn't take a genius.

³ There was something interesting in the pattern: the readers with hard objections were the ones who commented early and quickly. Later commenters, presumably busier ones, didn't seem to mind so much. It could simply be that objectors felt moved to stick up for what's right, or perhaps those with more responsibilities have a more pragmatic perspective. You be the judge.

⁴ Well, my first and foremost concern was over people generating porn in the form of erotica. You might shrug at that. No one is harmed by it, right? AI-generated porn might even spare some trafficking victims in the short term, in that fewer women get abducted or pressed into making porn. But the effect of infinitely available AI-generated pornography on culture and society will only be to reinforce demand for sex workers, the vast majority of whom are human trafficking victims. Not to mention, it's still hazardous to the sexual health of the people consuming it and their partners.


AI was used in the writing of this article.