SuperString Theory

Helping Creatives Thrive at the Intersection of Art and Technology

Why AI Feels Like Cheating - And Why It Isn’t

July 14, 2025 Michael Moroney

Ex Machina (2015) - watching Ava draw

By Michael-Patrick Moroney

There's a recognizable moment in creative work these days: someone hands in a piece of writing, a new piece of music, a design concept, and somebody else leans in and asks, "But did you actually do it?"

It's not a technical objection. It's an ethical one. And it's aimed at something more and more common but still culturally awkward: using AI in the creative process.

The reaction is reflexive. Suspicion. Distrust. Even bitterness. And the metaphor that most often comes to mind - though rarely stated explicitly - is that AI is like a performance-enhancing drug (PED). Not a tool, but a shortcut. A way of bypassing the hard work and cutting straight to the payoff.

Icarus (2017, Netflix doc) - Bryan Fogel testing PEDs

The analogy holds, to a degree. In sports, we treat PEDs as a violation of effort as much as of fairness. A gold medal that's chemically augmented feels hollow. It breaks the narrative we like to tell ourselves: that greatness is earned. Steroids kill that story. The process matters.

AI provokes the same emotional reaction. If a verse is great because a composer used GPT-4, or a student gets an A on an essay because they wrote a brilliant prompt, is the work still valid? We've had few analogs for mental work - maybe ghostwriting, maybe plagiarism - but those are forgeries, not tools. So we reach for the closest feeling: This isn't real. It's fake. It's amplified.

But the analogy fails on closer examination. Because what AI actually does - when used well - is not like doping. It's more like something deeply intimate, even intuitive: flow.

Flow is the mental state in which time distorts, ambiguity disappears, and pieces of thought fall into place with eerie accuracy. Steven Kotler, a researcher who has studied flow for decades, calls it "effortless effort." The science behind it is becoming clearer: in flow, parts of the brain's prefrontal cortex temporarily power down (transient hypofrontality), and a cocktail of neurochemicals - dopamine, anandamide, norepinephrine - floods in. Pattern recognition goes supernova. Risk becomes tolerable. Work becomes play.

This condition is cherished by programmers, athletes, artists. But it's always been rare - something you find by accident, if you're lucky.

AI changes this.

The Queen’s Gambit (2020) - Beth Harmon hallucinating chess pieces

Used with intent, AI models can be flow accelerators. They reduce drag at the junctures that matter most - idea development, organization, editing - and keep you anchored in the task. They offload working memory, handle menial labor, and surface insights that look obvious only once they've arrived. For many people, especially those with existing craft training, the process closely resembles the leading edge of flow: swift, smooth, startling.

Yes, from the outside, that looks suspicious. All you can see is speed. Ease. Output. And if you weren't there for the process, it's easy to assume it was all staged.

But ease is not dishonesty. Creative ease, when it happens, is an accomplishment. The trouble is, it rarely happens. We've built a world that prizes visible hard work, and we instinctively question anything that seems to come too easily, even when it comes from us.

This is where AI breaks precedent. We’ve had machines that replace labor. We’ve had calculators, search engines, spellcheck. But we’ve never had a tool that can collaborate with our thinking process. That’s not a shortcut. It’s something else entirely - what some researchers are starting to call “distributed cognition”.

Gattaca (1997) - Evokes the fear of enhanced intelligence and the ethics of augmentation.

In my own work, I have used AI not to finish pieces of work, but to begin. To prototype thoughts in styles that I do not normally write. To carry out exhaustive research in areas I barely know. To test personal theories - some of which turn out to be wrong, ridiculous, or naive - and others surprisingly well-formed. I have used it to discover non-English perspectives never presented to me (although LLMs need even more non-English data). And most of all, I have used it to be disproven - to find out more quickly that I have a bad idea.

The tool does not supplant judgment or taste. It tests them. It expands the scope. And sometimes it gives you something finer than what you might have written cold.

But that does imply a shift in how we think about creative work. We are not just entering a period of new output; we are entering a period of new cognition. One in which the building blocks of thought - language, reference, style, rhythm - can be manipulated at scale and speed, and where the human role isn't to wrestle but to decide.

We don't have to abandon our standards of effort or originality. But we may need to update them to accept the fact: AI, like flow, doesn't steal the work. It accelerates your arrival at it. It clears the runway. You still have to fly.

And that is not deceptive. That is design.
