Stop Debating Whether Writers Should Use AI. Ask What the Writing Is For.
The journalist meltdown over Megan McArdle tells us more about identity than craft.
A new word entered the discourse last summer, and it tells you everything you need to know about where this conversation actually lives. The word is “slopper” — coined on TikTok, popularized by Rusty Foster’s Today in Tabs newsletter, and deployed with the kind of gleeful contempt usually reserved for people who microwave fish in the office. A slopper is someone who uses ChatGPT for everything. Ordering dinner. Writing emails. Thinking, or whatever passes for it.
Foster didn’t invent the word, but he sharpened it into a weapon. When Washington Post columnist Megan McArdle mentioned on X that she uses AI to research, transcribe interviews, generate pushback on her arguments, and fact-check her columns, Foster’s response wasn’t to engage with her workflow. It was to question her entire career. Twenty-five years of bad opinions, he wrote — no wonder she’s happy to hand the work to a machine. Then the kicker: “Kind, good, happy, secure people never go AI.”
That’s not an argument about writing. That’s a purity test.
And it’s worth sitting with for a minute, because the entire debate over AI and writing has this same structure — moral intuition dressed up as craft criticism, with nobody stopping to ask the question that would actually resolve it.
The Question Nobody’s Asking
Here’s what happened. McArdle posted about her AI workflow. The internet lost its mind. A Rutgers philosophy professor said she should be fired — “at the very least,” a qualifier that invites you to imagine what punishments he considers proportionate for using a search engine with better syntax. Charlotte Alter, a journalist, offered the most elegant version of the critique: “Research is thinking. Outlining is thinking. Writing is thinking. Any portion of that done by AI is less thinking done by you.”
Then Richard Hanania wrote a piece for the Boston Globe arguing that writers should be allowed to use AI not just for research but for composing text — that the ideas are what matter, and the writing is just the delivery vehicle.
McArdle herself wrote a column explaining her approach and pushing back on the outrage. Mathew Ingram, a veteran media journalist, pushed back on the purity-test framing while defending McArdle’s actual practices as reasonable. Katie Parrott at Every opened up her entire AI-assisted drafting process to show that writing with AI involves more friction, not less — more decisions, not fewer. Becca Rothfeld, the Washington Post’s nonfiction book critic, argued that McArdle was violating the Post’s own standards policy. Marisa Kabas at The Handbasket went further, calling the whole thing symptomatic of an AI-poisoned future of journalism.
Everyone is talking past each other. And they will keep talking past each other, because they’re arguing about a tool when they should be arguing about a job.
The question isn’t “Should writers use AI?”
The question is “What is this writing for?”
Two Jobs, One Word
Here’s the problem: we use the word “writing” to describe two fundamentally different activities, and then we argue about them as if they’re one thing.
Job One: Conveyance. You have information, ideas, or instructions in your head. You need them in someone else’s head. The writing is a means to an end — a vehicle. Emails, reports, memos, marketing copy, most journalism, documentation, business correspondence. The measure of success is whether the cargo arrived intact. Did the reader understand the thing? Good. The writing worked.
Job Two: Art. The writing itself is the point. Poetry, literary fiction, personal essays, the kind of journalism where the voice and the thinking are inseparable from the content. You’re not moving cargo from A to B. You’re building something that didn’t exist before, and the how of the building is the what of the product. A Mary Oliver poem summarized in bullet points is not a Mary Oliver poem. The container is the thing.
Most writing is Job One. Almost all of it, actually. The overwhelming majority of words produced in a given day — across every industry, every profession, every inbox — exist to move information from one place to another. They are functional. They are utilitarian. They are, in the most literal sense, a means to an end.
This is not a value judgment. Utilitarian writing can be done well or badly. A clear, well-structured memo is a beautiful thing in its own way. But its beauty is instrumental. You admire it the way you admire a well-designed bridge — for how well it does its job, not for its existence as an object.
The small fraction of writing that qualifies as Job Two — where the process of writing is itself the product — operates on completely different logic. When Alter says “writing is thinking,” she’s describing something real. For an essayist working through an idea, the act of writing is the act of reasoning. Outsourcing the prose isn’t saving time on a task. It’s skipping the task entirely.
Both of these things are “writing.” They share a word the way “bank” means both a financial institution and the side of a river. And the AI debate is two groups of people standing on opposite banks, shouting about the same word, meaning completely different things.
What the Camera Did
This has happened before. Not with words, but with images.
When photography arrived in the mid-1800s, it detonated the art world — but not in the way people expected. The panic was that the camera would kill painting. It didn’t. What it killed was painting-as-documentation.
Before photography, if you wanted a visual record of something — your face, your estate, a battlefield, a botanical specimen — you hired a painter. Many painters were essentially human cameras. They had technical skill, they could render reality with precision, and they were paid to produce accurate visual records. This was a job. An important one. And the camera made it obsolete almost overnight.
But here’s what didn’t happen: the camera didn’t kill painters who painted because the painting itself was the point. The Impressionists didn’t stop. The Expressionists hadn’t started yet. If anything, photography freed painting from documentation duty. Once you didn’t need a painter to record what things looked like, painters could explore what things felt like, what they meant, what they looked like when you stopped pretending objectivity was the goal. Monet didn’t lose his job to the camera. The camera made Monet possible.
The painters who got replaced weren’t the artists. They were the conveyors — skilled professionals doing Job One with brushes, who got outperformed by a machine that did Job One faster and cheaper.
AI is doing the same thing to writing right now. And the people panicking are making the same mistake the portrait painters would have made if they’d argued that cameras would destroy all visual art, instead of recognizing that cameras would destroy their particular job within visual art.
The Identity Problem
So why is this so hard to see? Why can’t smart people — journalists, professors, professional writers — just sort their work into Job One and Job Two and act accordingly?
Because the sorting threatens something deeper than workflow. It threatens identity.
Here’s the uncomfortable truth: a lot of people who think of themselves as writers — capital-W Writers, people whose professional identity is built around the craft of putting words together — are actually doing Job One most of the time. They’re packaging information. They’re conveying arguments. They’re summarizing research and structuring narratives and explaining complex things in accessible language. These are valuable skills. They are also, fundamentally, conveyance.
When Alter says “Research is thinking. Outlining is thinking. Writing is thinking,” she’s describing her experience of her own process — and she’s probably right about it. For a features journalist working through a complex story, the writing process is genuinely intertwined with the thinking process. But that’s not a universal truth about all writing. It’s a truth about her writing, and maybe about a relatively small category of writing that operates the same way.
The person drafting a project update is not thinking-through-writing in the Charlotte Alter sense. The person writing marketing copy is not on a journey of intellectual discovery. The lawyer drafting a contract brief is not exploring the human condition. These people are doing Job One — important, skilled, valuable Job One — and AI makes Job One faster.
But if you’ve built your identity around being a Writer, and someone points out that most of what you do is conveyance, that’s an existential threat. It’s not that the tool is bad. It’s that the tool reveals something you didn’t want to see about the nature of your work. Noah Smith described this dynamic when software stocks crashed earlier this year — the fear wasn’t just about lost revenue, it was about the end of an entire professional identity built around technical expertise. Writers are having the same reckoning, just louder.
This is exactly what happened to portrait painters. The ones who panicked about the camera weren’t worried about art. They were worried about themselves. The camera didn’t just threaten their livelihood — it reclassified their work. Yesterday you were an artist. Today you’re a human Xerox machine, and the Xerox machine just got invented.
Foster’s “slopper” framework makes a lot more sense through this lens. It’s not a craft argument. It’s a tribal boundary. Clean people don’t use AI. Dirty people do. The line isn’t about the quality of the output — it’s about the moral status of the producer. And once you’re drawing moral lines around tool usage, you’ve left the territory of craft criticism and entered the territory of identity defense.
The Honest Position
Let’s be direct about where this lands.
If you’re doing Job One — conveyance — and you refuse to use AI on principle, you’re not protecting craft. You’re protecting a self-image. The economy has never paid a premium for artisanal status updates. Nobody ever got promoted for hand-crafting an email that could have been drafted in thirty seconds. The reckoning was always coming. The typewriter replaced the scribe, the word processor replaced the typewriter, email replaced the memo, and now AI is replacing the part of writing that was always, honestly, just packaging. Fighting it on principle is like insisting on hand-addressed envelopes for your gas bill.
If you’re doing Job Two — art, genuine essay writing, literary work where the process is the product — then AI defeats the purpose, and you should say so clearly. Not because AI is bad, but because using it for art-writing is like hiring someone to run your marathon. You can cross the finish line that way, but you didn’t do the thing. The thing was the running. If you write to think, and you outsource the writing, you’ve outsourced the thinking. Alter is right about this. She’s just wrong to universalize it.
And if you’re Hanania — arguing that anyone should be able to use AI to compose text because the ideas are what matter — you’re half right. For Job One, absolutely. If someone has good ideas and bad prose, AI is a reasonable accommodation. We don’t insist that people with mobility issues take the stairs to prove they really wanted to reach the second floor. But Hanania’s position breaks down for Job Two, because for art-writing, the prose is the idea. You can’t separate them. A well-composed essay isn’t a good idea wrapped in nice sentences. The sentences are the thinking.
The McArdle position is the most defensible and the least interesting. She’s basically saying she uses a very good search engine with extra features. Nobody should be scandalized by this, and the fact that people are tells you the debate was never really about her workflow.
Some writers already get this intuitively. Noah Smith put it bluntly: he thinks AI could probably be trained to write like him, but he’s not sure why anyone would want it to. That’s a Job Two writer who understands exactly what makes his work valuable — and it isn’t the efficient arrangement of words on a screen. It’s the thinking that produces them.
The Question That Resolves Everything
Here’s the test. Before you argue about whether a piece of writing should involve AI, ask one question:
If a machine had produced this text instead of a human, and you couldn’t tell the difference, would it matter?
For a project status update? No. The update is cargo. If the cargo arrives, the job is done. Use whatever vehicle gets it there.
For a poem? Yes. Absolutely. Because the poem isn’t the text on the page — the poem is the human act of wrestling language into meaning. A machine-generated poem that reads identically to a human one isn’t the same thing. It’s a replica of the output without the process, and for art, the process is the product.
For a news article? Depends. Straight reporting — who, what, when, where — is mostly conveyance, and AI-assisted conveyance that’s accurate is fine. A reported essay where the journalist’s perspective and voice are the value proposition? That’s closer to art. The writer’s thinking process matters.
This isn’t a bright line. It’s a spectrum, and reasonable people can disagree about where any given piece of writing falls on it. But the spectrum itself resolves the debate, because it reveals that the McArdle defenders and the McArdle critics are both right — about different things. They’re just too busy performing their team allegiances to notice.
What I Might Be Wrong About
The binary might be too clean. Most writing lives somewhere in the middle of the conveyance-art spectrum, and my framework might undercount how much thinking-through-writing happens even in utilitarian contexts. Maybe the lawyer drafting a brief is doing intellectual work that’s inseparable from the writing process. Maybe the marketing copywriter is making creative choices that constitute a kind of art. If that’s true, then the “just use AI for Job One” advice is too glib, and the real answer is something more granular — something about which specific cognitive tasks within a piece of writing benefit from human friction and which don’t.
I’m also not sure the photography analogy holds perfectly. Painting and photography are different media — the camera didn’t produce paintings, it produced photographs, a genuinely new thing. AI writing produces... writing. The same medium. That might matter in ways I haven’t fully thought through. When the output is indistinguishable from the human version, the identity questions get harder, not easier.
And there’s a downstream problem I’m skating past: if AI handles all the conveyance writing, where do young writers learn the craft that eventually lets them do art-writing? The journalism pipeline has always been: do a lot of Job One, get good at sentences, eventually develop a voice, graduate to Job Two. If AI eats the bottom of that pipeline, we might get a generation of people who want to write essays but never built the muscle. That’s not an argument against AI. But it’s a real cost, and pretending it doesn’t exist is its own kind of denial.
There’s also a version of AI skepticism that my framework doesn’t address at all — the Freddie deBoer position, which isn’t about purity or identity but about whether AI is actually good enough to matter yet. DeBoer’s been saying, essentially: stop telling me what AI will do and show me what it’s doing now. That’s a different and maybe more interesting objection than the moral one. If AI writing is still mediocre — and for a lot of use cases, it is — then the whole conveyance-vs-art framework is premature. I don’t think it is. But I’d rather engage that argument than the one about whether McArdle is a slopper.
The Real Debate
The AI-and-writing conversation will keep going in circles until people stop treating “writing” as a single thing and start asking what each piece of writing is actually for. The tool isn’t the issue. The job is the issue. And until you’re honest about which job you’re doing — conveyance or art, cargo delivery or construction — you can’t have a coherent opinion about whether AI belongs in the process.
The camera didn’t kill painting. It killed painting-as-documentation, and it freed painting-as-art to become something it couldn’t have been before. AI won’t kill writing. It will kill writing-as-packaging, and the writers who survive will be the ones honest enough to ask which kind of writing they were doing all along.
Tyler Cowen may have already shown us what the resolution looks like. His recent generative book on marginalism is written entirely by him — every word, no AI drafting — but it’s designed so readers can use an attached AI to interrogate the ideas, generate their own indexes, and explore connections across chapters. Job Two on the creation side. Job One on the consumption side. The writer does the thinking. The machine helps the reader access it. That’s not a compromise. That’s a division of labor that respects what each party is actually good at.
That’s the question the McArdle discourse is dancing around. Not “Should she use AI?” but “What was the writing for in the first place?”
And if the answer makes you uncomfortable, the discomfort isn’t about the tool.