Did I write these words? Does it matter if I did?
And in what ways does it matter? To you, as the reader? To us as a society of people rapidly building constructs smarter than us? To me, shifting modes of being and thinking entirely even though I have always found clarity through writing?
The deeper I go in AI, the more I've been thinking about that question. It's something that affects us all (or will soon).
The act of starting from scratch, the possibility and peril of the blank page and its transformation into something, has been a uniquely human act of creation.
And yet writing these words (as I am in fact doing) is quickly becoming rare. Most of my writing and research today is in dialogue with a computer, or more specifically "the robot," as my four-year-old son refers to the orange Claude Code icon. In fact, if my workflow is a harbinger for others, the blank page is dying, soon to be lumped in with other quaint methods of communication and creation, like, say, writing a letter.
I don't love that this death has begun, but I mostly love the new creative process that is emerging.
It is fast. It is interesting. I can cover ground, close distances between intention and action, do more of what I want to do, have always wanted to do. It is addictive. And given the months of work on building context, engineering workflows, and attempting to transfer my knowledge, preferences, and desires into a system that "feels" like me, I believe the output is good, or barring that, it is at least a trustworthy starting point.
Yet there's something missing from this method of creation.
There is power and clarity from the toil of word selection, from the crafting of narrative, from the outlining of coherence. There is something fundamentally different in sitting down with a blank piece of paper and building out of nothing. Does forgoing this matter?
To you—maybe. You will judge me on the quality of the individual "thing" you read.
Is it good? Better even than what I would create alone? Does getting more of my writing over time overcome any gaps that might exist in an individual piece at a point in time? Is what I'm producing original? Duplicative? Have you seen the same narrative before in countless different varieties of vanilla?
That matters. But with due respect to you, the reader, the more interesting question is what it does to us when we engage in creation.
First, there is the very idea that the vast majority of my creativity, effort, and intellectual debate now happens with a computer. I very literally stack 14-hour days spent with multiple agents running across different jobs and tasks for hours at a time. It creates undeniable leverage, and yet the process is deeply individual. On this measure at least, I am actively more productive at home with three monitors than at the office, surrounded by colleagues.
Whatever communal knowledge or learning might have occurred previously is now trapped and transformed within dialogue with a machine. This is inherently not great for us as social animals, and risks a stagnation in the cumulative contributions we make to one another over the long term. It definitely contains the seed of a very dystopian pixelated future that doesn't require the AI to get any smarter than it already is.
The next potential issue in this new creative process is that the machine I'm spending 14+ hours a day with is not neutral. It comes into the process with its own training and biases and preferences, all of it transformed through this black box that we barely understand. The scariest part is that I know my thoughts and thinking and ways of working and outputs are being shaped and crafted by something I don't understand. And the more I rely on it, the more I weaken the corpus of "original, pre-AI" material I feed the AI, the more what it thinks is "Victor" blends together with the AI-generated outputs I've accepted.
Am I, are we all, being pushed to the mean? Given that our tools fundamentally shape how we see the world, what does it mean when we don't understand how?
The third dimension of change lies in the nature of the work. Writing is, for most of us, defined by struggle. We wrestle with the ideas and how to translate them. We spend days clarifying our thinking, sharpening our syntax. We procrastinate. We sit. We stew. And occasionally, something of value emerges.
Struggle is valuable. It takes time and work to craft something of value. Originality is to be treasured.
AI compresses the first two, and at least for now is not very good at discovering new things, limiting its value for the third.
If this has all been sort of vague and high-level, perhaps a concrete example would help.
Yesterday I built an AI prototype to codify the pedagogy of a religious institution, layering decades of faculty work on top of the general knowledge base of foundational sacred texts in their religious tradition in order to capture the voice and wisdom of their current leadership. I envisioned the project as a way to democratize that wisdom for anyone who might not otherwise be able to access it directly.
But just as I was setting the stage to share this work, the AI itself asked: "Is this a shortcut to faith? Are we losing anything?"
Yes. Obviously yes.
What we lose is Rabbi Irwin in my apartment eating sushi, carefully selecting passages during my pre-marriage journey. That warmth. That presence. That nuance. The human act of transmission. The building of community.
We know we trade connection for scale. What we don't know is how that will manifest over time.
If I were using AI, here is where I would ask it to help me transition us to a final stopping point, since I, the human writer, have run out of steam. But I won't do that.
For now, here's what I know. I wrote this piece the "old" way. And it was harder. And slower. And I'm not sure it's better. But I do know I've spent more time thinking more deeply about this one topic than I might have otherwise. So when the AI asked "are we losing anything," the better question might have been: what are we willing to sacrifice?
That's something we'll all have to decide on. One blank page at a time.
