
I was born seven weeks early and spent my first week in an incubator. In 1975 this meant a plastic shell around me with a convection heater controlled by sensors attached to my body to keep me perfectly warm. The NICU was noisy and stayed brightly lit twenty-four hours a day so the doctors could do their work. The machine gave me a controlled environment designed to prevent infection and support undeveloped organs, but it came at a cost. I lost the nurturing touch of my mother, the oxytocin from skin-to-skin contact, the seeding of gut flora from breast milk, the synchronization of brain waves between mother and child. All the intangibles that cannot be quantified or replaced by a machine.
I had a twin sister who was born with underdeveloped lungs. She stayed three weeks in her incubator while my mother, a nurse, demanded they let me go home after one because I was healthy. Our childhoods diverged from there: she was sickly, I was robust. That was the story, but several years later, cracks began to appear in my resilience.
We shared a bedroom until we turned seven. She could not fall asleep unless she held my hand, so I would hold her hand until she drifted off and then go to sleep myself. After we switched rooms I struggled to sleep, so I would wait until the house got quiet at night, turn on the light, and stare at the ceiling for hours, stimulating my nervous system until I eventually crashed. The light, paradoxically, calmed me down.
As I got older I found other ways to stay up. First reading into the wee hours, then homework, then Tetris, then computers. The bright, interactive screen offered endless opportunities for stimulation. I stopped skateboarding, inverted my sleep schedule, and dropped out of high school, and soon I collapsed into a deep depression. One night I watched Tombstone, a Western starring Val Kilmer as Doc Holliday, a college-educated drunk who played classical piano, and it inspired me to trade my screen addiction for piano and wine. This was the beginning of a cycle that repeated for decades: total immersion in a technology, followed by total rejection of that technology. It ended for practical reasons when I surrendered to a career in tech after a back injury took away other prospects.
Now tethered to technology to pay my bills, I began searching for ways to process my tech addiction without completely abandoning it. Around 2010 a chance encounter led me to the book Technopoly by Neil Postman, which gave me an intellectual framework for my relationship with technology. Every new technology is a bargain and comes with great trade-offs. It gives something and takes something away. Writing preserved knowledge but weakened memory and ended the oral tradition. The clock organized society but estranged us from the light-dark cycle of the sun.
My own story fit in perfectly. In my first week of life the incubator had given me stability, but taken human warmth, the regulating presence that only a mother's body can provide. Light itself became the symbol of my relationship with technology: illumination, clarity, and stimulation on the one hand; sleeplessness, dysregulation, and disconnection on the other.
As I contemplated these examples and related them to my own experience, it became clear to me that the question is rarely whether a tool is good or bad. The question is what does it give, what does it take, and is the trade-off worth it? With the rise of AI, this tension is revealing itself in a major way. It is noteworthy, then, that even while people are rapidly adopting this new technology, there is widespread unease about its implications. Postman may now be a largely forgotten cultural critic, but his thesis is emerging organically through people's own experience.
AI content is rapidly taking over the internet, and it is making inroads offline as well. Anyone with even basic pattern recognition skills can spot it a mile away. "AI slop," as it is known. You see it in email marketing campaigns, ad content, newspaper articles, even politicians' public statements. It is omnipresent. People warn of impending model collapse, which will occur when AI is consuming more AI-generated content than human-created content. I think there is a greater risk, though: I am starting to see organically created human writing take on the tropes of AI slop. I see it in my own writing. It is a mind virus slowly infecting everyone. The greater danger is that we lose touch with writing that comes from a real place of human inspiration.
The mess we're in with AI content frustrates me to the core. And honestly, anyone who knows me would probably assume I'd avoid the technology entirely, but that isn't where I've landed. I believe our best bet to push back against AI-driven sameness is to use AI consciously: not as something that speaks through us, but as something that helps us speak with our own voice more clearly. That's exactly how I used it while shaping this essay.
I have always struggled with writing. I skew toward abstract thinking: strong conceptual perception, intuitive leaps (connections that feel obvious in my body but not always on the page), and unmasking insights, but I have been much weaker at expression. I tend to over-explain, repeat myself, wander, and chase tangents that flow in a way that is obvious to me but may be impenetrable to the reader. This is great for processing my thoughts and working out my ideas, but less ideal for communicating them effectively to others.
Now, rather than scanning, re-reading, consulting dictionaries, and so on, I have created a single prompt that I use to get instant feedback on my writing. I get direct highlighting of the areas that need work, and suggestions for improvement. It is undeniable that something is lost in this process. Rereading an essay isn't just a tedious necessity; it forces my mind to slow down and reinhabit the mental state I was in when I wrote it, allowing insights to rise that weren't available the first time through. But this is one more area where I am weak. It might take me five readings to spot the typos, misphrasings, and confusing threads that someone like my mom will catch immediately. I would rather have human editors, but I don't. So AI may have taken away the slow, reflective deepening that comes from rereading, but it has given me the ability to finish a piece, and that is irreplaceable.
Even as I rely on it, though, I can feel the contradiction building. So is this hypocrisy, or is it a bargain made consciously?
There is a larger story here about the democratization of content creation. I spent years in a band during the DIY music era of the 1980s and 1990s. Punk, indie, and rap declared that anyone with even marginal skills could make music. Four-track cassette recorders, affordable mixers, and samplers meant you no longer needed a professional studio. Rap was built on turntables and records, recombining fragments of existing music. Skill bent the limitations into something new.
In the late 2000s I could feel the ground shifting again. GarageBand removed the constraint of skill altogether and let people produce music without touching an instrument. Sampling plus software fully democratized creation, but it began a long, slow march toward uniformity in the sounds we hear on the radio.
The tools we use shape the culture we live in. While the gains are hard to deny (more people are creating music than ever before), the losses are real: homogenization, as common tooling makes all music sound similar. Algorithms reward familiarity and create feedback loops. Working alone on laptops has eliminated the friction and collaboration between different personalities and musical sensibilities (people in the same room, nervous systems and emotional textures interacting chaotically) that once created the unique dynamics from which great artists emerged.
Now we have gone a step further. AI brings "creation" to people who need only type a prompt. Certainly there is value in knowing how to ask good questions, to extract the most value from these tools, but the problem is that much of what gets produced this way feels hollow. Videos now flood social media of people pantomiming performances of music tracks where the lyrics and music have been entirely generated by AI. The human element is almost completely absent.
When I first started using AI, I did it without thinking. I watched myself slide into that passivity for longer than I'd like to admit. I look back at that writing and cringe, but I leave it there as a reminder: there is danger in letting tools use us, in letting their logic override our own. When we remain passive consumers who at best compose prompts, we eliminate what is human about creation. As algorithms consume their own output as training data, originality dilutes. We lose touch with what is human. But if we use these tools with intention to augment our abilities and shore up our weaknesses, while retaining our own voice, we can hold on to what is human.
In a world where more and more human acts are mediated by artificial intelligence, the most radical act is staying connected to what is irreducibly human while using whatever tools help us express that most fully. I don't believe we'll achieve that by rejecting this technology. We face an avalanche of AI-generated content that threatens to bury authentic human expression. We can't compete on scale, but maybe we can establish a legitimate foothold of genuine humanity. Some days it feels like shouting into a storm, and it seems easier to retreat offline and forget about it, but it will follow us offline regardless. And we do not need a majority of content to be of human origin to prevail. In a country where five percent of the population is Muslim, ninety percent of the meat is halal. A small, disciplined, inflexible minority that demands something more has the power to shift the entire landscape.
What does it mean to create from purely human sources? I spent a year rejecting electric instruments. I wanted vibration from natural materials responding to touch. I sit at the piano and engage hammers striking copper-wound metal strings, ivory keys worn smooth by countless hands. The instrument is as much about limitations as possibilities. I needed a keyboard that was portable and used no electricity. I learned that Civil War chaplains carried suitcase-sized reed pump organs to battlefields. They were once pricey antiques, but digital sampling had made them nearly worthless. I found one cheap. One note, an A, stuck open and droned. The constraint forced chords that could live with a persistent A droning over them. I wrote melodies that moved around the fixed note. Soon after, a song came to me fully formed. I called it "I'm Electric (Coming Back to Life)." It became the foundation for an entire album about the process of thawing out my frozen nervous system.
That stuck note made me think about how other composers handled limits. The piano is a machine humanized over centuries. Each piece of wood shaped by human hands. Every string tuned by ear. Every mechanism designed to respond to the subtlest human touch. Yet unlike most instruments, its notes are fixed and cannot be bent. It is also impossible to sustain a note indefinitely; once a hammer strikes, decay begins immediately. When I play I channel centuries of composers who accepted these limits and pushed the instrument to its breaking point. Bach explored the mathematical possibilities of counterpoint, perhaps most fully realized in The Art of Fugue's Contrapunctus XII and XIII, the mirror fugues: pieces that can be played upright or inverted, mirrored note for note. Cecil Taylor deconstructed harmony and rhythm to discover new percussive realms, exemplified in Free Improvisation #3.
AI has constraints too, hidden beneath its illusion of infinite possibility. LLMs are, at root, statistical systems that merely recombine words that already exist. A model can only move within the space of its training data. Human minds can step outside that boundary entirely, following their intuition to places data could never lead them.
One story that stays with me whenever I think about human insight comes from the physicist James Clerk Maxwell, who helped lay the foundations of our understanding of electromagnetism. In 1865 he had a moment that captures what I mean. He was sitting by a fire, watching the flames shift and scatter light, probably lost in that absentminded way I get when I am deep in thought, when something in their movement connected with an idea he had pondered for years. In the quiet of that moment the pieces finally aligned, and in a single conceptual leap he realized that light and electromagnetic radiation were the same phenomenon.
Nothing in the data of the day could have predicted what he saw.
No recombination of existing physics could have produced his insights.
That vision could only come from a human mind in direct contact with something beyond statistical analysis. It came from consciousness itself, in tune with the underlying order of the universe: a resonance between a human mind and nature that no model can approximate, let alone connect to directly.
The broken organ with its stuck note became a metaphor for working with constraints rather than against them, finding beauty in limits instead of endless possibility. Humans and tools have danced this way for centuries. The piano embodies it. Our task is to identify the true limitations of AI, push against them, deconstruct them, and treat them as constraints that force us toward the fundamental creativity only humans can access.
In each case, the pattern repeats: constraints shape creative breakthroughs. We will never out-produce the avalanche of AI content, but we can keep one truth illuminated that no algorithm can reproduce: there is a creative light inside every human being. In my case, a human born prematurely, stuck in a brightly lit cube, and left with a frayed nervous system, now finally at home in his own skin, creating something that only a human could create.