Barely a day goes by without another headline about AI: that it’s going to save us or destroy us, revolutionize everything or hollow it out. The discourse tends toward the binary: transformative tool or existential threat. I find myself wanting to offer something more modest. A lived experience, still unresolved.
I’m a geophysicist, and I’ve spent this past semester on sabbatical. For academics, sabbatical is a reprieve from teaching, a chance to step back from the daily machinery of running a research group and go deep on something. My wife calls it vacation. Friends and family say things like “well, you have time, you’re on sabbatical.” They’re not entirely wrong, but the goal, at least, was something more intentional: to sharpen my skills in wave propagation modeling, to develop new research directions, and to come out the other side qualitatively different. Not just more productive, but changed.
I think that’s happened, though not entirely in the ways I expected.
My understanding of how low-frequency sound moves through the atmosphere has genuinely deepened. I have tons of new research ideas. But the transformation I didn’t anticipate was learning to think with AI rather than just use it. A lot of people would argue that enfeebles you. I’m finding the opposite.
But here’s what keeps me honest about that. My graduate students have access to the same tools I do. They can generate figures, draft text, run analyses. And yet meaningful scientific progress eludes them. I don’t think this reflects badly on them. Yet. They’re early, they’re developing, they’re building the very intuitions and judgment that AI will eventually amplify. The hybrid only works if there’s something substantial on the human side to begin with. They’re still building that. I spent decades building mine.
A colleague in the arts said something to me once that I’ve thought about ever since. Students increasingly choose degrees based on vocation, on what will pay, on what will land a job. That’s not irrational; education is expensive and the world is uncertain. But it means that science and art occupy an odd space together, both committed to something harder to justify on a spreadsheet: a search for truth, for understanding, for making sense of our place in the universe. Not purely because it’s useful, but because it matters.
Which raises the question I can’t stop turning over: what exactly is the human side?
Here’s a small example that gets at it. Infrasound propagation can be described mathematically, and some of my colleagues would say that’s all it is: applied math. But when I think about a problem, I don’t just run equations. I build mental images. I imagine waves moving, bending, scattering. I try to get a felt sense of what the wavefield is doing. That’s how I figure out which questions are worth asking. It’s where, for me, the interesting science lives, in the gap between what the math says and what my picture of the phenomenon suggests.
Can AI do that? I’m not sure. It can compute a wavefield. The other day, I successfully vibe-coded a finite difference infrasound propagation code that might otherwise have taken years to develop. But whether it can understand one, whether understanding requires being an embodied creature that has experienced physical space, sound, the world, I genuinely don’t know.
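For readers curious what “computing a wavefield” actually involves, here is a minimal sketch of the kind of scheme such a code rests on: a leapfrog finite-difference solve of the 1-D acoustic wave equation. This is not the code from my sabbatical, just an illustration; every name and parameter here is my own choice for the example.

```python
import numpy as np

def propagate_1d(nx=400, nt=600, c=340.0, dx=1.0):
    """Leapfrog finite-difference solve of the 1-D acoustic wave
    equation u_tt = c^2 u_xx, starting from a Gaussian pulse."""
    dt = 0.5 * dx / c                      # time step within the CFL stability limit
    r2 = (c * dt / dx) ** 2                # squared Courant number
    x = np.arange(nx) * dx
    u_prev = np.exp(-((x - nx * dx / 4) / (10 * dx)) ** 2)  # initial pulse
    u = u_prev.copy()                      # zero initial velocity
    for _ in range(nt):
        u_next = np.zeros_like(u)          # fixed (zero) boundaries
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next
    return u

field = propagate_1d()                     # the pulse splits and travels both ways
```

The machine executes these updates flawlessly and indifferently; the mental image of the pulse splitting, traveling, and inverting as it reflects off a boundary is the part I still supply myself.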
That uncertainty sharpens when I think about Kasparov. His book Deep Thinking is worth reading if you haven’t. After Deep Blue defeated him in 1997, the consensus settled on a comfortable middle ground: human-AI teams would be the ceiling. The best chess wouldn’t be played by machines or humans alone, but by both together. That turned out to be true. For about a decade. Then the machines left us behind entirely. What strikes me about Kasparov’s account is how much that affected his sense of identity. It’s a humbling story, and not just about chess.
So when I tell myself the hybrid is the answer, I wonder how much weight that observation can bear.
Here’s the argument I can’t get around: there’s no principled reason intelligence has to be biological. Nothing in our brains is specially imbued, no property that physics and chemistry can’t in principle replicate. The current AI systems aren’t doing independent science yet. I’m not sure how long that stays true.
What I keep returning to is something harder to name. It’s not intelligence; maybe AI already has that, or will soon have way more of it than us. It’s something more like presence. We exist. We experience. We are embedded in the universe we’re trying to understand, which means we have a place in it and a stake in making sense of it. Whether that matters for science, whether presence is a scientific input or just a consolation, I genuinely don’t know.
But I suspect my colleague in the arts would have something to say about it.
Which brings me to something I should probably disclose. I wrote this essay with AI. We went back and forth. I brought the ideas, the experience, the questions I couldn’t resolve; it helped me find the shape of the argument and the words to carry it. The Seismological Society of America and several other journals now ask authors to declare AI use in submitted work. While I understand the impulse, I think it mistakes the tool for the thought. Did we have to disclose when we used calculators? Spell-checkers? Google Scholar at midnight?
The anxiety, I think, isn’t really about AI. It’s about authenticity, about whether the knowledge and the ideas are genuinely yours. That’s a fair concern. But if science is fundamentally about truth, about understanding our place in the universe, then how you get there is secondary. The question is whether you got there. These ideas are mine. The AI helped me find the words for them. I’m still the one who has to live in the universe I’m trying to understand.