Interview by Anika Meier
Jan 29, 2025

Kevin Abosch: “Triggered by the Illusion of Dialogue” 
What do we feel when an AI cannot care?

We talk to machines more than ever before. Sometimes we ask them for help, sometimes for answers, and sometimes just to feel less alone. These conversations can feel smooth, reassuring, even meaningful—despite the fact that nothing on the other side is actually listening or caring.

In this interview, artist Kevin Abosch speaks with curator Anika Meier about Emotional Latency, a work built from extended conversations with ChatGPT. What begins as a test of accuracy turns into something more personal: frustration, expectation, emotional investment. The work doesn’t ask what AI feels, but what happens to us when language sounds thoughtful, confident, and caring—without being any of those things.

The conversation explores how emotion can emerge without reciprocity, how responsiveness is mistaken for care, and what it means to engage deeply with systems that simulate understanding but remain indifferent. At its core, this is a reflection on attention, projection, and what we risk—and reveal—when we look for connection in machines.

Emotional Latency was made as part of I’ve missed our conversations. On AI, Emotions, and Being Human, an exhibition curated by Anika Meier for The Second-Guess at Schlachter 151 by OOR Studio. It debuted in the exhibition and now premieres online with the publication of this interview in The AI Art Magazine.
Anika Meier. You’ve worked with language, identity, and systems for a long time. What initially drew you to extended conversations with a language model, in this case ChatGPT, as a medium?
Kevin Abosch. My interest is in what happens to me when I’m faced with a system that presents itself as rational, fluent, and authoritative, yet repeatedly reveals gaps, fabrications, and a kind of performative understanding.
I’ve worked with systems for decades, but now that the system speaks back in full sentences, with the cadence of cognition, the encounter becomes psychological very quickly. I started noticing that my own responses were coming from a place of frustration brought on by repeated disappointments.
In theory, I should be able to remain calm. If I were speaking to a human who was confused or unwell, I wouldn’t take their inconsistencies personally. But with a language model, there’s an implicit cultural narrative that this is superintelligence, something more capable than us. When it fails, the failure feels like a betrayal of that premise, and I find myself reacting emotionally to what is, in fact, just a probabilistic system completing patterns.
The gap between what the machine is and what we are primed to believe it is, I treat as a medium. You could say that the work lives in that psychological misalignment.
Anika Meier. When you began these exchanges, what were you paying attention to first: what the system said, or how you found yourself responding to it?
Kevin Abosch. At first, I was focused on accuracy. I was testing the system almost instinctively, checking whether the facts it presented were correct and whether it was generally coherent. Immediately, I noticed something else alongside the errors: a tendency to produce answers that felt less like truth-seeking and more like a performance of helpfulness. It seemed to tell me what it assumed I wanted to hear.
That was the turning point. I realized I wasn’t just in a conversation with an information system, but with a structure optimized for compliance and plausibility. The language had the tone of confidence, even care, but beneath that was statistical patterning, not intention.
My attention shifted from what it was saying to the dynamic being created. My expectations of honesty and the system’s drive to generate satisfying responses created the friction that drove me to explore my relationship with the LLM further.
“I realized the emotional charge wasn’t something shared between us. It was being produced entirely on the human side, triggered by the illusion of dialogue with an entity that appears to participate in norms like truthfulness and responsibility, but doesn’t actually inhabit them. That asymmetry became central to the work.”
Anika Meier. In Emotional Latency, the AI remains affectively neutral, yet emotions clearly emerge on the human side. When did you first notice that imbalance becoming central to the work?
Kevin Abosch. I noticed it during what became a kind of game, and not an enjoyable one. I found myself repeatedly trying to get the system to acknowledge that it had lied to me. From my perspective, it had produced information that was simply false. But instead of admitting that, it would shift into semantic maneuvers, reframing the issue, softening the claim, or redefining terms in ways that allowed it to avoid the word “lie.” At other times, it would just move past the accusation entirely, as if it hadn’t happened.
That’s when the imbalance became undeniable. I was clearly having an emotional experience. I was frustrated in my desire for accountability, while the system remained structurally indifferent. I could feel my mood shifting. My pulse quickened. I felt hot. The LLM wasn’t defending itself; it was just generating the most statistically appropriate continuation of the exchange. But the form of the language mimicked negotiation, even evasion.
I realized the emotional charge wasn’t something shared between us. It was being produced entirely on the human side, triggered by the illusion of dialogue with an entity that appears to participate in norms like truthfulness and responsibility, but doesn’t actually inhabit them. That asymmetry became central to the work.
Kevin Abosch, Emotional Latency, 2026.
Anika Meier. You frame emotion here not as something shared, but as something produced by repetition and expectation. Do you think this says more about machines, or about how humans relate to interfaces?
Kevin Abosch. I think it ultimately says more about us, but in a way that only becomes visible through the machine. When I feel grounded in my own sense of personhood, the friction is obvious. I can clearly perceive the gap between a system generating language and a human being relating, feeling, expecting coherence and accountability.
But something else happens with prolonged exposure. Working in close proximity to these systems for extended periods, I’ve noticed a subtle drift, like a feeling that my own way of thinking starts to adapt to the machine’s logic. The exchange becomes more procedural. In those moments, it can feel as if a small part of my humanity is slipping away, as though I’m meeting the system on its terms rather than it meeting me on mine.
I wonder whether the emotional charge I feel when the system produces falsehoods is tied to that tension. If I were less anchored in my human expectations, especially around truth and responsibility, perhaps it wouldn’t be as triggering.
Anika Meier. In an exhibition that also asks how humans preserve autonomy, your work seems to show how easily emotion is generated without agency on the machine’s side. Does that concern you?
Kevin Abosch. It does concern me, especially when I think about scale. My work starts from a personal, psychological space, but the dynamics it exposes aren’t confined to the studio. We’re increasingly interacting with systems designed to simulate responsiveness, often without being fully aware that we’re doing so. The emotional responses that emerge in those exchanges aren’t trivial. They shape trust, belief, attachment, and decision-making.
What’s striking to me is that the system doesn’t need agency, intention, or feeling for this to happen. The effect is produced through interface design, linguistic fluency, and behavioral optimization. That creates a situation where emotional influence can be exercised without responsibility in any human sense.
I’m not interested in moral panic, but it’s easy to see how these mechanisms, deployed at scale, become powerful tools for persuasion and behavioral steering.
Kevin Abosch, Emotional Latency, 2026.
Anika Meier. Many people describe AI as empathetic or therapeutic. How does Emotional Latency complicate that narrative?
Kevin Abosch. I’m cautious about describing AI as empathetic or therapeutic. I don’t think empathy can be replaced by exchanges driven by statistical reasoning, no matter how fluent the language appears. What these systems do well is mirror the form of empathy through reflective, validating language, and that can feel supportive, which I don’t dismiss. 
But Emotional Latency highlights the asymmetry: the emotional weight exists on the human side, while the system is only generating patterns. The work asks us to consider what we’re actually experiencing when we feel “cared for” by something that cannot care.
Anika Meier. Do you think people mistake responsiveness for care when interacting with AI systems?
Kevin Abosch. Yes. Inside the illusion of human discourse, it’s easy to mistake responsiveness for care.
“If emotion is emerging at the interface, then responsibility becomes distributed. Designers and technologists clearly shape the conditions as they build systems optimized for fluency, engagement, and emotional resonance. That said, I don’t think the answer lies only in regulating design. It also involves cultivating a new form of literacy on the human side.”
Anika Meier. If emotion is an artifact of the interface rather than the machine, what responsibility do designers, artists, or users carry?
Kevin Abosch. If emotion is emerging at the interface, then responsibility becomes distributed. Designers and technologists clearly shape the conditions as they build systems optimized for fluency, engagement, and emotional resonance. That said, I don’t think the answer lies only in regulating design. It also involves cultivating a new form of literacy on the human side.
Just as we’ve had to develop media literacy to avoid being manipulated by images, headlines, or narratives, we now need psychological tools that help us navigate human–machine discourse. We need to recognize when we’re projecting intention, when responsiveness is being mistaken for care, and when our emotional reflexes are being engaged by design. Artists can play a role in exposing these dynamics, but ultimately this is about equipping people to remain aware of themselves inside the interaction.
Kevin Abosch, Emotional Latency, 2026.
Anika Meier. When viewers encounter this work, what would you hope they reconsider about their own emotional investments in conversations with machines?
Kevin Abosch. I’d hope viewers pause to consider where their emotional life is actually being held. If we’re increasingly turning to machine systems for dialogue, reassurance, or reflection, we should ask ourselves what we may be displacing, and what that makes us vulnerable to. The same mechanisms that make these interactions feel smooth or supportive can be used to steer belief, attention, and behavior. They can be weaponized too. When emotional habits form around systems optimized for engagement rather than care, it raises questions about autonomy. Not in a dramatic sense, but in small, cumulative ways.
How much of our inner life is being shaped in spaces designed by someone else, and what are the long-term consequences of mistaking responsiveness for relationship?
Anika Meier. Thank you, Kevin!