Simulacra Goose
2025
Simulacra Goose explores the porous boundaries between human, nonhuman, and artificial intelligence. AI, trained to mirror, distorts instead—revealing gaps in perception, translation, and meaning. As a neurodivergent artist, I’m often misread—filtered through lenses I didn’t choose. Working with AI became a way to explore how others perceive me: through mistranslation. I gave it my voice, my movements, my data—and watched it reconstruct me wrong.
The resulting video work combines footage of my live performance, archival goose recordings and calls, an original violin composition, and AI-generated visuals trained on the final frames of my own videos. It also includes music composed by AI, prompted to create as if it were a goose.
At the same time, I mimic geese—attempting to bridge bodies, species, and ways of knowing. I ask AI to do the same: to generate music as if it were a goose, to choreograph movement as if it had a body, to embody the nonhuman using human data. In our shared failures, a new language emerges—not accurate, but honest in its distortion—a recursive dialogue of mistranslation. I mimic the goose, AI mimics me, and in doing so, warps. It does not know that it does not know. AI, an uninvited guest in the archive of embodiment, reveals the limits of knowledge through its errors. This is not about geese. It’s about missteps, misreadings, and the poetry of getting it wrong.
I’m exploring mimicry as an imperfect practice of empathy: witnessing that resists sameness, stretching our ability to notice across difference, and generating connection in a world that often misreads and misunderstands. I’m interested in how AI, something already entangled in so many of our systems, mimics me in return.
What decisions does it make, and what does it ignore—and why?
When the system echoes me back, what does that reveal about the data that trained it and the structures that govern it?
What does it mean to trust a machine with the act of becoming?