PETA’s AI simulation challenged students to justify humanity’s survival

“When They Came for Us,” an AI-powered immersive experience, utilizes AI in an unexpected way by forcing participants to defend their right to live before an alien overlord.

PETA demonstrated the technology outside of the Student Center this week. | JARED TATZ / THE TEMPLE NEWS

When Nasheerah Wilson strolled through campus on an unseasonably warm March day, they were expecting to try out some food trucks, not to find their own survival on the menu.

But before they could place a food order, a towering alien materialized to interrupt their lunch. 

“I don’t know if it was the wind or what, but I swear I felt a presence there,” said Wilson, a Philadelphia local. “It was so surreal. It really felt like I was speaking with someone.”

What Wilson sensed wasn’t extraterrestrial, but a simulation: They had stumbled into PETA’s “When They Came for Us,” an AI-powered immersive experience designed to force participants to justify humanity’s treatment of animals or face an otherworldly reckoning.

Peta2 brought the simulation to the doorstep of the Howard Gittis Student Center from March 10-13. Each visitor was allotted seven minutes to convince an alien judge that they deserved to live despite humanity's history of "exploiting helpless beings." By strapping on the provided Meta headsets, participants were transported into a world where survival hinged on their testimony.

Once inside the simulation, players saw their avatar encased in an otherworldly cyber prison, facing a gangly, one-eyed alien overlord with a feminine presence. In the background, fellow humans underwent their own interrogations and met their own fates. The game adapted dynamically to each participant's words and reasoning, though all players had to answer one unchanging question: "Do you want to live?"

The experience was designed around the concept of "speciesism." Peta2 volunteer Christian Maxey defined speciesism as discrimination against others based on their species, an ingrained bias the simulation seeks to challenge.

One of the experience's slogans, "every animal is someone," is deliberately paradoxical: the dissonance the phrase provokes mirrors the way people can be conditioned to overlook non-human beings, which is exactly the reflex the simulation tries to disrupt.

“It’s problematic because it’s a bias that people are completely unaware that they have, or don’t care that they have,” Maxey said. “But all animals have unique desires, feelings, and families. Each is deserving of empathy.” 

AI is embedded in students' daily life on campus. It enhances security cameras, amplifies Temple's research efforts and is cast as a menace by most syllabi, which disavow its use and equate it to plagiarism. Yet its unexpected role in this advocacy initiative sparked reflection among students.

“I’m a bit of an AI-hater,” said Evelyn Consolla, a freshman sport, tourism and hospitality student. “I think it’s better to have a conversation with a person than to have one with a robot, but it was useful for this experience. It challenged my beliefs.” 

Developed in collaboration with the German digital agency Demodern, the software runs on artificial intelligence from OpenAI, the maker of ChatGPT.

One outcome presents a third-person view of the player's avatar, symbolically replacing them with a rabbit playfully grooming its ears. In some versions, as the experience intensifies, the overlord begins to teleport disorientingly around the player, drawing them deeper into the unsettling dilemma.

Its immersive nature left some participants contemplating their moral stance far beyond animal rights after removing the headset. 

“If you put yourself in the headspace of truly asking ‘What if this was real?’ it becomes a scary thing,” said Laura Butterworth, a senior sport, tourism and hospitality student. “While we don’t like to think about it, every single day you are at the mercy of someone else. The game asks, why should you be the one to live? This is a safe space to go through that.”

Others attempted to challenge the AI, expecting predictable responses, only to be caught off guard. 

“It felt like an intimidation tactic, so I questioned it, and it said ‘It’s not intimidation, it’s urgency,’” Wilson said. “I thought, ‘Well, that’s true.’”

Kenneth Montville, PETA's associate director of innovation and execution, has been developing virtual reality experiences since 2014. Montville emphasized that PETA's advocacy isn't about making unrealistic demands, but about recognizing shared capacities among all beings.

"We're not advocating for dogs to vote or for hamsters to have driver's licenses," Montville said. "But animals are alike to us in the ways that matter: the ability to feel fear, pain, love, affection."

He views technology as an evolving tool for social justice, citing the blend of AI and virtual reality as a way to confront people with their own biases, since the simulated reality responds to each participant's choices.

As ethical discussions continue to arise and evolve under shifting political landscapes, Montville sees speciesism as a necessary addition to contemporary social justice movements. 

“These philosophies all support each other,” Montville said. “It’s the same philosophy used to justify any supremacist sort of worldview. As people are becoming more aware, it’s time to start including speciesism as well.”

Through “When They Came for Us,” PETA aims to spark a broader dialogue — not just about animal rights, but about the ways in which technology can enable people to question their beliefs in a space that is both safe and transformative. After taking the headset off, students return to their lives, yet a bridge has been built between having power and experiencing powerlessness. 

“Our lives with other species on this planet have become more and more intertwined,” Montville said. “As society grows and topics like race, sexuality and gender become more prominent in society, we should be advancing our treatment of any being we deem as so-called ‘other’ and create a system that isn’t inherently flawed against that. As everyone knows, that’s how we end up in some pretty terrible situations.” 
