Reflecting on the ethical implications of AI and automated decision-making.

The Biometric Mirror is a suite of provocative interactive systems that empowers the public to play a crucial role in the ongoing debate about the ethics of AI. These systems place individuals at the heart of a challenging AI analysis, sparking reflection on the ethical use of AI through interactive storytelling and deep personalisation.

AI is omnipresent, and we are all familiar with its promise to radically, and for the most part positively, transform every aspect of society. Tasks once deemed too intricate for computers are now within reach. The prevailing view, therefore, is that AI will have a positive impact on society.

However, the technology is increasingly used to analyse and predict human behaviours, experiences, and profiles, and these applications often overlook the ethical and social impact of AI. What happens if algorithms make mistakes? What if they amplify biases? How can we amend our own data? And how can we learn more about the logic behind an AI?

These questions highlight the need to better understand the impact of these technologies on society and to reveal public attitudes and concerns. That is exactly the gap the Biometric Mirror aims to fill: it offers members of the public a front-row seat, invites them to interact with a thought-provoking AI, and lets them form their own understanding of where its strengths and weaknesses lie, without going down the rabbit hole of technical, legal, or sociological conversations. Just you, the AI, and your thoughts on the technology.

The Suite

Currently, four versions of the Biometric Mirror have been developed to suit a variety of purposes and environments.

Biometric Mirror at the University of Melbourne

The original Biometric Mirror was developed at the University of Melbourne as an interactive installation that confronts people with how artificial intelligence interprets their faces. Using real-time facial analysis, it generates provocative judgments about traits such as attractiveness, trustworthiness, and emotional state, sparking critical reflection on the ethics and biases of biometric technologies.

Beauty Temple at Science Gallery, with Lucy McRae

Presented as part of Perfection at Science Gallery Melbourne, Beauty Temple transforms the Biometric Mirror into a speculative vision of a future beauty salon. Developed in collaboration with artist Lucy McRae, it invites participants to undergo AI-driven “evaluations” of beauty and desirability, framing biometric technologies within the context of consumer culture and raising questions about the commodification of the human body.

Face Value at the University of Technology Sydney, with the Human Technology Institute

Face Value explores how facial recognition is understood in a civic and regulatory context. The installation challenges visitors to consider how automated facial analysis intersects with questions of consent, governance, and accountability, positioning the Biometric Mirror as a tool for democratic dialogue about responsible AI.

How They See Us at SparkOH!

How They See Us adapts the Biometric Mirror into an immersive science centre experience for young people. By visualising how algorithms “see” and classify them, visitors confront the often-uncomfortable realities of AI surveillance in daily life. The work also questions the emerging role of virtual therapists and chatbots, asking whether we should trust machines with matters of wellbeing, identity, and care.

Publications

Pursuit, 2021. Niels Wouters and Jeannie Paterson: TikTok Captures your Face.

Pursuit, 2020. Niels Wouters and Ryan Kelly: The Danger of Surveillance Tech Post COVID-19.

Proceedings of the 2019 Conference on Designing Interactive Systems. Niels Wouters, Ryan Kelly, Eduardo Velloso, Katrin Wolf, Hasan Shahid Ferdous, Joshua Newn, Zaher Joukhadar and Frank Vetere: Biometric Mirror: Exploring Ethical Opinions towards Facial Analysis and Automated Decision-Making.

Pursuit, 2018. Niels Wouters and Frank Vetere: Holding a Black Mirror up to Artificial Intelligence.

The Conversation, 2018. Niels Wouters and Frank Vetere: AI Scans your Data to Assess your Character but Biometric Mirror Asks: What if it Gets it Wrong?