BIAS: Built This Way – an exhibition that aims to expose bias in AI

B.J. Quinn


Following the easing of lockdown restrictions, the Science Gallery at Trinity College Dublin reopened its doors to the public last October with a launch party for BIAS: Built This Way — a free, interactive, wholly thought-provoking exhibition that explores preferences, prejudices and digital equity. The exhibition is due to run until February 28th, when the gallery is sadly expected to close for good. 

“BIAS is a lens that distorts how we understand the world, warping our perception of reality in ways that are often unconscious and deeply ingrained,” says BIAS curator Julia Kaganskiy.  “But where does it come from? And what can we do about it?”

Featuring new commissions from emerging Irish and international artists, and collaborations with expert researchers from ADAPT, the SFI Research Centre for AI-Driven Digital Content Technology, the exhibition interrogates how prejudice can move quickly from human to machine, as algorithms and artificial intelligence systems are encoded by humans with very human values, preferences and predispositions.

Ireland has the largest number of AI start-up companies in Europe and has recently published the National Artificial Intelligence Strategy, positioning the country as an international leader in the use of AI. At a local level, understanding AI has never been more essential. These technologies have global consequences – AI systems can drive innovation that benefits both our economy and society, but when used irresponsibly they can reinforce and amplify social bias. 

The exhibition tackles some heavy subjects and complex questions: can understanding human bias help build more ethical AI? Can understanding machine bias help build more equitable societies? What can a deeper look at bias in humans and machines teach us about ourselves? It's a subject that can appear complicated and hard to grasp at first. Thankfully, the Science Gallery has a team of friendly volunteers on hand who are not only extremely knowledgeable but explain these dense concepts in simple, accessible terms.

Cloud Face by Shinseungback Kimyonghun is one of the first pieces you’ll come across. Apophenia is the name given to the human tendency to find meaning in random data, drawing connections between things that have no relationship to one another. The most familiar example is our habit of seeing faces in clouds. Consisting of 24 square mounted photos of clouds, Cloud Face shows us what happens when machines make the same mistake we do: each photo was wrongly classified as a human face by artificial intelligence. 

Still, think of Cloud Face as an appetizer. One of the most enjoyable exhibits – and the one that attracted the largest crowds during my visit – is SKU-Market by artists Laura Allcorn and Jennifer Edmond. Essentially, SKU-Market is a satirical mini-market that explores how our purchasing habits and online behaviours can be interpreted, misconstrued and used to shape our lives in surprising ways. You are instructed to scan various items with your phone – a selection that includes equal rights t-shirts, daily astrology apps, VR headsets, rosé and bamboo toilet paper, to name just a few. Then, at the self-checkout section, you print your receipt to see how the machine’s algorithm interpreted your purchases to create a vivid picture of who you are. 

According to SKU, I “sometimes rebel against myself but don’t have to” – not a great start. I’m also “highly cautious with sneaking suspicions” – getting warmer. And I “might also like boundaries, independent films, and black and white choices” – bingo! Of course, it’s all meant to be taken with a pinch of salt, but there is an inherent thrill in being psychoanalysed by a machine.

The exhibition continues on the top floor of the Science Gallery, where you’ll come across more fascinating installations, including Risa Puno and Alex Taylor’s Most Likely To Succeed. This interactive exhibit, developed with Accenture Labs, gives you the chance to prove your skills in an exciting tactile game requiring focus and dexterity. By tilting a platform, players are tasked with guiding as many balls through a maze as possible before time runs out. Sounds fun, but there’s method behind the madness: the piece aims to examine the construction and application of AI models through the subjective lens of fairness.

Of course, not all the exhibits require you to ‘tilt’ this and ‘scan’ that; some are waiting for you to look but not touch. Zizi – Queering the Dataset by London-based media artist Jake Elwes is a perfect example. Its aim is to tackle the lack of gender representation in the training datasets often used by facial recognition systems. Across three digital screens, we watch a procession of faces of drag artists in constant transition. The effect is to push the AI away from the normative identities it was originally trained on and into a space of queerness. 

Public information notices are also dotted throughout the exhibition, detailing important studies on how we interact with technology. One which is sure to prompt interest describes the ways in which phones are designed to grab our attention and hold it there for as long as possible. The exhibit goes through every feature, colour and sound on your phone, and explains how each is optimised by teams of designers and psychologists to keep you hooked. 
With the Science Gallery set to close its doors this February, BIAS offers a grand send-off by exploring what a deeper look at the world can teach us about ourselves.

BIAS: Built This Way is free to enter and no pre-booking is required, but, in line with Covid protocols, you will be asked to sign in at the front desk. 

BIAS was curated with Julia Kaganskiy and ADAPT, the Science Foundation Ireland Research Centre for AI Driven Digital Content Technology.