Fair Representation in Arts and Data

Stamps Associate Professor Sophia Brueckner has long known that small things can make a big impact. The fact hit home for her recently through her work on the ongoing research project “Fair Representation in Arts and Data.” Over the last year, she’s been part of a team of dedicated University of Michigan (U‑M) researchers who used several of the most popular face detection algorithms, designed to distinguish a variety of factors (including gender and race), to analyze the entire collection at UMMA.

“We’re trying to draw parallels between bias and exclusion in the museum world and bias and exclusion in technology,” Brueckner says. “With our research, we hope to create a more aware — and more inclusive — local community and world.”

Video created by Shannon Yeung (BFA 22) for the UMMA exhibition “White Cube Black Box.”

Brueckner explains that the year-long collaboration between data scientists, artists, and museum curators, funded by the U‑M Arts Initiative, has focused on exploring how bias is present and problematic, where in the process bias arises, and whether they could recognize trends in the diversity of UMMA’s collection.

“There’s simply no other project like this anywhere, and it’s really important research to have in this day and age,” says the project’s lead investigator and data scientist, Dr. Jing Liu.

Liu, who is also the managing director of the Michigan Institute for Data Science, shares that before working on “Fair Representation in Arts and Data,” she had already been kicking around the idea of how artwork could be used to demonstrate to the public, in a very intuitive way, “both the power of data science and the harm of data science.”

“We know that data science and artificial intelligence (AI) systems have implicit bias and that momentum needs to be built up around the topic,” Liu says. “For a few years now we have thought about educating the public in some way. When we found out that there was funding for pilot projects, I knew it was a chance to be a part of doing something really substantial.”

Working Together to Make Change

The project’s initial findings are certainly thought-provoking. Some key highlights were not too surprising to the research team: for instance, they found that the algorithms often failed to recognize female faces in the collection, and that the collection itself skews heavily white.

“We essentially did face detection over UMMA’s entire collection,” Brueckner explains. “We found all the faces in the collection and then applied publicly available, open-source algorithms to assign race and gender classifications to those faces.”
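The two-stage pipeline Brueckner describes — detect every face, then classify each one — can be sketched in miniature. This is a hypothetical illustration, not the project’s actual code: the detector and the pre-tagged records stand in for real open-source face detection and classification models, and all names here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Face:
    gender: str  # label a (stand-in) gender classifier would assign
    race: str    # label a (stand-in) race classifier would assign

def detect_faces(artwork):
    """Stand-in for a real face detector: instead of running a model on
    pixels, it just returns the faces pre-tagged in our toy records."""
    return artwork["faces"]

def tally(collection):
    """Run 'detection' over every work and tally the classifier labels,
    which is how collection-wide patterns like those above would surface."""
    counts = {"gender": {}, "race": {}}
    for artwork in collection:
        for face in detect_faces(artwork):
            counts["gender"][face.gender] = counts["gender"].get(face.gender, 0) + 1
            counts["race"][face.race] = counts["race"].get(face.race, 0) + 1
    return counts

# Toy stand-in for a museum collection.
collection = [
    {"title": "Portrait A", "faces": [Face("female", "white")]},
    {"title": "Group scene", "faces": [Face("male", "white"), Face("male", "black")]},
]
print(tally(collection))
```

Aggregating per-face labels this way is also where detector bias compounds: faces the detector misses never reach the tally at all.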

She says that it’s often very hard to understand why these algorithms make certain decisions, but the researchers have found the results really interesting. She points to one unexpected discovery: when an algorithm was left to categorize the visual input, the most representative face it found in the collection was a painting of a clown.

Georges Rouault, Cirque de l’Etoile Filante. Plate XIII: Le Renchéri (p.106), 1935.

“We applied a different type of algorithm that looks at which features are really important in detecting a face and then looks at the averages,” Brueckner says. “That the algorithm concluded on ‘the clown’ is funny, but it actually sort of makes sense, because clowns have exaggerated facial makeup.”
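One common way to make “most representative” concrete is to average the feature vectors of all detected faces and pick the face closest to that mean. The sketch below assumes that approach; the four-number vectors are toy values, not real model features, and the labels are invented for illustration.

```python
import math

def mean_vector(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def closest_to_mean(faces):
    """faces: list of (label, feature_vector) pairs.
    Returns the label of the face nearest the collection's average features."""
    mean = mean_vector([vec for _, vec in faces])
    return min(faces, key=lambda f: math.dist(f[1], mean))[0]

# Toy features: a face with strongly marked, middling-everything features
# (like exaggerated clown makeup) can land nearest the overall average.
faces = [
    ("portrait", [0.1, 0.9, 0.2, 0.4]),
    ("clown",    [0.4, 0.5, 0.5, 0.5]),
    ("sketch",   [0.8, 0.1, 0.9, 0.6]),
]
print(closest_to_mean(faces))
```

Under this (assumed) definition, “representative” just means “closest to the average in feature space” — which is why a face with exaggerated, clearly legible features can win.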

Motivated by the knowledge that both the algorithms and UMMA’s collection are biased, the “Fair Representation in Arts and Data” team inspired the UMMA exhibition “White Cube Black Box.”

“The phrase ‘White Cube’ is a term that refers to museums historically being exclusionary and having blank white walls and removing all the context, which makes the work actually quite inaccessible for those who aren’t highly educated in the subject matter or coming from certain communities,” Brueckner explains. “And the term ‘Black Box’ is used in engineering to talk about how a lot of these technologies that we rely on are opaque.”

She and the rest of the research team are currently discussing next steps for expanding the project. In the meantime, Brueckner is looking forward to public feedback: UMMA visitors can see the initial findings on display in the Apse at UMMA, inside the You Are Here exhibit. Curated by Jennifer M. Friess, associate curator of photography at UMMA, You Are Here centers on the idea of being present as the world reopens after the COVID-19 pandemic.

Friess shares that the idea of being present is different for everyone. In her view, “It can be really joyous to be back in a museum, but if a person doesn’t see themself represented in a collection, and in the works that are on view, then it can be quite alienating.”

“White Cube Black Box really makes such a good counterpoint and another way to speak to the idea of being here,” Friess says. “Underneath monitors, where the findings and research play out in a narrative way, we’ve asked the question ‘Are you here?’ as a type of reverse of the exhibit title, and people can really take in the data and contemplate and make connections.”

Museum visitors seated on a couch near the Are You Here exhibition
Photo by Marc-Grégor Campredon.

Like Brueckner, Friess is excited about the public’s reception. She explains that just before the pandemic, UMMA had an exhibit entitled Take Your Pick. The exhibition started with a selection of 1,000 photos purchased from flea markets by collector Peter J. Cohen and on loan to UMMA. The general public was invited to vote on their favorite images, and Cohen gifted the top 250 selections to UMMA for its permanent collection.

“The new research is so timely and such a good fit for the museum right now, because they found that Take Your Pick is the most diverse collection we’ve had,” Friess says. “It really is a kismet of ideas coming together.”

The power of people from different disciplines coming together to share ideas has not escaped Liu. She shares that, in her usual experience, when different groups of people talk about the same topic, they tend to talk over each other.

“But, with our project, artists and scientists sat down and talked with each other and learned from each other and challenged each other to strengthen our collective effort,” she says. “I want to see more of this, and I’m really hoping that our project is an example that helps to steer things away from the status quo.”

A museum visitor studies a large painting by Kehinde Wiley: a portrait of a black man, eyes upcast, on a background of flowers.
Photo by Mark Gjukich.

Seconding her wish is Brueckner, who also hopes that the team’s work will steer people towards becoming more educated consumers who will vote for more data privacy and security. If the project can make people aware, a bit earlier, of the pitfalls of the technology being embedded locally and all across the world, then she’ll be plenty happy.

“This type of biased data collection is already being deployed around the world, and some of it is useful and some of it is frankly pretty scary,” she says. “Recently, there was a disturbing case in Detroit where a child was misidentified at a skating rink and banned, all because of an algorithm. We need to raise awareness that these algorithms are still deeply flawed.”

Visit the U‑M Arts Initiative website to learn more about Fair Representation in Arts and Data and other Arts Initiative projects.

Story by Jaishree Drepaul-Bruder. Photos by Mark Gjukich and Marc-Grégor Campredon. Animated images and video by Shannon Yeung (BFA 22).