Data Feminism: Why data are never neutral and why we (should) care. A workshop series on feminist approaches to Data and Information Science

Although people tend to think of data as neutral, and the people and professions that manage and provide access to data as objective stewards, this is not the case. Information and Data Science are disciplines that should seek to understand data not as neutral or objective, but rather as the products of complex social processes. Data are never neutral.

Data are the product of decisions made throughout the processes of collecting, processing, and analyzing phenomena, as well as of choices about how information is visualized and where and how it can be accessed. These decisions result in data sets that reflect the attitudes, opinions, beliefs, perspectives, and biases of the people who produce and manage data.

These circumstances lead to information products and data sets that do not meet or reflect the needs of all of their users. The needs of marginalized communities, in particular, are often not taken into consideration by those who create and maintain information. In the field of Data Science, which overlaps significantly with Information Science, jobs are predominantly held by persons who identify as male. An international study by the Boston Consulting Group in 2019 found that only 15% of the participants who work in Data Science and Data Analytics identify as female. In her book Invisible Women, Caroline Criado-Perez gives multiple examples, such as cars or cell phones designed only for a model user that conforms to the idea of a white, able-bodied man. In some cases, these design processes result in products that are not equally safe or functional for all users. The problem also extends to algorithms that occupy invisible but very powerful positions in people’s lives. Safiya Noble and Cathy O’Neil have shown that algorithms can be highly biased, for example with respect to race or gender, a finding that has significant implications not only for the field of Information Science, but also more broadly for social systems that rely on information infrastructures, such as healthcare and the judicial system.

“Data Feminism”, a book by Catherine D’Ignazio and Lauren F. Klein, published open access by The MIT Press, takes an intersectional feminist view of Information and Data Science, informed by the idea that information and data can form and reinforce systems of power, leading to inequalities and the marginalization of communities. The book provides a comprehensive overview structured around seven principles of Data Feminism. D’Ignazio and Klein take as their starting point the understanding that data are not neutral, and throughout “Data Feminism” they illustrate a process for understanding, demonstrating, and communicating the ways in which feminist data practices can reveal the biases that underlie data and information systems. It is this goal that motivated our own workshop.

The Workshop Series

The idea for the Data Feminism workshop series at the IBI (Institut für Bibliotheks- und Informationswissenschaft) was initially conceived as a reading group, but due to the COVID-19 pandemic we decided on a more structured, participatory workshop series, loosely based around the book’s major themes. The four online workshop sessions took place monthly from November 2020 to February 2021.

We designed the workshop sessions as interactive spaces and encouraged active participation by minimizing the number of presentations and lectures. We used many participatory elements in smaller groups to encourage networking and co-creation around particular issues and discussions. For communication between the sessions, participants could use a Slack channel created for the workshop. These approaches, grounded in an agreed-upon Code of Conduct, created a safe space for the sometimes sensitive topics discussed in the sessions. In an informal survey after the last workshop session, respondents commented that they liked the interactive design and valued the welcoming atmosphere of the workshop series. One respondent concluded: “Despite being virtual, the space felt very warm, welcoming, I felt comfortable to speak, listen, [and] think around those very exciting topics.”

We were very happy to be able to invite, and compensate, guest speakers for three of our workshop sessions. Each used their session to present an aspect of their own research relevant to the workshop’s main themes, and to lead the workshop’s participants in an interactive activity.

Prof. Patricia Garcia, PhD, from the University of Michigan School of Information, introduced her perspectives on critical refusal as a feminist data practice and her work on creating the Feminist Data Manifest-No.

Dr. Nicole Shephard shared her expertise on (missing) data on race and ethnicity in Germany. The session revolved around the question of whether we are ‘counting what counts’ and how missing data can affect anti-racism work and the struggle for social justice through intersectional data practices.

Katrin Fritsch from the MOTIF Institute for Digital Culture concluded the workshop series by sharing her knowledge of the social construction of technology, with a special focus on the relationships between gender and technology, leading to a discussion on developing a new feminist future of technologies through narration and imagination.

Reflections – Looking back, looking forward

We were very happy to see such a high level of interest in the topic of Data Feminism and grateful to welcome a diverse community of students, academic staff, and professors from the HU, as well as people from outside academia and even outside Germany.

Looking back, the workshop clearly showed that the concepts introduced in the book “Data Feminism” reach far beyond our own disciplinary bubble. The wide interest in the topic of Data Feminism and the diverse backgrounds of the participants demonstrated that feminist approaches to data practices are not specific to Information Science and Data Science alone. In almost every academic discipline, as well as in our everyday lives, there is a strong demand for discussion around the question of what is counted, by whom, using which methods, and for what goal(s) – it is thus a transdisciplinary issue and should be treated as such.

We are looking forward to continuing the next phase of this program. To find out more you can check for updates on our website: https://www.ibi.hu-berlin.de/de/forschung/infomanagement/datafeminism.

 

Works Cited:

Boston Consulting Group. “What’s Keeping Women Out of Data Science?” 2020. https://www.bcg.com/de-de/publications/2020/what-keeps-women-out-data-science.aspx (accessed April 1, 2021).

Criado-Perez, Caroline. Invisible Women: Data Bias in a World Designed for Men. London: Vintage, 2020.

D’Ignazio, Catherine, and Lauren F. Klein. Data Feminism. Cambridge, MA: The MIT Press, 2020.

Ledford, Heidi. “Millions of Black People Affected by Racial Bias in Health-Care Algorithms.” Nature 574, no. 7780 (2019): 608–609. https://doi.org/10.1038/d41586-019-03228-6.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.

O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Penguin Books, 2017.

 

Laura Rothfritz is a research assistant and PhD candidate in the Department of Information Management at the Berlin School of Library and Information Science at Humboldt-Universität zu Berlin. Her research focuses on socio-technological aspects of mistrust and distrust in data infrastructures.

Maricia A. Mende is a student assistant and has been studying at Humboldt-Universität zu Berlin since 2018, where she is working towards a Bachelor’s degree in Library and Information Science.

Rebecca D. Frank is a Juniorprofessor (Assistant Professor) at the IBI and the Einstein Center Digital Future (ECDF). Her research examines the social construction of risk in trustworthy digital repository audit and certification.

 

Graphic: Maricia Mende
