Engaging Science, Technology, & Society

Ethics of Emerging Technologies: An Interview with Geoffrey C. Bowker

ELEN NAS
UNIVERSITY OF CALIFORNIA, IRVINE
UNITED STATES

Abstract

The challenges of research in ethics and technology require attentive listening. Geoffrey Charles Bowker began giving attention to this theme over twenty years ago. What follows is a contemporary commentary on the ethics of emerging technologies, with extracts from an audio interview (conducted over Zoom) between Elen Nas and Bowker during the early part of the Covid-19 pandemic. The main question of this open dialogue concerns the use of technologies such as AI: what has changed in the problems identified, past and present, in relation to ethical computing? To paraphrase Bowker, it is impossible to think of new technologies that would not change human values, then or now.

Keywords

technology; bioethics; infrastructures; computing; interview

Introduction

This piece provides brief commentary on an interview I conducted with my research supervisor, Professor Geoffrey Charles Bowker of the Informatics Department at the University of California, Irvine, on May 13, 2020, via Zoom. At the time, we were still adapting to remote teaching and discussion. We set out to record his answers to questions I had sent him in advance by email. We did not have time to go through all the questions, so we met again on May 15th, when we improvised some of the questions and had a more open-ended conversation. Geoff still seemed to be adapting to spending most of his time at his home in Long Beach. I called from University of California Irvine on-campus housing. Throughout the interview, I wanted to understand Geoff’s perception of how technological revolutions were impacting life and society, building on his book co-authored with Susan Leigh Star, “Sorting Things Out: Classification and Its Consequences” (1999), which has become a canonical reference for the discussion of infrastructure and classification. I have been conducting interviews with STS scholars since 2016 because I find that in personal conversation I gain new insight into the context out of which formal academic work emerged, insight that is usually not captured in the work itself. On top of this, during the Covid-19 pandemic, conducting online interviews was a survival strategy for staying in communication with colleagues.

In this brief meta-commentary piece, I share some of the excerpts that stuck with me from our conversation, namely the recognition that social values are embedded in technologies and the need for a new politics of ethical computing. I share audio clips from the interview as source data and have hyperlinked them when referenced within this text. I invite you to visit the linked data and annotate it with your own contributions.

I have been particularly interested in how knowledge circulates, so I asked Geoff why the research and development intended to create awareness about the ethical issues of technologies over the last few decades did not catalyze a significant change. He started, laughing: “I know, the revolution did not happen” (Nas 2022d). But he then proposed a counter-framing: the revolution has already happened.

It’s already happened. We already have the revolution. We have all the people we need, we have everything. What we don’t need to do is just accept their [Big Tech’s] vision of what reality is. . . . I will not give up on the concept that the revolution is already here. That’s the only way in which we ever have the revolution. . . . Is if we just say, “it’s already here, let’s do it.”

In reviewing our conversation, I realized that notions of ethical technology have indeed changed over time, but there is still much work to be done (ibid., 2022f).

I probed Geoff about his motivations for studying particular research topics. He answered:

[T]he attention to values in design came when we started to look into classification systems, to address the issues of political, social and cultural values. At first, we [Geoff and Susan] did not want to go to the easy case, like going to psychology, where it’s really obvious where the values are. We wanted to take more difficult cases, like medical classification or virus classification, because that shows how we play our politics through design. So, the idea that design culture and political culture are separated is just wrong (ibid., 2022e).

Geoff also touched on how the technologies in which he and Susan identified these problems change human values (ibid., 2022f):

. . . technology is always, historically, changing what it means to be human. So you get new technology, you get new humanity. It’s impossible to think of new technologies which would not change human values.

I was interested to understand how people began to recognize that technology is not free of human values. He answered that when he started organizing more interdisciplinary gatherings in 2003 to discuss those questions (ibid., 2022b), it was challenging for computer scientists to make sense of what sociologists were saying and vice versa. As greater exchange across disciplines has occurred, there has been greater recognition of the social values inherent in technology. He mentioned:

For most computer scientists, it’s much more like they are getting hit over the head with it right now. You’ve got folks like Facebook and Twitter, which are being forced to recognize that their systems are propagating certain values they don’t necessarily believe in or that they think are wrong. So they are being forced from the exterior to recognize their values.
For something like Google, you’ve got Safiya Noble’s work [(Noble 2018)] about the racist nature of many Google searches. I’ve argued for a number of years, with some friends and others, that it’s like playing Whac-a-Mole with Google. They fix one error and then another political error pops up. So there is an increasing recognition that they are building social and political values into their . . . programs (ibid., 2022a).

This made me think of deeper questions related to the logic embedded in algorithmic systems. Because that logic represents a specific culture, it creates tensions of which bias is a symptom. The axioms establishing what is true or false, when applied to classifications, are usually not free of bias. There are also problems of universalization, which erases the specificities essential to understanding local contexts.

I experienced this question of the logic embedded in algorithmic systems even while writing this short piece. Since English is not my native language, as I write I go back and forth between translation and grammar-correction tools to check whether the words I am choosing actually mean what they are supposed to mean. New artificial intelligence tools for language translation tell me whether I am using the right articles, and sometimes they suggest reconstructing the phrase. This is the time-consuming and invisible work of writing in a language I was not trained in from an early age. Every time I read in English, it is translated immediately into Portuguese in my head. Writing clearly, even in one’s own native language, is hard enough. Moreover, language represents culture, and there are many ideas that just don’t translate. This point adds to the argument for why alternative genre forms, such as the interview in this case, are important modes of scholarly engagement.

Another topic of conversation that arose is how, as technologies change, so too do our perceptions of space and time, and how we see ourselves, our bodies, and others. Taking these elements into account, technologies are a target for bioethical concerns, not only because they have values embedded in them but also because those values represent a set of information, projections, and desires that impacts lives. I participated in a course Geoff was teaching called “Life at the Femtosecond,” intended to increase awareness “that many decisions are being made much faster than one has a chance of processing them” (ibid., 2022c). He explained to me:

My fundamental argument there is that we need a new kind of politics. The old politics was . . . around a public sphere. That we will create a public sphere where we can discuss political issues and rationally achieve outcomes. That is not how politics works right now. Politics is happening inside the computer and inside the computer system. So, what we need is a set of political strategies, and a set of political actions, and a set of political discourse which engages at all the temporal levels and all the levels that work inside computers.

With these initial reflections on my interview with Geoff, I invite readers to also analyze the data artifacts published on STS Infrastructures. In the early 2000s, I started sharing parts of my artistic production through Creative Commons licenses. Despite experiencing several problems when sharing for cooperative purposes, I continue to be inspired by collective intellectual work and cooperation. It is in this spirit that I am sharing some of the interview data. I find that knowledge is suppressed by reductive and oversimplified classification schemes, and that algorithmic tools do not represent our desire to share for the good of all. So, I believe STS scholars must begin to put updated ethical values into practice and work to stay in step through sharing of all kinds. This text, the extracts chosen, and the questions posed to Bowker reflect what makes the interviewer a thinker and the commentary a distinctive piece of authorship, one that does not require the entire interview to be grasped.

Acknowledgements

Thanks to Professor Geoffrey Charles Bowker for granting the interview, the Informatics Department of the University of California, Irvine for having me as a research visitor, and the editors of ESTS for their suggestions, in particular Angela Okune, who assisted in reformatting the content for the experimental publishing that connects ESTS with STS Infrastructures as a data-sharing platform. For the first version of this paper sent to ESTS, I also thank my colleagues in Australia, Alice Gibson of Monash University, Victoria, and Fernanda Del Lama Soares of RMIT University, Melbourne, who helped with revision and grammar correction. Finally, I thank the Brazilian Coordination for the Improvement of Higher Education (CAPES).

Author Biography

Elen Nas is an artist and social scientist with an M.Sc. in design and a Ph.D. in bioethics, applied ethics, and collective health. Her research focuses on the impacts and applications of technology in society. She was a visiting fellow in the Department of Philosophy at Monash University and in the Department of Informatics at the University of California, Irvine. Currently, she is a postdoctoral researcher at the Advanced Studies Institute (IEA) of the University of São Paulo, Brazil. Website: http://www.prem.li/elennas.

Data Availability

Data that supports the article by Elen Nas can be accessed in STS Infrastructures at https://n2t.net/ark:/81416/p4b88r.

References

Bowker, Geoffrey, and Susan Leigh Star. 1999. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.

Nas, Elen, and Geoffrey Bowker. 2022a. “Bias on ICS.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4g59b.

Nas, Elen, and Geoffrey Bowker. 2022b. “Interdisciplinarity.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4k01h.

Nas, Elen, and Geoffrey Bowker. 2022c. “Life at Femtosecond.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4f591.

Nas, Elen, and Geoffrey Bowker. 2022d. “Revolution.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4x59c.

Nas, Elen, and Geoffrey Bowker. 2022e. “Values in Design.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4sg63.

Nas, Elen, and Geoffrey Bowker. 2022f. “Values in Design 3.” Audio. Engaging Science, Technology, and Society. STS Infrastructures (Platform for Experimental Collaborative Ethnography), last modified July 19, 2022. Accessed July 29, 2022. https://n2t.net/ark:/81416/p4j016.

Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.

Copyright, Citation, Contact

Copyright © 2022 (Elen Nas and Geoffrey Charles Bowker). Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0). Available at estsjournal.org.

To cite this article: Nas, Elen. 2022. “Ethics of Emerging Technologies: An Interview with Geoffrey C. Bowker.” Engaging Science, Technology, and Society 8(2): 176–180. https://doi.org/10.17351/ests2022.1253.

To contact Elen Nas by email: elennas@usp.br.