
We come in PEACE: Allen School researchers offer a vision for addressing potential unintended consequences of technology

Hero image credit: Photo by Ales Nesetril on Unsplash

In 2020, a group of researchers unveiled a tool called Face Depixelizer that would take a low-resolution image as input and, with the help of a generative machine learning model called StyleGAN, produce a high-resolution image in its place. But the model was not designed to “fix” the original low-quality image so much as to generate an imaginary replacement, and it had a tendency to predominantly imagine white people, even when the original image depicted someone of another race.

The following year, a group of web developers and accessibility experts signed an open letter urging website owners to avoid using accessibility overlays on their sites. The signatories had grown alarmed by the increasing reliance on these automated tools, which are marketed as a way for site owners to improve the user experience while avoiding potentially costly litigation, after it became apparent that the overlays can actually make the experience worse for people who use screen readers, in some cases rendering a site unusable. To date, nearly 800 individuals have added their names to the letter.

These are just two examples of how technology can have unforeseen, and ostensibly unintended, negative consequences in the real world. Spurred on by these and other cautionary tales, a team of researchers at the Allen School wants to help colleagues anticipate and mitigate the consequences of their own work. With support from a five-year institutional transformation grant through the National Science Foundation’s Ethical and Responsible Research (ER2) program, the team hopes their project will usher in a new paradigm in computing-related research, not just at the University of Washington but across the field.

One member of the team, Allen School Ph.D. student Rock Yuren Pang, had already begun thinking about how society increasingly bears the brunt of unintended consequences from new technologies. After enrolling in a graduate-level computer ethics seminar taught by professor Katharina Reinecke, he began to fully appreciate the difficulties researchers face in attempting to understand, let alone mitigate, what those consequences might be.

“Emerging technologies are being used for a growing range of applications that directly impact people’s lives — from how communities are policed, to which job applicants are called for an interview, to what content someone sees online,” Pang said. “As a young Ph.D. student, I thought the question of how we as researchers might think about the downstream impacts of our work to be a really important problem. But I also felt overwhelmed and didn’t know how to even begin tackling it.”

Rock Yuren Pang (left) and Katharina Reinecke

In a new white paper, Pang, Reinecke and Allen School professors Dan Grossman and Tadayoshi Kohno offer a potential starting point. Dubbed PEACE — short for “Proactively Exploring and Addressing Consequences and Ethics” — their proposal offers a vision for empowering researchers to anticipate those consequences “early, often, and across computer science.” 

The latter is important, Reinecke notes; while artificial intelligence may dominate the headlines at the moment, these issues extend throughout the field.

“We can’t just point fingers at AI; every technology, no matter how seemingly benign, has the potential to have undesirable impacts,” said Reinecke, the PI on the NSF grant, whose research in the Allen School’s Wildlab includes investigating how people relate to technology differently across languages, cultures and abilities. “When we interviewed researchers across multiple subfields, they generally acknowledged the importance of trying to anticipate the consequences of innovation. But to translate that into practice, they need some scaffolding in place.”

To that end, Reinecke and her co-authors propose a holistic approach that would weave such considerations into the school’s teaching and research while making it easier for researchers to tap into existing resources for help in anticipating and mitigating undesirable impacts. Two of the resources the team intends to explore as part of the NSF grant, the Tarot Cards of Tech and guided stakeholder analysis, have Seattle roots. The latter is a pillar of Value Sensitive Design, a framework co-conceived by UW iSchool professor and Allen School adjunct faculty member Batya Friedman; it engages individuals or groups who could be directly or indirectly affected by a technology. As part of the process, researchers could save the results of their analysis in the form of a PEACE report that could be shared with collaborators on a particular project and updated at any time.

Researchers will also have the option to share their PEACE reports with an ethics board comprising faculty colleagues from across campus with expertise in areas such as law, bioethics, science and technology studies, and gender, women and sexuality studies. Members of this group will act as a sounding board for researchers who wish to follow up on the results of their exploration — and help them think through how they could address any potential unintended consequences they’ve identified in the process.

As with other elements of the proposed PEACE process, consultation with the ethics board would be entirely voluntary.

“We want to give researchers a low-friction, low-stakes mechanism for seeking diverse perspectives on how a technology might be used or misused. This could help surface potential implications we may not think of on our own, as computer scientists, that can inform how we approach our work,” Reinecke said. “We aren’t saying ‘don’t do this risky piece of research.’ What we’re saying is, ‘here’s a way to anticipate how those risks might manifest’ in order to mitigate potential harm.”

Tadayoshi Kohno (left) and Dan Grossman

In his role as co-director of the Allen School’s Security and Privacy Research Lab and the Tech Policy Lab at UW, Kohno has had ample opportunity to analyze the harm that can result when researchers haven’t thought ahead.

“Many times during my career have I wondered if the original researchers or developers could have prevented a problem before deployment,” said Kohno. “For years, I and my colleagues have encouraged the people who build new technologies to apply a security and privacy mindset from the start rather than having to fix vulnerabilities later, after damage has been done. That’s essentially what we’re suggesting here — we’re asking our colleagues to apply a societal mindset, and to front-load it in the research process instead of relying on hindsight, when it may be too late.”

Grossman is vice director of the Allen School and often teaches the undergraduate computer ethics seminar, which the school began offering to students on a quarterly basis in 2020. He sees an opportunity for the PEACE project to eventually transform computing education and research on a massive scale. 

“We are in a position to guide the future leaders of our field toward thinking not only about the technical aspects of computing, important as they are, but also the ethical ones — to train future researchers and technologists how to rigorously consider the potential ramifications socially, politically, environmentally, economically or any combination thereof,” said Grossman. “We need the people who understand their proposed technology to grapple with these issues as well as to learn how to interact with non-technologists, such as public-policy experts, who have complementary expertise.”

The team will deploy and evaluate the PEACE project within the Allen School to start, with plans to extend access to other academic units on campus in later years. Eventually, Pang and his colleagues plan to distill the findings from their evaluation of the UW deployment into detailed design guidelines that can be adapted by other institutions and companies.

“I want to create the go-to place for UW researchers to learn, anticipate and bounce ideas off other researchers about the potential consequences of our work,” Pang said. “But I hope this initiative encourages a broader culture in which computer scientists are unafraid to think critically and openly about these issues. And I believe we can do it in a way that supports, not stifles, innovation.”

Read the team’s white paper here. This work is supported by National Science Foundation award #2315937.