In a Q&A with Arizona State University News, CCC Chair Nadya Bliss discussed national security challenges and shared her contributions to the “Addressing the Unforeseen Harms of Technology” white paper.
“As technologists, we tend to be optimistic about new technology and its possibilities. And there are many reasons to be. But history has shown us that we also need to be clear-eyed about vulnerabilities and manipulation, and proactively balance out our excitement over new capabilities with appropriate security practices and techniques to mitigate potential harms.”
Bliss is deeply involved in national initiatives focusing on technology research, design, and development. Her discussions frequently center on proactively anticipating potential harms and effectively mitigating their societal consequences before they manifest.
Here is a sneak peek of the Q&A.
Question: New technologies have changed the world, both for better and for worse. Can you give me some examples of unforeseen harms of technology from the last decade or so? Should we have anticipated these? And what is the research community doing to get better at identifying these harms before they take root in society?
Dr. Bliss: Some of the most consequential security challenges of today stem from new technologies that became broadly available at relatively little expense and have been manipulated by bad actors for harmful purposes.
For example, connecting critical infrastructure to the internet was aimed at improving efficiency and security, but ended up leaving pipelines, hospitals and electrical grids vulnerable to attacks like ransomware. Social media was developed as an online space for creating connections, but it did not take long for it to fundamentally alter how we consume information and for people to figure out how to manipulate its algorithms to spread misleading or false information to further their agendas. And some people thought handing decisions over to automated tools driven by artificial intelligence would reduce bias — that machines would be truly neutral parties — when in fact, those tools simply reflect the biases of their creators and have in practice often exacerbated inequity and unfairness.
Read the rest of this conversation here.