Research Seminar: Ethical Artificial Intelligence
- Date: Wednesday 4 March 2020, 16:15 – 17:30
- Location: Clothworkers North Building LT (G.12)
- Type: Seminars and lectures
- Cost: Free
This research seminar explores the gap between the ethical design and ethical use of artificial intelligence (AI).
What happens when new AI tools are integrated into organisations around the world? For example, digital medicine promises to combine emerging and novel sources of data and new analysis techniques like AI and machine learning to improve diagnosis, care delivery and condition management. But healthcare workers find themselves at the frontlines of figuring out new ways to care for patients through, with – and sometimes despite – their data.
Drawing on critical data studies and organisational ethnography, this talk will argue that while advances in AI have sparked scholarly and public attention to the challenges of the ethical design of technologies, far less attention has been paid to the requirements for their ethical use. As a result, the hidden talents and tacit logics that fuel successful AI projects are undervalued, and those projects continue to be seen as technological, not social, accomplishments.
In this talk we will examine publicly known ‘failures’ of AI systems to show how this gap between design and use creates dangerous oversights, and to develop a framework for predicting where and how these oversights emerge. The resulting framework can help scholars and practitioners query AI tools: whose goals are being achieved or promised, through what structured performance, using what division of labour, under whose control and at whose expense. In this way, data work becomes an analytical lens on the power of social institutions for shaping technologies-in-practice.
Professor Gina Neff is a Senior Research Fellow and Associate Professor at the Oxford Internet Institute and the Department of Sociology at the University of Oxford. She studies the future of work in data-rich environments.