AI Fairness and Beyond: Law, Regulation, and Technology
By (Author) Chris Reed
Bloomsbury Publishing PLC
Hart Publishing
8th August 2024
United Kingdom
Tertiary Education
Non Fiction
Legal technology
Comparative law
343.0999
Hardback
248
Width 156mm, Height 234mm
This book proposes a regulatory system for ensuring that AI makes fair decisions. No one wants to be the subject of an unfair decision made by an AI, and fairness is so important to society that we are likely to want to regulate to demand it. But how? This book attempts to answer that question.

The aim of regulation must be for an AI's decisions to match the human conception of fairness. To understand what that is, the book proposes a holistic understanding of fairness, which tells us what regulation must try to achieve. However, regulation is not an abstract activity: it regulates how humans behave, and the humans in question are those who develop and use AI for decision-making. The book therefore investigates how those humans are attempting to achieve AI fairness, and finds a serious mismatch between how technologists conceptualise fairness and how other humans do.

How can AI regulation bridge this gap? Traditional models of regulation cannot solve this problem: fairness is too nuanced, too contextual, and is ultimately a human emotional response. Instead, the book proposes placing the responsibility on the AI community to explain and justify their efforts to achieve fairness, basing regulatory and legal responses on how well that explanation deals with the risks a particular AI presents, and on whether the AI operates in accordance with that explanation in use. The book concludes by examining how far this regulatory model might be useful for some of the other social problems which AI generates.

An original and significant contribution to the literature on AI regulation, this book is a must-read for those working in the areas of law, regulation, and technology.
Chris Reed is Professor of Electronic Commerce Law, Queen Mary University of London, UK.