I spotted a webinar running this week that piqued my interest. Autism, Algorithms and the Dangers of ‘Technopsyence’ was listed as part of the UN-supported ‘AI for Good’ programme, which looks at ways in which machine learning technologies can enhance progress towards the SDGs.
Presented by Os Keyes, a PhD researcher at the University of Washington, it was essentially a critique of the ethics of using AI in the diagnosis of children with autism spectrum disorder.
I joined this talk out of my interest in applications of AI that fill gaps in our current data landscape to drive highway safety improvements, and as a father of three autistic children (two formally diagnosed, one awaiting assessment).
In some aspects of traffic and road safety, the application of machine learning is still relatively generalised and naïve, so the ethical considerations have so far been quite limited, although increasing reliance on autonomous systems is generating significant ethical challenges around system security, reliability and, ultimately, liability in the case of failure. Observing the ethical debate in other domains such as health, education and fintech is helpful in horizon-scanning for philosophical and technical conflicts that might emerge in our own sector.
With a frisson sparked by this combination of professional and personal interest, I was excited to see applications of machine learning in both domains and to understand the ethical dilemmas. I was disappointed on both fronts.
Academically, it was an appalling treatment of the subject, and it fell a long way short of the ambition of the AI for Good programme. The speaker concluded that, because social appreciation of autism, access to treatment and the quality of services are currently poor, we should dismantle any efforts to use AI to detect and diagnose the condition.
Surely the assistance of these technologies can speed up diagnosis, highlight the prevalence of the condition, develop a wider appreciation of those who live with autism and create the necessary pressure to drive improvements in support? Along the way, they may also provide insights into how technology can be more enabling and redress inequalities that might exist.
As a professional, if we stopped developing AI to identify and address the locations and causes of the 1.35 million deaths that occur on the world’s roads each year – because some engineers aren’t very good at designing roads, or because there aren’t enough traffic police to ensure compliance with the law – the scandalous epidemic would continue. Rather, we have to ensure that we are building resilience across the system and implementing the technology that will assist. At Agilysis, we are finding ways of understanding traffic densities, speeds and community characteristics using innovative machine learning approaches that could accelerate the treatment of national road networks, saving many lives.
As a father, I fight every day for my children to get access to the educational and social support that they need. I don’t need to apologise for every decision I have made: I love them profoundly, I find their autistic traits and perspectives both inspirational and fascinating, and we are working together to ensure that the world doesn’t miss out on their unique and valuable contribution. We are currently in a fight to secure a diagnosis for one of my children, not so that we can legitimise our own insecurities or subject them to coercive therapy, but so that they might be empowered to live out their identity to the fullest extent and express their particular needs with clarity, confidence and authority.
So, the ethical question I came away challenged by was a very different one from the one I was expecting to engage with. Do scientists get to say, “Let’s not do this bit of research because we are afraid that it will expose how technically poor we are in some of our related disciplines”?
If AI is truly to be ‘for good’, then we need scientists, programmers, policymakers and engineers who are willing to use their influence and insights to advocate for a world that is better: diagnostically, clinically, environmentally and socially.