“Marie, What the f*ck are you doing?”
An excerpt from “Nadia: Politics | Bigotry | Artificial Intelligence”
This excerpt from the Nadia book is from early 2015 - a year before the Nadia proof of concept; before the technology was identified; and even before many of the team were recruited.
Critical to the telling of the Nadia story - and the background to this excerpt - is that co-design was already happening. The concept of Nadia was being seeded - not by tech companies (many are surprised by this revelation), but by people with disability. Hence the epigraph of the book.
Also at the time, the RoboDebt program was making its way through the political process and into the Australian Federal Budget. The deadly and unlawful RoboDebt program would go down in infamy, together with the bureaucrats defending the indefensible.
These two antithetical programs - RoboDebt and Nadia - became a politically explosive interplay.
In this excerpt, the vile and bigoted culture that created RoboDebt is a disturbing and foreboding presence right from the very beginning of the Nadia project.
EXCERPT: Chapter 5: What is Co-Design?
“One morning, I was walking up Collins Street in the heart of Melbourne CBD, on my way to give a keynote presentation on the NDIS and innovation. My phone rang. Hello, Marie speaking. Scorched into my ear was yelling from a very senior bureaucrat in Canberra. Marie, what the f*ck are you doing? I’m about to give a presentation in Melbourne. Is there a problem? The yelling continues. There’s a conference room here full of severely disabled people, what’s this all about? (Those exact words were used.) Well, I replied, those people are participating in a co-design workshop that the Technology Authority team are facilitating, and they will be working with us for a while. The yelling continues. There is all this food on the tables. Yes, I said, people are there for the full day and we provide lunch and morning and afternoon tea. Is there a problem? The yelling continues. Yes, of course there’s a problem. This is costing a lot of money. And then the clanger came, with even louder yelling. And we just don’t have time for all of this. The venom was disturbing. Look, I said, cutting this bureaucrat off. We don’t have the time not to do co-design; there is too much riding on this. I am about to go into the auditorium. The workshop will continue, please do not disturb it, people have travelled from across Australia to attend. Fine, the bureaucrat snapped – we’ll talk when you’re back in Canberra.
The comment on the food was a nasty barb: the procurement had all been approved, and I am a stickler for process. However, the real message being delivered here was that co-design was a distraction and there was no time for it. The subtext was pure bigotry.
And with that, my resolve to forge ahead with co-design hardened.”
Commentary
What causes this culture is a question that is explored throughout the Nadia book. This toxic culture persists, and into this environment, AI is being marketed as memes and auto-generated images and content by the powerful competitive strategies of global tech. Fresh from the human devastation of RoboDebt and the unfolding crisis of RoboNDIS, bureaucrats are told to play with AI. Risk has become a dirty word.
The Nadia book describes in detail the dangers and risks of Large Language Models (LLMs) in service delivery settings, including government and healthcare. Having been in service delivery and technology since the earliest days of the Internet, I believe this is probably one of the first times the risks and dangers of LLMs in service delivery settings have been documented based on experience. It is essential to understand this before playing with these technologies and exposing people to harm. Ignore this at your peril.
This book is not only an inside expert account of the momentous, history-making Nadia project; it also delivers the business case and the methodology, describing in detail the actual work that was done in creating Nadia.
The book peels through the sinewy layers of politics, bigotry, and the rise of Artificial Intelligence, and delves into new frontiers of Co-Design, Human Rights, Machine Learning, Algorithms, simulated environments, and the foundations of Trust and its destruction.
For sure, vested interests mean that none of this can be told in its entirety by tech companies, consulting firms, or various advisory groups. Certainly, the poisonous mix of politics and AI is not covered in any AI strategy or framework. I’ll have more to say about this.
Nadia debuted as a best seller in multiple categories, and is now in the National Collection at the National Library of Australia. One of the leading global visionaries in this field, Dr Chris Hillier from the United Kingdom, has stated: “It is important to get the story out there so that we can learn as a society. It should be on university sociology reading lists as well as those on the story of human-computer interfacing, and AI.”
I hope the Australian Public Sector starts to accept what is being acknowledged globally, including in the fields of global health and access to justice. The Nadia book tells those stories as well.