
Existential threats

I’ve been working in AI for more than 20 years now. Nothing is faster than the speed of light, except perhaps the speed with which people have recently become “AI experts”.
AI has certainly come a long way, from “if it doesn’t work, it’s AI” to where we are now. But AI, and interest in it, has always swung between extremes of hype and disillusionment. In its short history, AI has been predicted to overtake human intelligence multiple times. The current alarmism about AI as an “existential threat”, which has now also reached Australia, is just that - a mix of sensationalist hype, marketing tactics and overinflated egos.

I’m not suggesting there are no issues - there are, and many people have been writing about them, though perhaps in less alarmist ways. “Existential” they are not, and labelling them as such distracts from actual existential threats.

In Australia, the rate at which species are going extinct is higher than almost anywhere else - how is that for an existential threat? What about the impacts of climate change, to which Australia is particularly vulnerable, including future droughts and bushfires? And I’m not sure we have learned enough about how to deal with the next pandemic either. There are many more things that are plausibly existential than some (still undefined) existential threat from AI.

There is definitely work to do, and it is also nice that we have moved on from hearing “it’ll never work”, but it will be better to work on the issues while keeping them in perspective.