Is artificial intelligence (AI) a threat to jobs and if so, will the flow-on effects of being replaced by machines impact human well-being?
This question was posed to a panel of experts at a recent UNSW Alumni event in Sydney: Engineering the Future: Q&AI.
The panel, comprising Scientia Professor Toby Walsh, Human Rights Commissioner Ed Santow and Telstra CFO Robyn Denholm, had an optimistic outlook.
While acknowledging the enormous social upheaval and individual pain brought about by disruptive influences like AI, Professor Walsh said we should be mindful of the great positives AI can bring to the world of work.
“When jobs get automated, I say to people, we should be celebrating because the job we're replacing is repetitive, dull, dirty or dangerous and we should probably never have been getting humans to do that job in the first place,” Professor Walsh said.
“But we must also make sure those people feel like they have a useful purpose, are supported in society and can live in that new world.”
People prefer dealing with people
However, Professor Walsh said that as social animals, humans will always prefer dealing with other people, and jobs that are people-facing can be expected to survive.
Mr Santow mused that some jobs will never be taken by machines and emphasised the value of the human experience.
“The World Economic Forum did an analysis of which types of jobs are safest and which are most vulnerable,” he said. “The one that came out on top was choreographer.”
Engineering the Future: Q&AI, launched by the UNSW Faculty of Engineering, was the inaugural event in a Thought Leadership series.
Introducing the event, UNSW Dean of Engineering Professor Mark Hoffman said the aim of the Q&A evening was to foster original discussion around issues of global significance for the engineering profession.
Focus on the future
“UNSW Engineering is Australia’s largest and longest-standing engineering faculty. While we’re proud of our history, it’s the future that we’re really focused on.
“It’s our view that engineering is a service profession, and the whole purpose of technology is to support society. Artificial intelligence is enabled by technology and has far-reaching implications.”
Professor Walsh from UNSW’s School of Computer Science and Engineering is one of the leading academic minds on artificial intelligence.
He provided the session with insights on how AI is and will be integrated with humans at work and play.
Ms Denholm, who is chief financial officer and head of strategy at Telstra, represented the perspective of the private business sector.
Human Rights Commissioner Santow looked at AI and how it could be used advantageously to improve human rights while identifying areas where it may give us cause for alarm.
Definition of artificial intelligence
The event was facilitated by UNSW Faculty of Law Professor Lyria Bennett Moses who is also director of the Allen’s Hub for Technology, Law and Innovation.
When asked to define artificial intelligence, Professor Walsh started by saying that many of his colleagues say AI is “all the things we can't yet solve with a computer”.
“But it's not one thing; it's a collection of tools and technologies that, when humans do them, require some sort of thinking: all the way from perception and perceiving the world, understanding the world, to reasoning about the world and learning from it, and finally acting in that world.”
Ms Denholm saw AI as a direct outcome of how humans interact digitally with the world.
“We digitise so much in the world today, both on the internet and also records, but also as we deploy sensors all round the world,” she said. “I think we need to have something more than the human brain that can do something with all that information.”
Mr Santow compared AI with modern medicine, in that there is no clear scientific consensus about precisely what it means. But he did note two characteristics that distinguish it.
“The first is the ability to wrangle large amounts of data. And the second is the capacity to do so in a way that draws inferences or links like a human might do,” he said.
AI and human rights
However, on the question of human rights, Mr Santow said historical human bias persists when data is used in an AI setting. He gave the example of NSW Police using AI to help target suspects from a criminal database.
“Over 50% of people on that list are Indigenous, but less than 3% of the NSW population is Indigenous.
“Historically, Indigenous people in Australia have been prosecuted for what are essentially crimes of poverty, and that can lead to an irrational inference that an AI-powered system might draw. Namely, that people who are Indigenous are more likely to commit a crime and so should be policed more aggressively.”
Professor Walsh wondered whether this could actually improve human rights, because computers might reveal more starkly the biases inherent in a society.
AI and cyber security
Ms Denholm agreed with this idea. “The human decision-making process does have inherent biases, and the best way to eliminate them is actually to expose them,” she said.
Another benefit she saw was one she observed in her job on a daily basis: artificial intelligence was freeing up her company’s cyber security team to concentrate on real threats to its telecommunications network.
She said Telstra’s network experiences billions of digital interactions every day, and the company has to weed out serious threats from benign, everyday user activity.
“There are about three billion events a day that someone needs to analyse to see if they are bad actors,” she said.
“So, the artificial intelligence used by the cyber team gets those three billion things down to 1000 things a day that we can analyse.
“That's an example of AI doing something good so that we protect the nation in terms of cyber events.”