Doctors call for AI rules to prevent medical mistakes

Australian doctors are using AI tools, including ChatGPT, to deal with patients every day despite a lack of guidelines or oversight from the nation's medical regulator, an inquiry has heard.

Healthcare experts revealed the issue at the Senate's Adopting Artificial Intelligence inquiry on Wednesday, while calling for restrictions on the use of the technology in healthcare to give doctors, nurses and patients greater confidence.

However, technology firms warned the inquiry to carefully consider strict rules on the use of AI to avoid putting Australian innovation at a disadvantage.

Australian Alliance for Artificial Intelligence in Healthcare director Enrico Coiera told the inquiry AI technology was being used throughout medical practices despite the absence of guidelines.

"AI is already in routine use in the healthcare system," he said.

"Digital scribes are used daily in general practice to listen in on patient conversations and summarise records for them automatically."

But Professor Farah Magrabi, from Macquarie University's Australian Institute of Health Innovation, said the technology was being used without professional oversight as it did not qualify for scrutiny from the Therapeutic Goods Administration (TGA).

"They fall through a gap at the moment because software that is just there for record-keeping is not subject to the TGA's medical device regulations," she said.

SA Health Excellence and Innovation in Health commissioner Keith McNeill said some doctors were taking the use of generative AI tools further.

"The younger generations… they're actually already using ChatGPT to generate their discharge summaries, they're just not telling us," he said.

"What we need now are the guardrails around it so that people can use these tools safely and effectively."

Prof Coiera submitted 16 recommendations for AI rules to the inquiry, including the establishment of a national AI healthcare authority.

If used with oversight, he said, AI technology had the potential to improve medical treatments.

"We're looking at machine learning to identify your biomarker patterns and work out which drug is going to be the right drug for you," he said.

"Imagine that replicated across every major disease class – we're talking about a revolution over the next decade or two in the way we target treatment to patients."

Earlier, representatives from tech companies warned senators against introducing tight AI rules to avoid making it difficult for Australian innovators to compete with their US rivals.

Trellis Data chief executive Michael Gately said laws that forced AI developers to pay content creators for their work or reveal data sources could hamper local firms.

"My preference would have always been to ensure that people are paid for their work under the Copyright Act … but I think that would be difficult to implement and would probably impact Australian companies unfairly against global competition," Mr Gately said.

Nuvento chief executive David Hohnke agreed, telling the inquiry AI rules in Australia should work alongside regulations in Europe and the US.

"If we do this in isolation, we could harm ourselves and people will go, 'so what, I'll just use ChatGPT and throw my documents up there and breach our company requirements'," he said.

But Atlassian global public policy head David Masters said Australia did have scope to set standards for AI use and introduce legal reforms.

The Senate inquiry is expected to issue its findings on the impact of AI in September.

Jennifer Dudley-Nicholson
(Australian Associated Press)
