
ChatGPT for clinical notes: why not?
Monday, 2 June 2025
The integration of artificial intelligence (AI) in healthcare presents exciting opportunities, but also a range of challenges for clinicians, practices and hospitals. In particular, AI scribes are gaining popularity with doctors as they automate clinical documentation, reduce cognitive load and save valuable time.
However, after speaking with our members and hearing stories from the wider community, it has become apparent that some doctors are using – or considering using – free general-purpose AI tools such as ChatGPT, Copilot and Claude to generate clinical documentation. We explain why this is not a good idea.
Understanding general-purpose AI tools
General-purpose AI tools are built on large language models (LLMs), which are fed a huge volume of data and ‘learn’ from it to interpret, summarise, generate and predict new content. LLMs can very efficiently generate content that is grammatically and semantically correct within the context of the prompt and the information the tool has been trained on.
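To make this concrete, the deliberately simplified Python sketch below shows the core mechanism. It is not a real language model: the candidate words and their probabilities are invented purely to illustrate that output is chosen by statistical likelihood, not by any clinical understanding of the patient.

```python
# Toy illustration only – not a real LLM. A language model assigns a
# probability to each candidate next token and samples from that
# distribution; the candidates and weights below are invented.
import random

def next_token(prompt: str) -> str:
    # In a real LLM these probabilities come from a neural network trained
    # on vast text corpora. The prompt is unused here because the
    # distribution is hard-coded for demonstration.
    candidates = {
        "penicillin": 0.55,   # statistically most plausible continuation
        "amoxicillin": 0.30,
        "ibuprofen": 0.15,    # plausible-sounding, but potentially wrong
    }
    return random.choices(list(candidates), weights=list(candidates.values()), k=1)[0]

print("The patient was prescribed", next_token("The patient was prescribed"))
```

The model’s choice reflects what text usually follows similar prompts in its training data, which is exactly why a fluent-sounding answer can still be clinically wrong.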
Why general-purpose AI tools are unsuitable as clinical scribes
In our medico-legal webinar on AI technology, we addressed the question: ‘Why can't I use one of the freely available generative AI chatbots as a scribing tool if I don't enter patient information?’ The discussion focused on the various risks associated with using a tool that has not been specifically developed for clinical purposes.
Terms of service concerns:
- By using these free generative AI services, you agree to their Terms of Use and may have limited, if any, ability to control what happens to the information you share with them.
- LLMs typically retain the information they receive indefinitely, so stored data may be used to train future systems.
Privacy and data control risks:
- Servers processing the information entered into these tools are typically located overseas, beyond the protections of Australian privacy legislation.
- Under this legislation, if identifiable information travels overseas, it is up to the user to ensure the recipient does not breach the Australian Privacy Principles – an impossible task.
- Users are not protected against secondary use of any information they have entered.
Clinical accuracy issues:
- These tools don't ‘understand’ the data they receive, but rely on algorithms and probability to predict what output to generate.
- General-purpose AI technology wasn’t trained to produce clinical output.
- Risks of inaccuracies are therefore higher than when using purpose-built clinical AI tools.
What if I remove identifying details?
Even with a patient’s identifiers such as name, date of birth and address removed, information about the consultation, including clinic address and the appointment time, may still enable these increasingly sophisticated tools to identify the patient.
If a patient can be re-identified by any means, all the usual privacy obligations apply to both the input and output data. In the event of a cybersecurity incident that results in a data breach, you may bear some responsibility.
Never input any patient information into a general-purpose AI tool, regardless of whether you consider it de-identified. Assume that data is never truly de-identified.
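To see why removing obvious identifiers is not enough, consider the hypothetical sketch below, which uses entirely fictitious data. So-called ‘quasi-identifiers’ such as clinic name and appointment time can single out one person when matched against another data source (here, an imagined leaked booking list).

```python
# Hypothetical illustration with entirely fictitious data: a note with no
# name, date of birth or address can still be re-identified by matching
# quasi-identifiers against an auxiliary data source.

deidentified_note = {"clinic": "Northside GP", "appointment": "2025-06-02 09:15"}

# Auxiliary data an attacker might hold, e.g. a leaked booking list.
booking_list = [
    {"name": "J. Citizen", "clinic": "Northside GP", "appointment": "2025-06-02 09:15"},
    {"name": "A. Nguyen",  "clinic": "Northside GP", "appointment": "2025-06-02 10:00"},
]

matches = [b for b in booking_list
           if b["clinic"] == deidentified_note["clinic"]
           and b["appointment"] == deidentified_note["appointment"]]

if len(matches) == 1:
    # One unique match: the 'de-identified' note now names a patient.
    print("Re-identified as:", matches[0]["name"])
```

A single unique match is all it takes, and increasingly sophisticated AI tools can perform this kind of linkage far more effectively than this two-row example suggests.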
Selecting an appropriate AI scribe
When considering an AI scribe for clinical practice, first verify that it was specifically designed for clinical purposes. For purpose-built clinical tools, ask:
- Is any data collected or retained by the AI provider?
- If data is collected or retained, how long is it kept?
- Is data stored on overseas servers?
- Does the tool operate on an open-loop system (i.e. does the data go outside your IT environment)?
- Is the AI tool ‘learning’ from the data and patient information being fed into it?
- Is data identifiable?
- Is the data used for secondary purposes, including training the AI?
If the answer to any of these is ‘yes’, there may be a risk of breaching privacy legislation, and you should consider this risk carefully before you decide to use the tool.
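For those who find it useful, here is a purely illustrative snippet for recording the answers to the questions above and flagging any ‘yes’. The question wording is paraphrased and the answers are placeholders – substitute the vendor’s actual responses from your own due diligence.

```python
# Purely illustrative due-diligence record; answers are placeholders.
due_diligence = {
    "data collected or retained by the provider": False,
    "retained data kept beyond a defined period": False,
    "data stored on overseas servers": False,
    "open-loop system (data leaves your IT environment)": False,
    "tool learns from patient data fed into it": False,
    "data identifiable": False,
    "data used for secondary purposes, including training": False,
}

red_flags = [question for question, answer in due_diligence.items() if answer]
if red_flags:
    print("Potential privacy risk – review carefully before use:")
    for question in red_flags:
        print(" -", question)
else:
    print("No red flags on these questions; due diligence should still continue.")
```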
Be aware that AI scribes currently fall outside the Therapeutic Goods Administration's medical device regulatory framework. Thorough due diligence is therefore essential before incorporating any AI scribe into your clinical practice.
More information
For medico-legal advice, please contact us, or call 1800 128 268 – available 24/7 in emergencies.
IMPORTANT: This publication is not comprehensive and does not constitute legal or medical advice. You should seek legal or other professional advice before relying on any content, and practise proper clinical decision making with regard to the individual circumstances. Persons implementing any recommendations contained in this publication must exercise their own independent skill or judgement or seek appropriate professional advice relevant to their own particular practice. Compliance with any recommendations will not in any way guarantee discharge of the duty of care owed to patients and others coming into contact with the health professional or practice. Avant is not responsible to you or anyone else for any loss suffered in connection with the use of this information. Information is only current at the date initially published.