Streamlining clinical records with AI: Efficiency, accuracy and ethical challenges

Georgie Haysom, BSc, LLB (Hons) LLM (Bioethics), GAICD, Head of Research, Education and Advocacy, Avant

Monday, 14 October 2024

Accurate clinical documentation is a cornerstone of effective patient care. This responsibility falls squarely on the shoulders of doctors, who bear legal and professional obligations to maintain accurate records. However, with doctors spending between one-quarter and one-third of their day on paperwork, the allure of artificial intelligence solutions to relieve the pressure is understandable. Despite the promises of efficiency and accuracy, doctors must also be attuned to the risks, as well as the benefits, that AI can bring to clinical practice. For doctors and practices exploring these tools, we have identified some key issues to keep in mind.

Consider why you need the records

At Avant we spend a good deal of time talking about the importance of good clinical record-keeping. 

Avant’s analysis of claims involving medical records found records were assessed as below standard for one in every nine claims. Of the claims where a doctor’s care of the patient was ultimately found to be below standard, one-quarter had poor clinical records. 

Further, some claims could not be defended because of incomplete or missing records. Complaints to regulators resulted in more severe outcomes when records were found to be poor.

Inadequate records can compromise patient care by contributing to missed or delayed diagnoses and medication errors. Poor quality records are often involved in legal or regulatory action, including investigations by Ahpra and Medicare. Poorly maintained records can also have broader public health implications if they compromise continuity of care, medical research, epidemiological studies, and public health initiatives.

Tools and potential benefits

At Avant we are receiving increasing numbers of requests for advice from members considering implementing AI tools in their practice to assist with clinical notetaking. AI tools can transcribe or record a doctor-patient consultation, extract clinically relevant information, then collate and structure the data into a coherent clinical note based on a template or framework. The same content can be re-formatted into a summary letter or treatment plan for the patient, referral letter, discharge summary, or other paperwork. These AI-generated notes form the basis of the record, which is then reviewed and added to as required by the doctor. A copy of the final notes can also be provided to the patient.

The potential time savings could be significant, particularly for anyone whose current process involves more handling of the information: writing or dictating notes, having these transcribed, checking and amending the transcription into a clinical note, then using the notes to draft follow-up letters or reports after a consultation. However, while time-saving, it is ultimately the responsibility of the clinician to ensure that all relevant information – including non-verbal information and examination findings – is incorporated into the record at the end of the consultation.

AI models can incorporate checklists or prompts to reduce the risk of omitting key information. This may improve diagnostic accuracy and continuity of care and assist with prevention or early intervention, provided that the clinician reviews and considers the recommendations made by the AI tool.

If the original record is reviewed carefully, creating multiple documents out of the same source data may also reduce the risk of transcription errors. One (small) UK study based on patient vignettes showed AI (ChatGPT) performed as well as or better than junior doctors in producing discharge summaries, and was less likely to make transcription errors or omit key information. However, care must be exercised when using an AI tool that was not created for use in healthcare, to ensure that it is fit for its clinical purpose.

By removing the need to take comprehensive notes during the consultation, the tools may allow doctors to focus more on patients, and patients to feel they have their doctors’ full attention. If doctors find themselves verbalising observations and thought processes for the benefit of the AI, the process may also increase transparency during the consultation. If the notes produced form the base record which is then reviewed and added to by the clinician, records may be improved. 

Key considerations

AI scribes, like any other tools, have both benefits and limitations. If you are considering adopting an AI tool to assist with clinical notetaking, consider what you hope the tool can do to assist your practice, and how you plan to use the outputs. Undertake your own due diligence to determine whether the tool will meet your clinical needs. Make sure you have considered the following issues.

Privacy and confidentiality

Before uploading any patient information to an AI tool, you must be confident that patient privacy and confidentiality are protected. That means checking where the data is processed, where and how it is stored (in Australia or overseas), and how it will be used (will the company use it for other purposes?).

Publicly available AI tools (such as ChatGPT) may not be fit for medical use because they were not designed for a medical purpose and are unlikely to offer sufficient protection for sensitive data.

If you use any tools that rely on an audio recording, make sure you have patient permission before recording the consultation. We recommend getting explicit permission and making a note of this.  

In addition, if you are using the tool to generate other documentation such as reports or referrals, you will need to check these do not breach patient confidentiality by copying across clinically irrelevant information.

Is your AI 'medical grade'?

Not all AI is the same. Research is progressing on designing AI specifically for medical applications, with sufficient scientific rigour and trained on appropriate and high-quality clinical data. However, the technology is still evolving and not all tools will be able to process medical information in a modern Australian context. AI scribing tools also currently fall outside the Therapeutic Goods Administration’s medical device regulatory processes.

Applications such as clinical scribing tools require the AI to process the text to identify clinically significant information. Scribe tools may be unable to interpret languages other than standard English or may have difficulty understanding accents. Since AI tools have been ‘trained’ on historical datasets, there are indications that they can perpetuate historical biases, for example based on race, gender or sexuality. It is important that introducing AI support in healthcare does not reverse progress on addressing healthcare inequities.

How will it affect the flow of the consultation?

Verbalising all your observations and considerations for the audio transcript may be clinically inappropriate in some consultations. Examples could include recording the BMI of a patient being treated for anorexia nervosa or recording your concerns about the cause of a patient’s injuries.

Further, the audio will not pick up non-verbal communications or cues. You will need to make sure you are able to capture and incorporate these observations into the record. 

Consider, for example, whether you would need to add notes after the patient has left the room to capture further observations.

How will you check the record?

You remain responsible for making sure the record accurately reflects the consultation. Never assume the AI-generated record is accurate. You need to be able to check that the AI has captured and correctly interpreted relevant details. If you fail to check the AI-generated record, and it is incorrect or has raised a diagnostic issue you have not considered, you may be vulnerable to criticism, and there may be patient safety implications if there is an error.

As noted above, the audio may be incomplete or misleading because it does not reflect non-verbal cues. It may also miss clinical indications that contradict what is said – for example, it may record the patient saying they have no allergies but miss the fact that they said they had experienced an adverse reaction to a medication.

Further, AI’s tendency to fill in gaps or extrapolate findings could lead to a misleading record. Ultimately, the clinician bears responsibility for the notes which are produced by the AI tool at the clinician’s request.

You will also need to be satisfied the record captures sufficient detail to justify any Medicare item numbers you have billed for the consultation.

Finally, as these tools are new, regularly monitor and review their use in your practice, to ensure that they are fit for their clinical purpose and produce the intended outcomes. 

Conclusion

If you are considering the benefits of AI for clinical notetaking, do your own due diligence to be confident the tool you choose:

  • is appropriate for your practice,
  • can help you deliver safe patient care, and
  • can comply with your legal and professional obligations – including ensuring patient privacy.

Seek advice from your peers and professional colleagues, and see the references and further reading below.

Remember you are ultimately responsible for the accuracy of your clinical records. Always exercise your own professional clinical judgement. Treat any output as a draft, and review and check it for accuracy.

References and further reading

Avant webinar: Ask the expert CPD requirements, complaints and AI 

Avant article: Watch out for these pitfalls when recording your consult

Australian Medical Association: AMA position statement on AI in healthcare

CSIRO: AI is already being used in healthcare. But not all of it is 'medical grade'

Coiera EW, Verspoor K, Hansen DP. We need to chat about artificial intelligence. Med J Aust. 2023;219:98-100. doi: 10.5694/mja2.51992

Tierney AA, Gayre G, Hoberman B, et al. Ambient artificial intelligence scribes to alleviate the burden of clinical documentation. NEJM Catalyst. 2024;5(3). doi: 10.1056/CAT.23.0404

More information

For medico-legal advice, please contact us here, or call 1800 128 268, 24/7 in emergencies.

This article was originally published in the Newsletter of the Australian Orthopaedic Association Volume 45 No.1 Autumn 2024

Disclaimers


IMPORTANT:
This publication is not comprehensive and does not constitute legal or medical advice. You should seek legal or other professional advice before relying on any content, and practise proper clinical decision making with regard to the individual circumstances. Persons implementing any recommendations contained in this publication must exercise their own independent skill or judgement or seek appropriate professional advice relevant to their own particular practice. Compliance with any recommendations will not in any way guarantee discharge of the duty of care owed to patients and others coming into contact with the health professional or practice. Avant is not responsible to you or anyone else for any loss suffered in connection with the use of this information. Information is only current at the date initially published.
