Since the November 2022 release of OpenAI’s ChatGPT, the Internet and the public have been abuzz with how generative AI can be applied across different industries, including healthcare.
Hospitals and health systems are already piloting, testing, or deploying the new technology across their facilities. It is an exciting time, but many ask, “How can we ensure care quality and patient safety remain top of mind when using this newly developed tech?”
Now is the time to set expectations for ChatGPT’s current usability and plan for how it will affect the healthcare industry in the near future. As a technology company in the healthcare space, SureTest evaluates new technologies and advancements to optimize their use, not only for ourselves but for our client partners.
Potential Uses of AI
The most straightforward and immediate use for ChatGPT in healthcare lies in its administrative and non-clinical strengths.
A Council for Affordable Quality Healthcare (CAQH) report found that the U.S. healthcare system spent $18 billion more on administrative tasks in 2022 than the previous year due to increased utilization of healthcare services post-pandemic and workforce shortage challenges.
These high costs represent a significant amount of time that providers spend away from patients and more essential operational tasks. AI platforms like ChatGPT have the potential to allow clinicians to focus on patient care by streamlining simple communication like emails, progress notes, or authorization letters.
AI tools’ scalability and flexibility make them easy to integrate across healthcare organizations of any size. The technology also promises more than what it currently offers. According to a Case Report from ScienceDirect, there are many suggested future-state applications of generative AI in clinical settings, including:
- As ChatGPT can provide basic information and answer routine questions, it may end up simplifying the patient intake process.
- By leveraging ChatGPT’s scalability, healthcare organizations could provide quick and reliable information to patients, reducing wait times and improving patient satisfaction.
- ChatGPT can handle multiple conversations simultaneously, potentially making it ideal for responding to a high volume of patient inquiries – some believe it could be used to manage spikes in demand for medical services, such as during a flu outbreak or a public health emergency.
- Clinicians are already testing ChatGPT for its potential clinical uses in hospitals and other healthcare facilities – one ER doctor used it to generate diagnoses from a collection of History of Present Illness notes.
3 Potential Drawbacks of ChatGPT in the Healthcare Space
As people and organizations across the healthcare ecosystem experiment with ChatGPT and its uses, it will be essential to keep in mind the numerous limitations of the product in its current state.
At this time, any potential use cases for ChatGPT in healthcare come with massive caveats, including:
(Very) Limited Medical Expertise
ChatGPT is not considered a medical expert, and while it may provide generally correct information, it may not be able to provide in-depth answers to complex medical queries.
“Artificial hallucinations,” or when an AI model generates responses or information that may sound plausible but are not grounded in reality or relevant to the specific input or context provided, pose a significant danger. They could lead to patients receiving incorrect or incomplete information, which could cause severe harm or further complications.
Healthcare and technology leaders must ensure that ChatGPT is programmed to recognize when it is out of its depth and to provide appropriate responses, such as referring patients to a human medical professional.
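One way to operationalize that "out of its depth" check is a guardrail layer that sits between the patient and the model. The sketch below is purely illustrative: the keyword list, message text, and function names are assumptions for demonstration, not a production-ready safety system, which would need far more sophisticated intent classification and clinical review.

```python
# Hypothetical guardrail sketch: escalate out-of-scope medical queries to a
# human professional instead of returning the model's reply. The keyword
# list here is an illustrative assumption, not a vetted clinical triage set.

ESCALATION_KEYWORDS = {
    "chest pain", "overdose", "suicidal", "bleeding", "dosage", "diagnosis",
}

REFERRAL_MESSAGE = (
    "I can't safely answer that. Please contact a licensed medical "
    "professional or your provider's office."
)

def guard_response(user_message: str, model_reply: str) -> str:
    """Return the model's reply only when the query looks routine;
    otherwise refer the patient to a human clinician."""
    text = user_message.lower()
    if any(keyword in text for keyword in ESCALATION_KEYWORDS):
        return REFERRAL_MESSAGE
    return model_reply
```

In practice, a check like this would run alongside the model's own refusal behavior, so that a routine question ("What are your clinic hours?") passes through while anything resembling a clinical question is routed to staff.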
Legality Concerns
Medical advice given by ChatGPT may be subject to legal liability and will need to comply with regulations and standards. Practicing medicine without a license is illegal (with narrow exceptions, such as sharing home remedies, offering general advice, or writing about nutrition and medical conditions), and using ChatGPT to diagnose or treat patients may be considered malpractice.
The mandated protection of Protected Health Information (PHI) is also a massive issue many feel generative AI is woefully unprepared to deal with. Using ChatGPT with any patient’s PHI in a manner that complies with HIPAA regulations is currently not possible.
The HIPAA Privacy Rule explicitly emphasizes the importance of restricting access to PHI, while ChatGPT’s terms of use permit OpenAI to collect and use personal information, including log data and device information, obtained through its services.
If the industry wants to use ChatGPT to help diagnose and treat patients, it will need a secure database with HIPAA-compliant oversight. Healthcare providers should regularly engage with legal counsel to advise on compliance issues and potential liabilities before implementing ChatGPT in their practice.
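One common interim pattern, pending such HIPAA-compliant infrastructure, is de-identifying text before it ever leaves the organization's systems. The sketch below is a minimal illustration of that redact-before-send idea, with assumed regex patterns for a few common identifiers; real HIPAA compliance requires far more (business associate agreements, access controls, audit logs, and review of all eighteen identifier categories).

```python
import re

# Hypothetical de-identification sketch: replace common PHI patterns with
# labeled placeholders before text is sent to any external AI service.
# The patterns below are illustrative assumptions, not an exhaustive set.
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact_phi(text: str) -> str:
    """Substitute each matched identifier with a bracketed label."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

For example, a note reading "MRN: 0012345, call 555-123-4567" would be reduced to "[MRN], call [PHONE]" before leaving the provider's environment.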
Lack of Empathy
ChatGPT lacks the human empathy that is necessary for effective communication in healthcare. Many patients may prefer to receive information and advice from a human professional who can provide a personal touch and understand their unique needs and concerns.
If medical professionals utilize ChatGPT, they must make certain it is programmed to provide appropriate responses that demonstrate empathy and understanding, such as acknowledging a patient’s concerns or offering reassurance.
The Future State of ChatGPT in Healthcare
ChatGPT is here to stay and will only continue to evolve. While the exciting promises of this still-developing technology may offer much-needed relief to a strained industry, everyone who uses it must handle it with caution.
The SureTest team is keeping a finger on the pulse of this emerging technology and engaging with our peers and clients to learn more and identify ways to optimize its use. As an innovation-focused company, we are encouraged to see how ChatGPT is expanding the industry’s understanding of AI and opening doors for advanced automation that reduces administrative burdens on care teams.
This trend aligns with our solution of automating EHR, third-party, and enterprise application testing. As the only Managed Test Automation Solution, SureTest helps organizations reclaim thousands of hours every year by freeing clinicians from the burden of manual testing, allowing them to focus on patient care.
Watch SureTest CEO Laura O’Toole discuss ChatGPT in healthcare, the changing role of CEOs in healthcare organizations, and more in her interview with Bill Russell on This Week Health Newsday.