Episode 2: Special guest Tim O’Connell, MD, CEO of emtelligent, a deep-learning-based medical natural language processing (NLP) company, says the technology has a bright future in healthcare

NLP has dominated the news cycle with Microsoft’s recent deal to acquire Nuance for $19.7 billion. With Google and Amazon still in play in healthcare NLP, more changes are sure to come. But what is NLP, how useful is it in healthcare, and what are its drawbacks?

In the second episode of Tell Me Where IT Hurts, host Dr. Jay Anders, chief medical officer of Medicomp Systems, and Dr. O’Connell explore how NLP is improving clinical documentation, driving actionable insights, and helping reduce clinician burnout. Together, they also take a candid look at its shortfalls.

Listen to the full podcast here, or subscribe to Tell Me Where IT Hurts as part of the HealthCareNow Radio Network wherever you get your podcasts.

NLP is transforming healthcare

Dr. O’Connell says NLP brings numerous benefits to healthcare. It can do everything from scanning a patient’s chart and nursing notes for early complications to identifying medical slang and linking it to single concepts, improving care quality and efficiency. Regarding medical slang, he says emtelligent is training its language models to link multiple terms to the same underlying concepts.

Breaking it down further, Dr. O’Connell says NLP is a technology that reads, understands, and structures human prose (e.g., physician documentation). In healthcare, for example, it enables clinicians to access specific patient information without having to sift through reams of data. It also allows a medical researcher studying knee pain across a patient population to quickly identify which patients have had MRIs without reading lengthy physician notes.
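
To make the researcher example concrete, here is a minimal, purely illustrative sketch (not emtelligent’s actual pipeline) of turning free-text notes into a small structured record and then querying it. Simple keyword matching stands in for the deep-learning models a real medical NLP engine would use, and the patient notes are invented.

```python
import re

# Toy free-text notes keyed by patient ID (invented for illustration).
notes = {
    "pt-001": "45 y/o male with chronic knee pain. MRI of the right knee shows a meniscal tear.",
    "pt-002": "Knee pain improving with physical therapy. No imaging ordered at this visit.",
    "pt-003": "Left knee pain; MRI performed 3/2021 demonstrates mild osteoarthritis.",
}

# A tiny "extraction" step: flag notes that mention knee pain and an MRI.
KNEE_PAIN = re.compile(r"\bknee pain\b", re.IGNORECASE)
MRI = re.compile(r"\bMRI\b", re.IGNORECASE)

def structure(note_text: str) -> dict:
    """Turn one free-text note into a small structured record."""
    return {
        "knee_pain": bool(KNEE_PAIN.search(note_text)),
        "had_mri": bool(MRI.search(note_text)),
    }

structured = {pid: structure(text) for pid, text in notes.items()}

# The researcher's query: which knee-pain patients have had an MRI?
matches = [pid for pid, rec in structured.items() if rec["knee_pain"] and rec["had_mri"]]
print(matches)  # ['pt-001', 'pt-003']
```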

While NLP is groundbreaking in healthcare, Dr. Anders points out that the technology is still poorly understood, with confusion over how NLP and AI fit together. Dr. O’Connell says to think of AI as a broad umbrella term for a computer mimicking things that humans do, like reading notes, understanding language, and communicating with one another. NLP, on the other hand, is more specific: it is the branch of computer science in which computers read and understand both human- and computer-produced text.

Voice recognition technology: Still waiting for the “magic” to happen

Dr. Anders and Dr. O’Connell also weigh in on the promise and limitations of voice recognition technology. Dr. Anders says that while voice recognition technology is often described as “magical,” promising to capture an entire exam room conversation, it has not yet fulfilled that promise. Dr. O’Connell concurs, saying the technology still raises concerns over patient privacy, security, and accuracy. “NLP enables a deep learning engine to distill recorded conversation that is verbatim into what it thinks is most important,” he says. The trouble is that it doesn’t always pick up verbal nuances or communication implied through body language, which a human can easily interpret.

“I don’t see the mic in the sky approach working very well in the current context of patient care and the patient visit,” agrees Dr. Anders, noting that body language alone can convey how a person is feeling.

On the other hand, as a radiologist who dictates notes daily, Dr. O’Connell sees both NLP’s strengths and its weaknesses. While speech recognition still has plenty of accuracy issues, physicians must be able to both type and dictate their notes. NLP, he says, is very adept at handling complex narratives, such as cases of pancreatic cancer: a pancreatic tumor can be described in an infinite number of ways, yet NLP can turn the dictation into a structured note that is accurate 100 percent of the time, something that would be near impossible when typing up a note.
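
As a rough, hypothetical illustration of that idea, several differently worded dictations can collapse to the same structured concept. The synonym table and concept name below are invented; a production engine would rely on trained language models and clinical terminologies rather than simple string matching.

```python
# Minimal sketch of concept normalization: many ways of dictating the same
# finding map to one structured concept. The phrasings and the concept name
# are hypothetical examples.
SYNONYMS = {
    "pancreatic mass": "pancreatic_tumor",
    "mass in the pancreatic head": "pancreatic_tumor",
    "pancreatic neoplasm": "pancreatic_tumor",
    "hypodense lesion in the pancreas": "pancreatic_tumor",
}

def normalize(dictation: str) -> set:
    """Return the structured concepts mentioned in a dictated sentence."""
    text = dictation.lower()
    return {concept for phrase, concept in SYNONYMS.items() if phrase in text}

# Three differently worded dictations, one underlying finding.
for sentence in [
    "There is a 2 cm mass in the pancreatic head.",
    "Findings are consistent with a pancreatic neoplasm.",
    "A hypodense lesion in the pancreas is again noted.",
]:
    print(sentence, "->", normalize(sentence))  # each prints {'pancreatic_tumor'}
```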

Medicomp and emtelligent’s unique collaboration

Dr. Anders and Dr. O’Connell also discuss the promising partnership between Medicomp and emtelligent to deploy an NLP engine that encodes clinical concepts, diseases, and clinical relationships. The Medicomp MEDCIN platform is designed to better understand unstructured medical text. “We have a great platform here together and I think it is incredibly powerful for anyone who wants to roll up the next generation of healthcare applications today,” says Dr. O’Connell.

Dr. Anders and Dr. O’Connell agree that as the two companies’ technology progresses to improve clinician documentation and provide intelligent prompts, it will also help alleviate clinician burnout. “Some physicians are looking to get out of medicine because they are tired, burned out,” says Dr. Anders.

What’s next for NLP and healthcare IT?

In closing, while deep-pocketed mega corporations are entering the NLP space, Dr. O’Connell is bullish on the future and says emtelligent has the distinct advantage of building an NLP engine that caters to clinicians, something the bigger IT companies have not yet achieved. What’s the No. 1 change healthcare IT needs to make? Listen to the full podcast here to find out and to learn more about the future of NLP in healthcare and how it will play an even greater role in clinician documentation.