
Big Data, Cognitive Computing, and the Future of Electronic Health Records

February 17, 2015


“Sometime in the not too distant future,” predicts Jordan Novet (@jordannovet), “your doctor could write you a prescription for a smartwatch and an app to track your health. Data will flow out of the device, right into your electronic medical record, where the doctor could use that data to hone her treatment plan for you. And even without treatment changes, the app and smartwatch might by themselves help you make healthier choices.”[1] As in so many other areas of our lives, big data is likely to make a significant impact on the healthcare sector. Unlike some other areas, however, almost everyone is sensitive to how their health records are handled, safeguarded, and used. If Novet’s vision is to be realized, privacy concerns must be addressed. The recent hacking of the Anthem system only highlights the fact that we still have a lot of work to do in this area. Katie Dvorak (@KMDvorak87), associate editor for FierceHealthIT and FierceHealthcare, explains, “From Facebook to the doctor’s office, our information is continually being collected and analyzed. Now the question lies in what data should be accessible, and by whom. When it comes to doctors, people are sharing their information with the view that they are looking out for your best interests, unlike a company such as Facebook.”[2] Dvorak notes that Glenn Cohen, a professor of health law and ethics at Harvard Law School, along with researchers from the University of California, San Francisco and the University of Texas Southwestern Medical Center, published a study in Health Affairs that raises concerns about, and offers solutions for, how big data is used in the healthcare sector, specifically the legal and ethical implications of using predictive analytics models. Dvorak writes:

“Cohen [says] that the more a patient’s privacy is invaded, the more robust consent mechanisms are needed. One key to making data collection easier for doctors is de-identification. The more that happens, he says, the harder it is to identify data from a particular individual — and proceeding without individual consent seems more ethical. What’s more, Cohen says, patients should go through an informational session when entering a practice in order to better understand what it means to have their information in a database, and then possibly opt out.”
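Cohen’s point about de-identification is, in practice, a data-handling step that can be automated before records ever reach an analytics pipeline. The sketch below is purely illustrative — the field names and the keyed-hash pseudonym approach are my assumptions, not anything prescribed by the study — but it shows the basic idea: strip direct identifiers and replace the patient ID with a pseudonym so records can still be linked to one another without revealing who the patient is.

```python
import hashlib
import hmac

# Fields treated as direct identifiers in this illustration (a hypothetical list,
# not the full HIPAA Safe Harbor enumeration).
DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn"}

def deidentify(record: dict, secret_key: bytes) -> dict:
    """Return a copy of the record with direct identifiers removed and the
    patient ID replaced by a keyed hash (a pseudonym)."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # A keyed HMAC rather than a bare hash, so the pseudonym cannot be reversed
    # by someone who simply guesses and hashes candidate patient IDs.
    pseudonym = hmac.new(secret_key, str(record["patient_id"]).encode(),
                         hashlib.sha256).hexdigest()
    clean["patient_id"] = pseudonym
    return clean

record = {
    "patient_id": 10452,
    "name": "Jane Jones",
    "phone": "555-0100",
    "diagnosis": "osteoarthritis, right knee",
    "procedure": "total knee replacement",
}

print(deidentify(record, secret_key=b"rotate-this-key-regularly"))
```

Real de-identification is considerably more involved — dates, free text, and rare conditions can all re-identify someone — which is exactly why Cohen ties the strength of the consent mechanism to how identifiable the data remains.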

The first instinct of many individuals would probably be to opt out and simply avoid any future complications. The problem with that is that many practices might refuse to treat patients who don’t opt in. There is also a greater good to consider when weighing whether patients should opt in. Kim Krisberg reports, “The advent of electronic health records (EHR), coupled with better computational power and professionals who can analyze massive amounts of information, is bringing together data that previously resided only on paper and often were isolated in siloed systems. Although the exact definition of so-called big data varies depending upon the source, most experts agree that the ability to access and analyze such information holds the key to more efficient, higher-quality health care while significantly shortening the time between research and translation into practice.”[3] Before discussing some of the benefits that are likely to emerge as electronic health records are used to help researchers find cures for diseases or recommend the best treatment protocols, I’d like to discuss some of the current problems with EHRs that artificial intelligence could help solve. Dr. Kevin R. Stone, an orthopedic surgeon in the San Francisco area, notes, “Medicine is an oral science. People talk to doctors about their problems. Doctors listen and ask questions. Doctors tell the patient’s story to other doctors to share information and gain new ideas. Doctors and nurses talk about a patient’s progress. They talk to social workers and physical therapists and all sorts of experts who can help solve the patient’s problems and improve their health care.”[4] He goes on to lament, “The electronic medical record has killed the oral science.” He explains:

“Doctors now hunt and peck for the information to share. Nurses stare at screens, taking half an hour to enter data, something that used to take three minutes. As far as I can see, everyone in health care hates the new quantified medical record except the insurance companies. There are hundreds of editorials by doctors documenting the fact they can only see two-thirds of the patients they used to see if they have to spend their day entering data.”

Ouch! That doesn’t sound like an advancement in medical science. So what does Dr. Stone recommend? Artificial Intelligence. “Apple’s Siri, IBM’s Watson and their relatives,” he writes, “could solve this.” He provides a vignette of how artificial intelligence could be used to improve healthcare and EHRs.

“Here is an example:

‘Siri, I would like to admit Ms. Jones to the hospital for her knee replacement.’
‘Sure, Dr. Stone, shall I use your pre-op order set?’
‘Yes.’
‘Tell me the medications she is on.’ After I speak the medications’ names, Siri might ask, ‘OK, let’s be sure to let her cardiologist know to adjust her blood thinners a few days before surgery. And by the way, the medication she is on has been recalled and this alternative is recommended.’

You can see how this could go. Artificial intelligence has long since solved these highly formulaic situations and could prompt doctors to be better at their jobs. Medical staff would never have to waste precious time looking through a dozen menus and screens of unrelated information. The nurses could dictate their findings on their rounds and, through Siri, a message could notify the doctor if a wound is not looking right. The patients could tell Siri their medical history before coming to the office and then the doctor could review it with them and fill in the color the patient didn’t think of. The point is, we can all talk. We just can’t type and hunt and peck efficiently.”
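Dr. Stone calls these situations “highly formulaic,” and much of what he describes is rule-driven rather than exotic AI. As a rough illustration only — the recall list, the perioperative-hold rule, and the drug names below are invented for the example, not drawn from any real order set — a pre-op admission workflow might cross-check a dictated medication list against recall and surgery-hold rules before the orders are filed:

```python
# Hypothetical data for illustration; a real system would query drug-safety
# and formulary databases rather than rely on hard-coded tables.
RECALLED_DRUGS = {"examplafen": "genericafen"}  # recalled drug -> suggested alternative
HOLD_BEFORE_SURGERY = {"warfarin": "hold 5 days pre-op and notify cardiology"}

def preop_medication_check(medications):
    """Return a list of prompts for the admitting physician."""
    prompts = []
    for drug in (m.lower() for m in medications):
        if drug in RECALLED_DRUGS:
            prompts.append(f"{drug} has been recalled; consider {RECALLED_DRUGS[drug]}.")
        if drug in HOLD_BEFORE_SURGERY:
            prompts.append(f"{drug}: {HOLD_BEFORE_SURGERY[drug]}.")
    return prompts

# e.g., the medications dictated for Ms. Jones during admission
for prompt in preop_medication_check(["Warfarin", "Examplafen", "Lisinopril"]):
    print(prompt)
```

The hard engineering lies in the speech recognition and dialogue management; once the medication names are transcribed, checks like these are ordinary lookups that a computer can run instantly while the doctor keeps talking.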

What Dr. Stone could be describing is a cognitive computing system that uses natural language processing and artificial intelligence. He believes insurance companies would find such a system preferable to the system currently in place. In addition, Dr. Stone notes, such a system “could prompt the doctor to consider a less expensive alternative to a drug, a dressing or a therapy.” Those kinds of recommendations could actually help bring healthcare costs down. David Shamah notes, “With health care costs spiraling across much of the world, figuring out ways to cut costs is a matter of financial life and death for insurers, healthcare providers, and governments, who (via the taxpayer) end up ultimately footing the bill for health costs.” He agrees with Dr. Stone that natural language processing (NLP) may hold the key to helping bring costs down.[5] Instead of oral notes, however, Shamah discusses a company called CliniWorks, which makes a system that uses natural language processing to peruse written notes and assign the data they contain to where it belongs. According to the company’s CEO, Nitzan Sneh (@NitzanSneh), “It’s a lot deeper than just identifying keywords. We scan free text at very high speeds, and using specialized algorithms we figure out where the information a doctor wrote will be most useful in analysis. Once the data is properly classified, it can be analyzed to make suggestions on more effective treatment, which drugs are better for specific situations, where there is waste that can be cut, and so on. The information is gathered into a HIPAA-compliant searchable repository, enabling rapid data query, analysis, and reporting. … Believe it or not, this is the first time anyone in the US is trying to tie patient outcomes with expenditures during care, scientifically analyzing whether the money we lavish on health care is being spent effectively.” Pfizer, a well-known drug company, is sponsoring a joint project with CliniWorks.
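Sneh’s “a lot deeper than just identifying keywords” refers to CliniWorks’ proprietary algorithms, which are not public. Purely to make the general idea concrete, here is a minimal pattern-matching sketch that routes fragments of a free-text clinical note into structured fields; the categories and patterns are invented for illustration and are far cruder than what a production clinical NLP pipeline would use:

```python
import re
from collections import defaultdict

# Invented, deliberately simple patterns; real clinical NLP relies on trained models,
# medical ontologies (e.g., SNOMED CT, RxNorm), and negation/context handling.
FIELD_PATTERNS = {
    "medications": re.compile(r"\b(warfarin|metformin|lisinopril)\b", re.I),
    "procedures":  re.compile(r"\b(knee replacement|hip replacement|arthroscopy)\b", re.I),
    "symptoms":    re.compile(r"\b(pain|swelling|stiffness)\b", re.I),
}

def structure_note(note: str) -> dict:
    """Assign sentences of a free-text note to structured fields by pattern match."""
    fields = defaultdict(list)
    for sentence in re.split(r"(?<=[.!?])\s+", note):
        for field, pattern in FIELD_PATTERNS.items():
            if pattern.search(sentence):
                fields[field].append(sentence.strip())
    return dict(fields)

note = ("Ms. Jones reports worsening right knee pain and swelling. "
        "She remains on warfarin and lisinopril. "
        "We discussed scheduling a total knee replacement.")

print(structure_note(note))
```

Once notes are reduced to structured fields like these and gathered into a searchable repository, the downstream analysis Sneh describes — tying outcomes to expenditures or flagging less expensive alternatives — becomes a query rather than a reading exercise.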

 

With both doctors and drug companies interested in how cognitive computing techniques can be used to improve EHRs, the time has come for the healthcare industry to examine seriously how cognitive computing can be used to help improve treatments and reduce costs. IBM’s Watson is already helping doctors make better diagnoses, but that is only the tip of the iceberg when it comes to how cognitive computing could help change the healthcare sector.

 

Footnotes
[1] Jordan Novet, “The future of our health care: Robotics, AI, analytics, & more,” VentureBeat, 27 October 2014.
[2] Katie Dvorak, “Big data’s burgeoning healthcare role causes increased legal, ethical concerns,” FierceHealthIT, 14 July 2014.
[3] Kim Krisberg, “Big Data Key to Improving Health Care,” Association of American Medical Colleges, January 2014.
[4] Kevin R. Stone, “Could artificial intelligence end the electronic medical record nightmare?” The San Francisco Examiner, 28 September 2014.
[5] David Shamah, “Natural language processing and big data: The prescription for saving big bucks in healthcare?” ZDNet, 28 October 2014.
