Joe Longo, MBA, CHCIO, Vice President of IT Enterprise Technologies, Parkland Health & Hospital System
Thanks to some federally driven programs and the general perceived technological lag of the healthcare industry, many consider the span of approximately 2005 through 2015 the “EMR Gold Rush,” fueled by the emergence of acceptable enterprise technology and the infusion of capital into the healthcare IT industry. This gold rush brought many new vendors, and human capital from outside the healthcare industry, into the space to develop and proliferate enterprise-level transactional databases, or electronic medical records (EMRs).
This period and its circumstances were a necessary step toward building baseline capabilities. Previously, important data points were not captured, not easily extracted (in the case of paper records), and/or not standardized or useful to the larger medical community or, in most cases, even to the individual organization’s enterprise. The federal government started with some basic standards for data dictionary definitions through its certified EMR designation and subsequent requirements.
At this point, a majority of healthcare providers use some form of certified EMR with at least basic capabilities for sharing key standard data elements across multiple facilities and databases. This is a great start, but what else should we be doing beyond the minimum regulatory or governmental program requirements?
Some progressive organizations, like Parkland Health & Hospital System, were early EMR adopters and ran into typical issues like printing and reporting. In spite of those challenges, once adoption reached critical mass, they had highly functional and, in most cases, highly integrated EMRs. Many of these organizations pursued ways to derive more value from their hefty investments beyond capturing the transactional data needed to meet documentation, regulatory, and operational requirements. Unfortunately, there is very little guidance toward a standard architecture or approach for moving beyond the transactional and toward more potent applications of data and technology. Many of us leverage guidance from our vendors, user groups, and third-party benchmarks run by independent healthcare IT organizations; however, many of these remain focused on adoption and usage from a transactional perspective, and those that are not can be variable in their approach and architecture.
It is not reasonable to expect a one-size-fits-all blueprint for deriving the most value, beyond the transactional, from your EMR investment. Many factors drive variation, including your organization’s size, mission, and needs. I do, however, have some thoughts about an approach and about interesting focus areas employed by organizations in which I have served. The focus areas include a highly functional data warehouse infrastructure, automated data capture capabilities, and a solid platform for real-time information delivery.
It is advantageous to spend the time and money to develop a solid, functional, and scalable data warehouse infrastructure that will handle a volume of data that will grow beyond any reasonable capacity-planning forecast. The earlier you accomplish this, the quicker you can layer a functional data governance program on top of it and begin to crawl out from under the inevitable backlog of individual report demands. If you also stack on an intuitive visualization tool, you will have a fighting chance at implementing a functional self-service reporting model. Having a solid repository to house the data also allows you to take advantage of the latent data elements most of us toss away because they are not currently required for legal, regulatory, or other documentation purposes.
Increase your data collection volume without affecting the workload of clinical end users by focusing on automating data capture at the bedside. This frees up clinician time for actual patient care while maintaining the influx of key data into your systems. A capability like biomedical device integration (BMDI) is a gold mine of unused data and a source of latent value that can be exploited in the future. Many organizations have implemented some variation of device integration middleware and benefit from the return on investment (ROI) of automatically capturing data for use in the EMR, thanks to a reduction in the keystrokes and time clinicians spend on manual documentation and a decrease in transcription errors. However, there is a big opportunity to implement enterprise-wide capabilities for all biomedical devices that produce data. EMR documentation requires, at most, approximately 10 data variables at a frequency of no more than approximately every 15 minutes (outside of the OR/anesthesia). The devices themselves, however, produce far more data variables at frequencies of every second, or even more often.

Many current predictive analytics were established by assessing outcomes documented in the EMR and tracing back data patterns that correlate with those outcomes. Thus, we should assume that increasing that pool of data will produce opportunities to find more patterns and allow analysis of micro-variations in once-unused data to find new ways to predict or even improve outcomes. Some in the industry are already acting on this theory and taking advantage of the outputs. The same model can be applied to internet of things (IoT) devices and/or home devices, adding their information to the mountain of data in your warehouse to be analyzed through technologies such as machine learning, artificial intelligence (AI), or whatever next generation of analytical tools emerges.
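The scale gap between charted EMR data and raw device output can be made concrete with a quick back-of-envelope calculation. The figures below use the approximate numbers above (10 charted variables every 15 minutes); the 20-variables-per-second device feed is an illustrative assumption, since the article notes only that devices emit "far more" variables at least once per second.

```python
# Back-of-envelope comparison of charted EMR data points vs. raw
# bedside-device output per patient per day.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# EMR documentation: ~10 variables captured every 15 minutes
emr_points_per_day = 10 * (SECONDS_PER_DAY // (15 * 60))  # 10 * 96 = 960

# Device output: assume (illustratively) 20 variables emitted every second
device_points_per_day = 20 * SECONDS_PER_DAY  # 1,728,000

ratio = device_points_per_day / emr_points_per_day
print(f"EMR-charted points/day:   {emr_points_per_day:,}")
print(f"Device-emitted points/day: {device_points_per_day:,}")
print(f"Devices produce roughly {ratio:,.0f}x more data points")  # ~1,800x
```

Even under conservative assumptions, the device stream dwarfs what reaches the EMR by three orders of magnitude, which is the pool of latent data the text argues is worth warehousing.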
Once you have the transactional data stored in a functional warehouse for executing analytics, establish a highly effective way to deliver key information in real time to the correct caregiver for timely intervention. Most organizations rely on the EMR to produce an alert, or even a text page, sent to a static phone (or pager) number or pool of numbers. That may be fine for most situations, but as our computed outputs become more prescriptive, many believe timeliness and relevance will become even more critical. If you haven’t already, it may be time to invest in, and start the adoption process around, a scalable, flexible, and secure communication platform that aggregates alerts and alarms regardless of source. The immediate ROI will come from the ability to escalate key data from bedside devices, the EMR, or other life-safety systems directly to the appropriate caregiver based on a patient-centric roster. The future value will come as you add the actionable data described above as a source, be it a relevant scoring system or a predictive analytic output.
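As a rough illustration of routing on a patient-centric roster rather than a static number, the sketch below shows the core lookup: an alert from any source is delivered to whoever is currently assigned to that patient, with a unit-level pool as the fallback. All names and structures here are hypothetical, not any particular vendor's API; in practice the roster would be fed by the staffing or assignment system.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    patient_id: str
    source: str   # e.g. "bedside_device", "EMR", "nurse_call"
    message: str

# Patient-centric roster: who is assigned to each patient right now
# (hypothetical identifiers for illustration).
roster = {"MRN-1001": "RN Alvarez", "MRN-1002": "RN Chen"}

def route(alert: Alert, roster: dict) -> str:
    """Return the caregiver who should receive this alert,
    falling back to a unit-level pool if no assignment exists."""
    return roster.get(alert.patient_id, "charge-nurse-pool")

alert = Alert("MRN-1001", "bedside_device", "SpO2 below threshold")
print(route(alert, roster))  # -> RN Alvarez
```

The point of the design is that every source (device, EMR, life-safety system) funnels through the same roster lookup, so reassigning a caregiver updates routing for all alert types at once.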
Your ability to execute these efforts in-house depends on your organization’s buy-in to the larger IT strategy and roadmap. Much of the aforementioned approach will require time, money, and skilled staff to accomplish internally. Even then, the fruitfulness of your data analysis can be hampered by the volume and composition of your patient data. If your organization cannot produce data in significant samples, be it disease processes, outcomes, or other metrics, it may be advantageous to partner with other organizations that have significant relevant data. Better yet, consider collaborating with those few pioneers with huge volumes of relevant data and the associated capabilities. The goal? To produce valuable information that positively affects care and outcomes within your organization while benefiting the larger medical community.