From Data to Understanding – Turning Vision into Action
By Claus Torp Jensen, CTO and Head of Architecture, CVS Health
The notion of artificial intelligence has captivated our minds for decades, sometimes cast as the benefactor of an advanced society, sometimes as the villain of a science fiction horror story. How real is artificial intelligence? Is it possible to craft an artificial construct that thinks and behaves in ways we would call intelligent? Is artificial intelligence a pipe dream, or something that modern enterprises need to include in their thinking around innovation and product development?
The word ‘intelligence’ is generally described as the ability to perceive or deduce information and to retain it as knowledge to be applied towards adaptive behaviors within an environment or context. Implied in this definition is the notion of understanding. Artificial intelligence technology enables software systems to reason about large amounts of data, but can such systems truly understand what they are processing? In some ways yes, in other ways no – the answer depends on how well the semantic context of the problem is defined. Fundamentally, we need a different approach to data and semantics in order to turn artificial intelligence vision into real-world action.
Numerous articles have been written about the need for rich data in order to make artificial intelligence successful. Much less attention has, so far, been given to the quality of understanding derived from that data once it is available. For data to become truly meaningful we need three things:
Data cohesion – while inconsistencies are generally unavoidable, it must be possible to understand data as part of a unified context
Data connectedness – even the smartest algorithm cannot create connections where no clues exist in the underlying data
Data semantics – a result without defined meaning is rarely actionable
Let’s look at a simple healthcare example. A person with a serious health problem typically goes through multiple visits to primary care doctors, hospitals, pharmacies and more. At each touchpoint, our hypothetical person interacts with multiple IT systems, perhaps a hundred in total across all encounters. What is the chance that all these IT systems have the same internal identity key for our patient? Practically zero, at least in an environment where social security numbers are no longer an acceptable identity key due to security and privacy concerns.
Were we to apply artificial intelligence to predict a health relapse, a key factor in post-acute care, we would need to understand not only a single encounter but the totality of our patient’s journey across the healthcare ecosystem. That is impossible without the ability to connect the various data points through some kind of global identifier, combined with appropriate mappings to a myriad of local identity keys.
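To make the identity challenge concrete, here is a minimal sketch of how local identity keys from several systems might be resolved to a single global identifier. The record structure, system names and matching rule are illustrative assumptions; production master-data-management tooling relies on far richer probabilistic matching.

```python
# Minimal sketch of cross-system identity resolution: local patient records,
# each with its own system-specific key, are matched to a global identifier
# using simple demographic attributes. The field names and systems here are
# hypothetical; real matching engines weigh many more attributes.
from dataclasses import dataclass


@dataclass(frozen=True)
class LocalRecord:
    system_id: str      # e.g. "pharmacy_emr", "hospital_adt"
    local_key: str      # the system's internal patient key
    name: str
    birth_date: str     # ISO date, e.g. "1972-03-14"


def resolve_global_ids(records: list[LocalRecord]) -> dict[tuple[str, str], str]:
    """Map each (system_id, local_key) pair to a shared global identifier.

    Records are grouped on normalized name plus birth date; in practice a
    matching engine would tolerate typos and conflicting attributes.
    """
    global_index: dict[tuple[str, str], str] = {}
    mapping: dict[tuple[str, str], str] = {}
    for rec in records:
        match_key = (rec.name.strip().lower(), rec.birth_date)
        if match_key not in global_index:
            global_index[match_key] = f"GLOBAL-{len(global_index) + 1:06d}"
        mapping[(rec.system_id, rec.local_key)] = global_index[match_key]
    return mapping


if __name__ == "__main__":
    records = [
        LocalRecord("primary_care_emr", "PC-1047", "Jane Doe", "1972-03-14"),
        LocalRecord("hospital_adt", "H-88231", "jane doe", "1972-03-14"),
        LocalRecord("pharmacy_emr", "RX-5512", "Jane Doe", "1972-03-14"),
    ]
    for (system, key), global_id in resolve_global_ids(records).items():
        print(f"{system}:{key} -> {global_id}")
```

The point is not the matching rule itself but the mapping table it produces: a translation from each system’s local key to one global identifier is the asset that makes reasoning across the whole patient journey possible.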
Once we solve the global identity problem, we run into the next challenge. The different IT systems involved in our patient’s journey do not codify information the same way. Not only do they format information differently, they also do not share the same semantics. Standards like ICD-10 and FHIR are helpful but unfortunately codify only a small fraction of the data required. In order to turn data into understanding we must invest in a shared semantic model and in the mappings from that model to the underlying IT systems and data structures.
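As a simple illustration of what such a shared semantic model implies in practice, the sketch below translates local diagnosis codes into a common ICD-10 representation. The local codes and the mapping table are invented for the example; a real implementation would draw on curated terminology services and FHIR resources.

```python
# Minimal sketch of mapping local, system-specific codes onto a shared
# semantic model. The local codes and the mapping table are hypothetical.
SHARED_CODE_MAP = {
    ("hospital_adt", "HTN"): ("ICD-10", "I10", "Essential (primary) hypertension"),
    ("primary_care_emr", "401.9"): ("ICD-10", "I10", "Essential (primary) hypertension"),
}


def to_shared_semantics(system_id: str, local_code: str) -> dict:
    """Translate a local code into the shared model, or flag it as unmapped."""
    entry = SHARED_CODE_MAP.get((system_id, local_code))
    if entry is None:
        return {"status": "unmapped", "system": system_id, "local_code": local_code}
    code_system, code, display = entry
    return {"status": "mapped", "code_system": code_system, "code": code, "display": display}


print(to_shared_semantics("hospital_adt", "HTN"))
print(to_shared_semantics("pharmacy_emr", "XYZ"))
```

Notice that the unmapped case is as important as the mapped one: knowing which data falls outside the shared model tells us where the semantic investment still has to be made.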
The last challenge is one of data cohesion. For example, we cannot expect all blood pressure measurements to be consistent. Some people suffer from ‘white coat syndrome’: their blood pressure spikes simply because they are in a doctor’s office. Other people react differently to any kind of physical activity. And yet others react differently to medication or to various medical procedures. To filter noise from the underlying data trend we must understand the context of a metric or measurement. This has always been the case in a managed care setting, but with the emergence of a multitude of smart connected devices the problem has been exacerbated. Good data cleansing and reconciliation capabilities are quickly becoming foundational to providing solid data assets that can drive artificial intelligence reasoning. And for the most complex data domains, traditional cleansing techniques are no longer sufficient, so data teams are turning to predictive algorithms, fuzzy logic and more.
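A small sketch of what context-aware reconciliation could look like for blood pressure readings follows. The context labels and weights are purely illustrative assumptions, not clinical guidance; the intent is only to show how knowing where a reading was taken can temper its influence on the trend.

```python
# Minimal sketch of context-aware reconciliation for blood pressure readings.
# Each reading carries the context in which it was taken; clinic readings are
# down-weighted to reduce 'white coat' spikes before estimating a trend.
# The context labels and weights are hypothetical illustration.
READING_WEIGHTS = {"home_device": 1.0, "pharmacy_kiosk": 0.8, "clinic": 0.5}


def weighted_systolic_trend(readings: list[dict]) -> float:
    """Estimate a systolic trend as a context-weighted average."""
    weights = [READING_WEIGHTS.get(r["context"], 0.5) for r in readings]
    weighted_sum = sum(r["systolic"] * w for r, w in zip(readings, weights))
    return weighted_sum / sum(weights)


readings = [
    {"systolic": 128, "context": "home_device"},
    {"systolic": 131, "context": "home_device"},
    {"systolic": 152, "context": "clinic"},  # likely white-coat spike
]
print(round(weighted_systolic_trend(readings), 1))
```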
Even the simplest healthcare journeys present us with complex challenges before we can turn data into understanding. Turning data into understanding is the new frontier for artificial intelligence, a frontier that we cannot conquer without also applying advanced big data techniques. This is more than a technology problem; it requires a different mindset at the team level, in the enterprise and across the broader ecosystem. Teams must take a data-first approach to any IT system or initiative. An enterprise must have a unification strategy for its own data and for how to combine that data with third-party sources. And the healthcare sector must give up the notion that isolated data is an acceptable state of affairs.
Artificial intelligence has huge potential for prevention activities and for actual patient care. As we approach the next decade, to unleash that potential we must eliminate data barriers and break down computational boundaries, while at the same time maintaining robust data privacy and giving health consumers control over what their data can and cannot be used for.
This is our joint challenge, a challenge we must rise to so that we may turn vision into action and transform the healthcare experience.