One Person’s Experience with Healthcare Interoperability

March 22, 2012 / CloudPrime, Healthcare / 0 Comments

…or, Who Suffers When the Dots Cannot be Connected?

I have had the unfortunate experience of having my wife of over 30 years pass away from pancreatic cancer. She lived for 18 months from her initial diagnosis. Prior to that she had been a very healthy 62-year-old. During the course of her illness, she was treated in five different hospitals, was under the care of over 40 physicians, and had numerous surgical and diagnostic procedures. One might say, “Well, this was certainly an edge case.” But hasn’t experience shown that it is the edge cases that bring out the flaws in a system? As I accompanied her on her long and painful journey, I came to realize that in spite of all the assertions made about information exchange and interoperability in healthcare, both are almost nonexistent once you go outside the four walls of a hospital.

The fact is, unless patients or their families take responsibility for the information that each new hospital or doctor will require, those providers have no reasonable way to access that data. During my wife’s illness, on numerous occasions I had to hand-carry DVDs, CDs, or memory sticks so that other physicians could see the results of CT scans and radiology reports. I had to maintain a spreadsheet of her medications by hand, since no centralized, up-to-date record existed even where she was being treated. Obviously, the more manual the record keeping, the greater the chance for error, not to mention the lost time.

I am writing this blog as a call to action. While many are wringing their hands over healthcare costs, in my opinion hospital administrators are doing a great disservice to patients and medical personnel by not forcing their IT vendors to make improving interoperability and information exchange a high priority. As we all know, a number of high-level committees and organizations are working on this problem. However, their progress is slow and the need is now.

Many of them have not even thought through how the Cloud can be a game changer.

The reality is that if Apple can provide iCloud so that users can upload content of different types to a single user ID and then deliver it to multiple devices, it is not so far-fetched that the same capability could be applied to patient records. Patients typically have a single identifier. The notion that information stored in the Cloud is neither secure nor easily accessible has been shown to be a myth.

In addition, there are companies that provide low-cost, HIPAA-compliant secure messaging solutions, implementable in minutes, that securely transfer data to and from the Cloud as well as between applications hosted in the Cloud.

It is my belief that if as much attention and investment is focused on medical information exchange as has been placed on making billing systems interoperable, we will gain not only improved patient care but a more efficient use of our medical resources as well.


One Way to Avoid Becoming the Next Kodak

As a boy growing up, my first camera was a Kodak Brownie—easy to use and, for that era, capable of excellent pictures. It was somewhat expensive for a 10-year-old because of the costs of buying film and developing photographs. Overall, however, I was a happy customer who eagerly looked forward to picking up my photos at the drug store.

Of course, there are no simple answers for why one of the most prestigious companies in the world found itself filing for bankruptcy. We do know that one of the first-order causes was its inability to shift the center of gravity of its business. Its customers adopted new technologies that eliminated the cost of film while significantly reducing the cost of development. All the while, the company remained in denial about how big the impact on its business would be.

We can only imagine the internal discussions that must have occurred over any endorsement by Kodak that digital photography was the future and that Kodak itself would develop and sell the world’s best digital cameras and printers. The film people would obviously and immediately have done everything possible to prevent that from happening. Just imagine Kodak’s large investment in film manufacturing plants, equipment, and distribution! As a result, the company stuck with its former core strength of promoting film while developing mediocre digital cameras. Furthermore, its strategy missed the shift of photography and photo software into smartphones, with an eventual even bigger impact on its core market.

Today, there is a technology shift that I believe will be even more profound than the changes brought by digital photography: the advent of Cloud Computing. We are all watching advancements occur at a breakneck pace. Initially it was all about virtualization, but we are now seeing very powerful software development tools as well as applications being hosted in the cloud. The result is a new generation of functionality at costs that in some cases are 10 to 100 times lower than those of applications hosted on traditional servers. In addition, there is unparalleled user access—laptops, tablets, smartphones everywhere! Consequently, all these new cloud-based applications come with an entirely new user experience.

The next generation of Kodaks are today convincing themselves that the Cloud will have limited applicability and therefore they can take a “wait and see” attitude, moving to endorse and adopt when they are sure it is real. What I can say with certainty is that by the time they come to that realization and have to analyze the business impact of dismantling infrastructure and a large IT organization, it will be too late. Their competitors who have moved quickly to adopt the Cloud will roll over them with not only significantly better IT cost structures and associated efficiencies, but with a better ability to focus on their businesses and a stronger and growing connection to their customers.


When Did Lawyers Become Technologists?

The current cloud computing debate centers on whether the Public Cloud can be trusted. Can IT infrastructure start with a private cloud and migrate later? Private cloud advocates cite concerns such as security, control, and adherence to compliance requirements as their primary reasons for not utilizing the public cloud. Cloud security is clearly a fair question. But who should make the decision within your organization?

I was amazed to have an attendee at a major industry conference tell me that his lawyers would never let him use the public cloud. My question was, “When did lawyers become technologists?”

Cloud Paradigm Shift

It is widely recognized that a major paradigm shift is occurring, driven by the new usage-based pricing model of Cloud Computing. Just five years ago, SaaS was perceived as not being financially viable. Today, the Public Cloud has become one of the primary approaches to running mission-critical applications.

CIOs around the world are now including Cloud Computing in their future planning. They are trying to determine which Cloud environments make the most sense for their infrastructure requirements. Leading CIOs are allocating resources to identify the most cost-effective and scalable cloud investments.

Cloud Phobias and Facts

The fact is that many of the fears regarding public clouds come from those who do not understand the technology. It’s important to know the facts.

The largest companies in the industry are investing billions of dollars in creating cloud platforms that include state of the art hardware, networking and security. These companies include IBM, Microsoft, HP, Rackspace and Amazon.

Private clouds cannot possibly invest enough money to remain competitive with the capabilities and security available in public clouds. In addition, as a result of economies of scale, public clouds are the leaders in establishing and implementing compliance standards.

This is also an industry where it’s really all about the applications and solutions. There will be a far more extensive SaaS application catalogue available for the public cloud than for a portfolio of private clouds, each of which has implemented its own custom stack.

Let’s face it, application developers have always followed the money…


How Cloak Labs Transforms Enterprise Messaging

Our customers need resilient, secure, and easy-to-deploy application messaging solutions that meet their changing demands. As more and more buzz around the cloud, application migration, security, and compliance percolates to the surface, Cloak Labs has piqued the interest of more and more CIOs and Infrastructure Managers.

Transform: Cloak Labs enables enterprises of all sizes and industries to transform how they connect applications. Our cloud-based infrastructure provides scalability, embedded security, resiliency, and economies of scale. Because of the Cloud, our customers can rely on a service-based messaging infrastructure, allowing managers to focus on mission-critical tasks instead of deploying and managing costly VPNs and hardware.

Manage and Serve: As a service, Cloak Labs provides a robust network that guarantees the delivery of every message while providing “military grade” security and encryption.

Build: With Cloak Labs you can build application interfaces in minutes regardless of the application or transport protocol.

Consume: Cloak Labs’ application messaging services are easy to consume and do not require any hardware installation or IT training.


Response to NY Times’ Steve Lohr — Healthcare Connectivity

Mr. Lohr, great article and thank you for covering this topic.

Any story about hospitals taking steps toward connectivity is great, but I fear that most people think connectivity is a little easier than it really is, that it’s just a matter of getting everyone together. Integrating hospital systems is challenging enough, but it’s “everyone else” that will pose the greatest challenges, making connectivity a fragile vision if it cannot be streamlined for smaller practices, independent physicians, clinical labs, etc.

Mr. Lohr points out that only 25% of physician practices today are computerized, and that should improve given incentive payments and consequences for non-compliance. However, the real issue is that we are putting the cart before the horse: physicians will adopt patient management systems, EHRs, and EMRs, but the connectivity piece will still be unanswered. Furthermore, smaller practices and physician groups most likely will not understand why further steps toward compliance need to be taken, as most of them are being educated (by the very software vendors selling them their wares) to believe that if they merely install software that lets them manage patient data digitally, they will be compensated.

Recently, on a call with a hospital CIO, we discussed how she had to put a connectivity project on hold: of the 200 physician practices outside her hospital that she had to bring onto the network, only 120 had EMRs, and none of them wanted to deal with deploying and managing a VPN. It seems trivial, but the reality is that doctors are doctors first, and anything not related to treating patients is a distraction and is perceived to hurt their bottom line. (In the docs’ defense, VPNs are a blunt instrument, and I don’t blame them.)

Healthcare connectivity is a long, winding road that needs better planning, better ideas, and better solutions. The Direct Project sponsored by the NHIN is a great foundation for simplifying and standardizing connectivity, but this battle also needs to be won with hearts and minds.


Direct Project — Hooray!?

To its credit, not even the NHIN will tell you that the Direct Project is the be-all and end-all solution for making the ubiquitous exchange of health information a reality.

That being said, many interpret it as a simple, evolutionary step toward secure health information exchange and compliance. Let’s take a look at what the Direct Project is and what it specifies:

  1. The Direct Project is a specification, or recommendation, for how secure health information exchange can be achieved via the SMTP protocol;
  2. In order to interface to the Direct Project network, you will still need to rely on a health information service provider (HISP);
  3. Each participant in the Direct Project will have a published Health Domain Name, or HDN, which is used for authentication; this will look like an email address or domain to users and is how participants will identify each other (see the sketch below).
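
Because Direct rides on ordinary SMTP, a developer can picture the exchange as little more than sending an email through a HISP’s gateway. Below is a minimal sketch in Python, assuming a hypothetical HISP endpoint, Direct addresses, and credentials; in a real deployment the HISP would also manage the S/MIME certificates used to sign and encrypt each message.

    import smtplib
    from email.mime.text import MIMEText

    # All hosts, addresses, and credentials below are hypothetical.
    msg = MIMEText("Referral summary for the patient we discussed by phone.")
    msg["Subject"] = "Patient referral"
    msg["From"] = "drsmith@direct.examplehospital.org"  # sender's Direct address (HDN)
    msg["To"] = "intake@direct.exampleclinic.org"       # recipient's Direct address (HDN)

    with smtplib.SMTP("smtp.example-hisp.org", 587) as server:
        server.starttls()                        # encrypt the session to the HISP
        server.login("drsmith", "app-password")  # hypothetical credentials
        server.send_message(msg)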

There are many other requirements that outline what is needed to adhere to the Direct Project specification, but above are the high-level concepts. In itself, it is a great and elegant approach to solving the problem of interoperability and health data exchange, but there are some items of concern that need to be addressed when determining how to gain widespread adoption:

  1. While some vendors are supporting the Direct Project in software patches and new releases of software, what does this mean for healthcare professionals that do not have applications that will comply? How will they interface to the Direct Project?
  2. What about large hospitals or groups that have many systems from multiple vendors? Will their routers be able to interface to the Direct Project network without increasing the workload of already over-stretched IT staff?
  3. While some vendors have updates and patches for adhering to the Direct Project specification, are there other requirements needed in order to comply, e.g. changes to a hospital’s SMTP server?

At Cloak Labs we are very excited about the Direct Project and believe it provides a solid foundation for improving health information exchange. Our concern, however, is that it will require healthcare providers to allocate over-stretched resources to meet the requirements of Direct and to integrate their health IT systems with the network, even if their EMR/EHR supports Direct.

We believe that ease of implementation and integration should be an added goal of the Direct Project; health IT folks need a solution that minimizes the impact on their workflows and current workload.

Cloak Labs is defining a better way for health IT professionals to take advantage of everything the Direct Project has to offer while minimizing the impact on their IT infrastructure and workflows.


Healthcare Integration & Interoperability — Part 3

In the last blog, we discussed four major file types/documents used in healthcare data exchange: HL7, DICOM, CCD, and CCR. To exchange these data types, application interfaces need to be deployed to allow for the integration of disparate healthcare systems.

Today, we will cover how applications communicate or interface with each other.

Interfaces typically use what is called a transport protocol, which “provides end-to-end communication services for applications within a layered architecture of network components and protocols.”

The most common transport protocol in use is TCP, or Transmission Control Protocol. Sometimes referred to as TCP/IP, it allows applications to stream data to each other. For example, if you have a Patient Management System that generates HL7 SIU (scheduling) messages and that application interfaces to an HL7 routing engine, it is likely that the two interfaces would communicate via TCP.

In healthcare, there is a thin protocol layered on top of TCP known as MLLP (Minimal Lower Layer Protocol), which adds specific delimiters to denote the beginning and end of each message. The receiving application needs to know where one message ends and the next begins in order to deliver the correct information to the system. Over raw TCP, you would typically do this by prefixing each message with a length header, i.e., details about where messages start and end. With MLLP, specifying length headers is not necessary: the protocol wraps each message in delimiter bytes so applications know where messages begin and end. Confusing, I know!
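
To make the framing concrete, here is a minimal sketch of sending one HL7 message over MLLP, assuming a hypothetical host and port. The delimiter bytes (a vertical tab to open the frame, a file separator plus carriage return to close it) are the ones MLLP defines.

    import socket

    START = b"\x0b"    # vertical tab opens each MLLP-framed message
    END = b"\x1c\x0d"  # file separator + carriage return close it

    def send_mllp(host: str, port: int, hl7_message: bytes) -> bytes:
        """Send one framed HL7 message and return the receiver's ACK."""
        with socket.create_connection((host, port)) as sock:
            sock.sendall(START + hl7_message + END)
            ack = b""
            while not ack.endswith(END):       # read until the ACK frame closes
                chunk = sock.recv(4096)
                if not chunk:
                    break
                ack += chunk
            return ack.strip(START + END)

    # Hypothetical usage: stream an SIU scheduling message to a routing engine.
    # ack = send_mllp("hl7-router.example.org", 2575, siu_message_bytes)

A Patient Management System sending SIU messages to a routing engine, as in the example above, would typically hold such a connection open and stream many framed messages over it.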

Sometimes it may not be necessary or preferred to use TCP/MLLP for streaming data, and applications will instead use simple file transfer, or “file drop.” This is accomplished by writing the messages out as files to a directory on a computer or server. Another application or interface checks the folder for new messages and consumes them as they are written to the directory.
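
Here is a minimal sketch of the consuming side, assuming hypothetical directory paths: poll the drop folder, process each new file, then move it aside so nothing is consumed twice.

    import time
    from pathlib import Path

    INBOX = Path("/var/hl7/inbox")        # where the producer drops .hl7 files
    DONE = Path("/var/hl7/processed")     # where consumed files are moved
    DONE.mkdir(parents=True, exist_ok=True)

    while True:
        for f in sorted(INBOX.glob("*.hl7")):
            message = f.read_bytes()
            # ... hand `message` to the consuming application here ...
            f.rename(DONE / f.name)       # move aside so it is consumed only once
        time.sleep(5)                     # poll for new messages every 5 seconds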

Interfaces need transport protocols to exchange information and it is common for systems to need to communicate over various connectivity services, making integration somewhat challenging. Talking with your health information service provider and/or your integration specialist can help you understand the best method for interfacing healthcare applications together.

Part 1 | Part 2


Healthcare Integration & Interoperability — Part 2

Yesterday we briefly covered what healthcare integration and interoperability are and what they mean to the healthcare industry. In today’s segment, we will discuss some of the file protocols used in conjunction with continuity of care and interoperability.

The file protocols that we will focus on today are some of the more popular formats: HL7, DICOM, CCD & CCR.

HL7 File Protocol

Much like blood cells in the human body, HL7 messages are the lifeblood of healthcare data exchange. Established in 1987, Health Level Seven (HL7) is a non-profit organization whose mission is to “[provide] standards for interoperability that improve care delivery, optimize work flow, reduce ambiguity and enhance knowledge transfer among all of our stakeholders, including healthcare providers, government agencies, the vendor community, fellow SDOs and patients.”

In simpler terms, HL7 is a standard file protocol that care providers use to share patient data. HL7 messages are broken into specific types, each relating to a specific event within a patient record, also known as a trigger event:

  • ACK — General acknowledgment
  • ADT — Admit discharge transfer
  • BAR — Add/change billing account
  • DFT — Detailed financial transaction
  • MDM — Medical document management
  • MFN — Master files notification
  • ORM — Order (pharmacy/treatment)
  • ORU — Observation result (Unsolicited)
  • QRY — Query, original mode
  • RAS — Pharmacy/treatment administration
  • RDE — Pharmacy/treatment encoded order
  • RGV — Pharmacy/treatment give
  • SIU — Scheduling information unsolicited

Each one of these trigger events is created by a hospital system and will need to be shared not just across internal systems, but also with hospitals, HIEs, physician groups, clinical labs, etc. that may reside outside a healthcare provider’s network. Not every message type is relevant to every application, and many hospitals that maintain dozens of systems will leverage HL7 routing engines to deliver messages to the appropriate destinations, as sketched below.
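
As an illustration, the heart of such a routing engine can be pictured as a lookup from the message type in the MSH header segment to a list of destination systems. This is a minimal sketch with hypothetical destination names; a production engine would also handle acknowledgments, retries, and the normalization discussed below.

    # Hypothetical destinations, keyed on the message type found in MSH-9
    # (e.g. "ADT^A01" routes as "ADT").
    ROUTES = {
        "ADT": ["emr", "billing"],   # admits/discharges fan out widely
        "SIU": ["scheduling"],
        "ORU": ["emr", "lab_archive"],
    }

    def destinations(hl7_message: str) -> list[str]:
        msh_fields = hl7_message.split("\r")[0].split("|")  # MSH is always the first segment
        message_type = msh_fields[8].split("^")[0]          # MSH-9, e.g. "ADT^A01"
        return ROUTES.get(message_type, [])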

While the HL7 message protocol is a standard widely adopted by healthcare providers, it is sometimes seen, as Stephane Vigot of Caristix puts it, as a “non-standard standard.” What Mr. Vigot means is that even though the protocol specifies syntax and message headers for identifying pertinent information, different systems may use different templates. Take patient “sex,” for example: one hospital may register a patient as either male or female, while another may have up to six attributes relating to the patient’s sex. As a result, when systems are integrated, HL7 messages need to be normalized so that each system knows where to look for the information.

Version 2.x vs Version 3

Probably the most important thing to know about HL7 version 2.x vs. version 3 is that the latter has not yet been embraced by the healthcare industry. Version 2.x is a textual, non-XML file format that uses delimiters to separate information. Version 3, on the other hand, is an XML-based file format.
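
To give a feel for the 2.x format, here is a short, entirely synthetic ADT admit message; every identifier in it is fabricated. Pipes separate the fields within each segment, carets separate components within a field, and the patient “sex” field discussed above is the lone “F” in the PID segment:

    MSH|^~\&|ADT_SYSTEM|GENERAL_HOSPITAL|EMR|GENERAL_HOSPITAL|20110301123000||ADT^A01|MSG00001|P|2.3
    EVN|A01|20110301123000
    PID|1||123456^^^GENERAL_HOSPITAL||DOE^JANE||19620101|F|||123 MAIN ST^^METROPOLIS^NY^10001
    PV1|1|I|MED^201^A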

DICOM

DICOM stands for Digital Imaging and Communications in Medicine. Like HL7, DICOM is a file format for exchanging patient data, but it is used in conjunction with systems that exchange medical images. DICOM is the file protocol of choice for PACS (Picture Archiving and Communication Systems). Each data element within a DICOM message carries a Value Representation (VR) that defines the type of data it contains.

Continuity of Care Document (CCD) & Continuity of Care Record (CCR)

These two documents perform very similar functions and are considered summary documents. Both CCD and CCR are XML-based documents that provide a summary of a patient’s healthcare history. Included in a CCD or CCR document is a human-readable section that covers the patient’s care history as well as pertinent patient information such as demographics, insurance information, and administrative data.

The major difference between the two revolves around how closely each is tied to HL7 standards and how easily each fits into the current workflow of a particular health IT system. While some see CCD and CCR as competing standards, Vince Kuraitis of e-CareManagement argues that “the CCD and CCR standards are more complementary than competitive.” His opinion rests on the “right tool for the job” metaphor; in his view, HIEs’ adoption of CCD does not by itself settle the question.

Summary

Integration and interoperability need file protocol standards, and as the healthcare IT industry keeps evolving, many of the ambiguities of the current standards will eventually (hopefully) be normalized and conformity will prevail. In the meantime, HL7 2.x, DICOM, and CCD/CCR are here to stay and will continue to be the lifeblood of integration and connectivity.

Part 1 | Part 3


Healthcare Integration & Interoperability — A Mini Series

Inspired by my trip to HIMSS last week, I thought it made sense to talk about healthcare interoperability, connectivity, and the component pieces that make them happen. This mini series is broken into several parts that will cover:

  1. What is connectivity and interoperability?
  2. File protocols, formats, and requirements, i.e. HL7 (including a discussion of version 2 vs. 3), DICOM, CCD, and CCR;
  3. Transport protocols and interfaces: MLLP, TCP/IP, FTP, etc.

Part 1: What is Healthcare Integration and Interoperability?

According to HIMSS, healthcare integration “is the arrangement of an organization’s information systems in a way that allows them to communicate efficiently and effectively and brings together related parts into a single system.”

The 2006 White House executive order defines interoperability (section 2, paragraph c) as:

“Interoperability” means the ability to communicate and exchange data accurately, effectively, securely, and consistently with different information technology systems, software applications, and networks in various settings, and exchange data such that the clinical or operational purpose and meaning of the data are preserved and unaltered.


These are great standard definitions and allow you to understand the difference between the two. Integration relates to how systems can work or collaborate for a common purpose, e.g. a patient management system working with a scheduling system. Interoperability speaks to how these systems are connected, in order to provide a continuous flow of information that improves care for the patient.

In order to achieve Interoperability, systems must be connected in a secure way, authenticating all users and allowing one healthcare application to share data with another anywhere in the country, without compromising a patient’s privacy.

In a real-world scenario, what this means is that all systems must be integrated in order to achieve interoperability: a physician’s patient management system must be able to authenticate and securely connect to a hospital’s EMR; ambulatory centers to pharmacies; hosted EMRs to wound treatment centers. Patient information can no longer live in just one place.

Interoperability Dimensions

(As defined by HIMSS)

  • Uniform movement of healthcare data
  • Uniform presentation of data
  • Uniform user controls
  • Uniform safeguarding data security and integrity
  • Uniform protection of patient confidentiality
  • Uniform assurance of a common degree of system service quality

No Small Task

Connecting all of these systems is no small task, and it is as much an organizational challenge as a technological one. People and healthcare systems no longer exist in a vacuum, and teams need to collaborate to make integration projects happen. These same people will need to agree on the best way to solve the connectivity problem and rely on the guidance of Health Information Service Providers to come up with solutions that meet the needs of all while adhering to the mission of improving patient care. As we continue to move toward interoperability, the scope and magnitude of what needs to happen cannot be overstated, and careful planning must take place.

Throughout the mini-series, we will discuss the component pieces that are involved in achieving interoperability including application interfaces, file protocols, transport protocols, security & authentication, and compliance.

The Goal

Integration and Interoperability are significant pieces of the Meaningful Use objectives, and the mission is to improve the care of individuals while providing them with secure, ubiquitous access to their health information. While there is no single way to solve the challenge of interoperability, understanding the mission and the various parts of the goal can help make connectivity, as prescribed by the ONC and Meaningful Use, achievable.

Healthcare Interoperability Panel Discussion

Part 2 | Part 3


HIMSS — What You Would Have Learned If You Went

February 25, 2011 / CloudPrime, Healthcare / 0 Comments

I think Ascendian’s CEO Shawn McKenzie’s interview is a great summary of HIMSS 11 and what is happening in Healthcare IT:

If you don’t have time to watch videos at work, I will sum it up the best I can:

Widgets, lots of them. Mostly unimportant.

Shawn makes a great point that there is no real plan for Healthcare IT and interoperability. Instead (as we have commented before) there is a focus on EHRs and on building “widgets” for healthcare professionals, which essentially creates healthcare “silos.” While there is a ton of innovation on the practice side, very little is going into interoperability, and the traditional medieval VPN solution for connectivity still reigns.

After walking the floor of HIMSS for days, we learned on our own how true this was. Most EMR and EHR vendors didn’t care about interoperability and were content to tell us it was the customer’s problem. This seemed odd to us in two ways: first, the idea is to solve customer problems, not ignore them; and second, as a business, they are leaving opportunity on the table.

The Direct Project also had a showcase that demonstrated interoperability, but it was not clear who should be interested and why.

Once people realize that connectivity and interoperability are a big issue, they will also realize that the old way of doing things will not be sufficient. Real investment is needed in new technologies that utilize the Cloud and provide real solutions to the connectivity and interoperability problem. To borrow from Mr. McKenzie again, what we have now is the coal but not the train or the tracks.