
Tuesday, January 31, 2012

Medical Imaging Exchange

As anyone in the imaging market knows, the medical imaging market is not dominated by any single vendor, even if it is quite a bit less fragmented than the EHR market.  Look at these reports on PACS and modalities (e.g., X-Ray, CT, MRI, Ultrasound, et cetera).  It's fairly common for an organization to have one vendor's PACS solution, another's modality, and a diagnostic workstation from a third.

The reason that these different systems work together inside an institution is that they adhere to the DICOM standards.  But there is a little bit more to it.  It is also because someone has taken the time to ensure that they are configured appropriately to work together.  Once you get outside the walls of the organization, though, interoperability needs more attention.  The number of systems you want to be able to work with is even broader.

Let's look at a common use case (the one that John Halamka wrote about earlier today):  Getting an imaging study done, and having it be used at a different organization or facility.  The most common mechanism for exchange of the study is to create a CD.  Every time a family member of mine has had a significant imaging study done, we ask for the data on a CD before we leave, and we know to do so as early on as we can to simplify our lives.  Every CD I have ever received has included a viewer on it.  Only one included both Mac and Windows viewers.  Others just included a Windows viewer.  Those viewers are not meant for diagnostic quality viewing.  They are really just simple programs that allow a provider without any other viewing technology to see what is on the CD.  Any provider that I've ever really needed to give a CD to has had better viewing technology than what was on the CD.

Importing a CD from a foreign institution is not 100% successful, because not everyone complies with the standards.  David Clunie (considered by many to be the father of DICOM) talks about this in a paper he presented at RSNA in 2010.  Most of the CD import failures are attributable to NOT following the DICOM standard, or other relevant specifications.  The Great Aussie CD Challenge (pdf) tested more than two dozen CDs for standards conformance back in 2007 (an eternity in Internet years), and came to similar conclusions with regard to conformance.  I'd love to see a more recent repeat of this exercise using newer technology, because the situation has improved (PDI was first introduced as an IHE profile in 2005).

This study showed that nearly 80% of CDs can be imported successfully.  There are standards and profiles which could improve the success rate (and which were used in the CD challenge previously mentioned).  The IHE Portable Data for Imaging Profile was strongly recommended by this Johns Hopkins study, in Australia (as a result of the challenge), and by the AMA.

This report shows the number of organizations that support PDI in their products.  It is an amazing list, which includes all of the vendors previously named in the market share reports at the top of this post.
What it doesn't show, and what is even more important, is how many provider organizations are actually using those capabilities.  As Dr. Clunie's paper indicates, it is very likely that organizations aren't configuring their systems to USE the AVAILABLE standards and profiles.  One question they need to address is what the value is for them in doing so.  As John Moehrke points out in this post from last year, getting your "Damn Data" seems to require a mandate to do so interoperably.  This isn't an implementation issue with respect to standards, but rather a deployment one.  I implemented PDI in 2008, and actually did so on top of the XDS-I specifications.  It was one of the simplest standards-based integrations I've ever done.  Getting someone to want to use it?  That's a different story.

Exchanging data on CD isn't ideal.  There are a number of issues beyond dealing with the data in a standard format that also have to be addressed, many of which Dave Clunie points out in his RSNA slides.  Others are addressed in this 2008 Radiographics article by Mendelson and others.  These include things like patient identification, translation of procedure codes used at different organizations, bandwidth and storage issues, et cetera.

Ideally, you'd think it would just be a matter of transmission of the study from the organization where the study was done to the organization where it is going to be reviewed.  Pragmatically, that doesn't work where we live because the value of that "network" isn't perceived by anyone other than the patient, unless the imaging center and the reviewer are somehow financially connected. Payers do see the value, and where I live, the possibility that a payer network would be established to support such an imaging exchange is certainly viable, but has yet to be realized.

Point-to-point exchange (e.g., via Direct) wouldn't really have worked in one case for one of my family members, because we didn't know who the specialist performing the surgery would be until long after the image was taken (but reading the CD was really simple).  You need something like the networks in Philadelphia and New Jersey, or in Canada, described in the Radiographics article above.  These networks are built upon the IHE XDS for Imaging protocols.  Both IHE and NEMA are also working on the next generation of web-service enabled protocols to support more advanced image sharing.

John Halamka proposes the use of the Direct protocol to exchange images.  The Direct project was designed to support clinical data, which is several orders of magnitude smaller than imaging study data.  This article documents the size of some typical imaging studies (see Table 3).  These range from 15 MB to as much as 1 GB (for CT angiography).  Moving forward into the world of advanced digital imaging for pathology, you could get into the terabyte range (that's more than 1,000 CDs, a hundred DVDs, or a double handful of Blu-ray disks).  Clinical data, such as that in a CCD, takes from a few KB to maybe a couple of MB if someone has abused the standard to provide a full data dump.
Anyone who's had any experience sending multi-megabyte files over e-mail will probably realize the limited viability of Direct for anything other than simple studies (e.g., X-Ray or Ultrasound).  The reality is that providers using the "simplest of EHR systems" probably won't have the storage capacity available to deal with medium-to-large studies.
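To make the mismatch concrete, here's a back-of-the-envelope sketch.  The 25 MB attachment cap is an assumption (limits vary by mail system), and the study sizes are rough figures in line with the ranges above:

```python
# Back-of-the-envelope: how many e-mail messages would it take to move
# an imaging study, if each message can carry at most one 25 MB payload?
# The cap is an assumed, typical mail-server limit; sizes are rough.
import math

ATTACHMENT_CAP_MB = 25  # assumed per-message limit

study_sizes_mb = {
    "chest X-ray": 15,                      # small, simple study
    "CT angiography": 1_000,                # ~1 GB
    "digital pathology (worst case)": 1_000_000,  # ~1 TB
}

for study, size_mb in study_sizes_mb.items():
    messages = math.ceil(size_mb / ATTACHMENT_CAP_MB)
    print(f"{study}: {size_mb} MB -> {messages} message(s)")
```

Even at the optimistic end, a single CT angiography study would take dozens of messages; pathology would take tens of thousands.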

One of the rationales for the way DICOM is structured is that you don't need all the data all of the time.  Studies are large, but what the provider often needs to view is much smaller.  The tie-in to transport in the DICOM standard was designed to address this issue, transporting only what is needed, when it is needed, for viewing.  That's a much more efficient use of network bandwidth, and it still gives providers what they need, which is, according to the AMA, access to the complete diagnostic-quality study data.
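You can see this fetch-only-what-you-need model in DICOM's web access mechanism (WADO), which lets a client request a single rendered image from a study rather than the whole thing.  A sketch of building such a request; the endpoint and UIDs below are invented, while the parameter names (requestType, studyUID, seriesUID, objectUID, contentType) come from the WADO specification:

```python
# Sketch: request ONE rendered image from a study via DICOM's WADO-URI
# mechanism, instead of transferring the entire multi-hundred-MB study.
# The host and UIDs are placeholders, not a real service.
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="image/jpeg"):
    """Build a WADO-URI request for a single object within a study."""
    params = {
        "requestType": "WADO",        # fixed value per the WADO spec
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,  # ask for a rendered image
    }
    return f"{base}?{urlencode(params)}"

url = wado_uri(
    "https://pacs.example.org/wado",  # hypothetical endpoint
    "1.2.840.113619.2.1.1",           # made-up study UID
    "1.2.840.113619.2.1.2",           # made-up series UID
    "1.2.840.113619.2.1.3",           # made-up object UID
)
print(url)
```

The point isn't this particular URL scheme; it's that the standard already supports retrieving a single image on demand rather than shipping the whole study.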

With regard to proprietary data and viewers: there is no need to physically separate proprietary data in the image stream, but there is a need to tag it so that a "vendor-neutral viewer" can process the data, and DICOM does that.  Every proprietary data item in a DICOM-conforming object is clearly tagged as such, and if you read the vendor's DICOM conformance statement, those tags are documented.  And if they conform to IHE's PDI profile, they will also have a DICOM conformance statement.  If the system conforms to the DICOM standard, you don't need to understand the proprietary data to view the image.  There are really good reasons why some systems use proprietary data and store it within the DICOM study.  But that data should not impact a standards-conforming viewer.
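The tagging rule is simple: in DICOM, private (proprietary) data elements live in odd-numbered groups, so a viewer can recognize and skip them without understanding their contents.  A minimal illustration of that odd-group rule (the vendor private tag shown is made up):

```python
# Sketch: how a vendor-neutral viewer can recognize proprietary data in
# a DICOM object.  Per the DICOM standard, private data elements use
# odd-numbered group numbers, so a conforming viewer can detect (and
# safely skip) them without understanding what they contain.

def is_private_tag(group: int) -> bool:
    """True if a DICOM (group, element) tag belongs to a private group."""
    return group % 2 == 1  # odd group number => private data element

# A few example tags: (group, element, description)
tags = [
    (0x0010, 0x0010, "Patient's Name"),          # standard
    (0x7FE0, 0x0010, "Pixel Data"),              # standard
    (0x0009, 0x0010, "vendor private creator"),  # made-up private tag
]

for group, element, name in tags:
    kind = "private" if is_private_tag(group) else "standard"
    print(f"({group:04X},{element:04X}) {name}: {kind}")
```

This is why a standards-conforming viewer never needs the vendor's proprietary knowledge just to display the image: the pixel data lives in a standard (even-numbered) group, and everything private is self-identifying.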

We don't need a new standard to address most of what John complains about in his blog post.  What we need is for organizations to actually USE the standards which are already widely available, and to configure their systems to do so.  That would address a large part of the existing problem.  Canada has certainly shown with their deployment that XDS-I works for regional imaging exchange.  Regarding the companies that John mentioned on his blog, all three support XDS-I in their solutions.

The one case where a new standard is needed is in the interchange across distributed networks of DICOM study data.  That is work that DICOM has already started.  I should note that the DICOM standard applies to regulated medical devices. I will happily leave the development of the new DICOM standards to experts in diagnostic imaging.  I don't think throwing a lot of non-expert volunteers at the problem will make that work go any faster, or produce better outcomes.



  1. Hi Keith

    Excellent post.

    But please don't refer to me as the "father of DICOM"; I was relatively late joining the process, certainly after it was first released in 1993.  I may be relatively vociferous in my evangelism, but other people deserve the credit for developing it in the first place.


    1. I didn't say it, others did. Like all things DICOM, I have to leave that evaluation to the experts ;-)