This story is about a VisiCalc moment that happened to me in 2004. If you don’t get the reference, it refers to the moment when the early adopters of VisiCalc realized what this new invention made possible.
This story starts in about March of 2003, when the company I worked for back then decided that it needed to become a leader in healthcare standards. I had been involved with W3C standards at a previous employer, where I worked with three or four different co-chairs of various XML-related standards, and I had been working with XML and SGML since about 1993. So, I joined HL7 and started participating in the weekly t-cons of Structured Documents, which was working on CDA Release 2.0 at that time.
In about June or July of that year, IHE and HL7 announced that they would be doing a Joint Connectathon and a Demonstration at HIMSS in February of 2004. The initial call describing this activity had about 50 people from 30 different vendors, and was led by Steve Moore and Liora Alschuler.
Steve works for the Mallinckrodt Institute of Radiology, and is an avid Cubs fan. His team creates many of the MESA testing tools used by IHE to test implementations of its profiles. Connectathon participants sometimes refer to Steve as "Mother", because he tells you in no uncertain terms what to do, when, where, and what the punishment will be if you don't do it. His job is worse than herding cats; it's more like rounding up all the animals in the zoo after someone left their cages open, and he does it very well.
Liora figures prominently in this story, as she is the mother of two important developments that appear within it. She runs a consulting firm out of Vermont that focuses on implementation of HL7 standards, especially the Clinical Document Architecture or CDA. She is also one of the co-chairs of the HL7 Structured Documents Technical Committee that developed the CDA standard.
Being demonstrated that year were four new profiles from the newly created IT Infrastructure Technical Committee. Leading that domain are two of the more prominent faces of IHE, Glen Marshall and Charles Parisot, who are also two of my mentors.
Glen Marshall used to work in the Standards and Regulatory area within Siemens, and is now a consultant and occasional blogger. He is a science geek from way back. In his early high school years he won a prominent science fair with his expertise in chemistry. He got his start in healthcare IT very early as a night computer operator for a hospital. In addition to his IHE leadership responsibilities, Glen is one of the foremost experts in Healthcare Security Standards. There are about 20 people on the planet with his degree of experience. He also plays guitar, and can usually tell you where to find good Sushi. He makes a point of mentoring new talent within IHE. One of the more important lessons I learned from Glen is how to get Charles to stop talking for a minute.
Charles Parisot (a colleague at GE) also appears prominently in this story. He's one of those brilliant people who can keep track of thousands of things at once, and does so all around the globe. While he lives in France, he spends about half of each year in the US, and a significant fraction of his life has been spent inside an airplane. Charles is not only "Mr. IHE" to some, but is also considered to be one of the fathers of the DICOM standard. He keeps track of not only ITI activities (he is presently co-chair of the ITI Planning Committee), but is also active in other IHE domains, HL7, ISO, HITSP, EHRVA and just about anywhere else that will listen to him. He's been engaged in US and international projects all over the world.
So, back to the four profiles in the ITI Technical Framework:
Consistent Time (CT)
Enterprise User Authentication (EUA)
Patient Identifier Cross Referencing (PIX)
Request Information for Display (RID)
My employer at that time decided to implement RID to display lists of CDA documents and individual CDA documents, as well as the Consistent Time profile. While I was busy pre-testing my application before the Connectathon (an IHE requirement), Glen, Liora, Steve and Didi Davis were planning the demonstration. Didi is a fellow biker and blogger, and also one of the leaders behind the growth of the IHE Interoperability Showcases at HIMSS over the last five years.
During the process of planning the demonstration, we wanted to highlight the benefits of CDA for interoperability between EHR systems. Reenter Liora, with a twinkle in her eye and a novel idea. The National Institute of Standards and Technology (NIST) maintains a registry of HL7 modeling artifacts, under the direction of Lisa Carnahan. Working for her were Bill Majurski and his small team of developers. The idea was to send a simple message to a registry, and let it maintain a list of documents available from a number of different systems.
Because the idea came so late, we were scrambling to implement it in the weeks before the Connectathon, and Steve Moore was tearing out his hair.
The Connectathon was held in a hotel ballroom in San Diego. We spent five days inside that ballroom when most of us would rather have been at the beach. We managed to get past the testing and make it all work, but mostly at the last minute.
Then comes the showcase. It was a disaster for those of us presenting products. We had done all of this work to demonstrate our products, and who was getting all of the attention? Bill Majurski and NIST, because they had a web page that would show all of the clinical content from 10 different EHRs. It was so bad that we had to move his booth, which took 10 engineers: 9 to move the table, and 1 to ensure we wouldn’t get caught by facilities staff. We barely avoided ripping out the network cable and the carpet. This was my “VisiCalc” moment for Healthcare IT.
The next year, IHE developed that demonstration into the Cross Enterprise Document Sharing profile. Charles led the drive within IT Infrastructure to make that happen. Bill and I and a few others spent a great deal of time editing that profile, and many others contributed to the content.
When we demonstrated the profile the next year at HIMSS, an industry analyst approached Bill Majurski and told him he’d invented a $2B-a-year industry. I’ve been tracking the data ever since he told me that story. I was able to count $2B in the US alone for the HIE industry based on HIMSS Analytics data in 2008. The last number I heard was $12B, and I don’t know whether that’s US or international, but I suspect the former.
XDS is now used around the world.
Friday, April 30, 2010
What is the NHIN
John Moore of Chillmark Research asked for this one:
What is the NHIN, NHIN Connect and NHIN Direct, and the differences between them?
NHIN stands for the National Health Information Network. But the NHIN is not really a network; rather, it is a concept describing the infrastructure needed to connect healthcare providers from Maine to California to Alaska to Hawaii to Alabama to ...
A better name for NHIN would be the National Health Information Infrastructure, or NHII. At least that's what we called it in 2004. Some of this is covered in a post I did on the history behind ARRA.
The NHIN has been described as the backbone for exchange, much like the Interstate Highway infrastructure. We really already have the necessary infrastructure: the Internet. What the NHIN really did was specify the rules of the road for traveling on the healthcare interstate.
In 2006, the newly created Office of the National Coordinator issued an RFP to test (pilot) technologies that would be used to connect healthcare providers across the states of this country. I'm not sure why, but they used the name NHIN for this program, rather than showing continuity with the NHII work that had gone on before. Four organizations were awarded contracts for this NHIN pilot project.
Subsequently, in 2007-2008, a new RFP was issued and awarded by ONC across 11 different healthcare organizations to support NHIN implementations. The federal health architects across the federal agencies realized that they needed a platform to help agencies and organizations connect to these NHIN implementations. That effort became an open source software project known as NHIN Connect. NHIN Connect provides the software you need to get on the highway and follow the rules of the road. It's been called the onramp to the NHIN.
Finally, we have NHIN Direct. To get from my home to my doctor's office, I never go near the highway. To get from my doctor's office to one of my specialists, I still need to travel from the office parking lot to the interstate. I drive differently on these back streets and local highways than I do on the Interstate; the rules are different there. The same is true for the small practice. In order to connect to their colleagues and to their patients, they need a different infrastructure. That infrastructure needs to be something that they can purchase from Best Buy, or sign up for over the web, using the stuff they already have, to allow them to connect up to the NHIN. NHIN Direct is the way that providers can connect to others without having to be aware of the gravel, concrete and steel that they are driving over. They just want to get into their car and go.
Thursday, April 29, 2010
IHE Week
Normally I would still be in the IHE PCC meeting, but given that @MassGoverner's #MAHIT conference starts today, I'm sitting in a hotel lobby in Boston. I spent the last three days in Oak Brook, Illinois discussing IHE Patient Care Coordination profiles that are being prepared for public comment.
We had published the Perinatal Workflow profile for public comment and got some feedback, but because we published off-cycle, we didn't get as much feedback as we would like. So, it will be revised and republished for public comment back on cycle. Hopefully we will receive more comments later.
On Tuesday the ITI, PCC and Quality, Research and Public Health domains were introduced to the Image Enabled Office profile being developed by the Cardiology domain. This profile reuses existing work similar to the way that the Perinatal Workflow profile does. I expect it to be published for public comment in a couple of weeks and will post the announcement when I receive it.
The Patient Centered Coordination Plan being developed by several members down under is getting a lot of attention. This profile allows a "Coordination Plan" to be shared with providers, and enables those providers to report back to the Care Coordinator on the healthcare tasks they've taken on. Written initially to address coordination of care for chronically ill patients, this profile will support many kinds of case management workloads. The Public Health contingent from QRPH was very interested in this work.
Several new members came to the PCC meeting this week, and because they arrived later in the day, didn't get the benefit of our usual introductory presentation. We clearly have some work to do to explain our processes. We know how we develop CDA profiles, but the tools to make that easy are still under development in HL7. The HL7 Templates registry project should be a huge lift here (when I get further along with it, I'll write a post on that). I got a lot of valuable "voice of the customer" feedback on what that registry UI needs to look like for template developers.
We've been talking about holding our February 2011 meeting in Canada, it being not quite as inaccessible to many of our US contingent as other parts of the world, but also as a way to introduce them to the idea that IHE is international, and that they should plan for some international travel.
I'll be tweeting today from Governor Deval Patrick's HIT conference in downtown Boston. Look for #MAHIT on Twitter.
Wednesday, April 28, 2010
This is a primer on SOAP and REST
I’ve written it in an interleaved style so that you can compare and contrast these competing technologies for yourself. Using the best tool for the job has been a consistent theme of this blog from my second post. I’ve tried to be as objective as I can. I’ve used both SOAP and REST. They both are good tools. I like the automation and compositional extensibility that comes with SOAP, and the simplicity and layering that comes with REST, and would love to see all these features in one nice “standard” package.
SOAP stands for Simple Object Access Protocol. This is intended to be a lightweight protocol to access objects over distributed networks such as the Internet. SOAP operates by transferring messages from a sender to a receiver, possibly through one or more SOAP intermediaries. An intermediary is both a receiver and a sender of a message. It acts much like a router or switch in a network in that it receives a message and forwards it along to its final destination, possibly providing additional services at the same time. Senders, receivers and intermediaries are different types of nodes in the path that a message takes. SOAP nodes can be communicated with via Internet protocols such as HTTP or SMTP. The mechanism by which this communication is established is defined by binding the SOAP end point address to a protocol. HTTP is the most commonly discussed protocol that can be used with SOAP.
REST stands for REpresentational State Transfer. This is intended to be a lightweight way to access resources over distributed networks such as the Internet. The intermediaries of REST are the existing intermediaries of the Internet that use those protocols, including gateways and caches. While intermediaries can provide additional services along the way, those services are typically the existing set of services that these intermediaries provide to users of those existing protocols. RESTful services can be defined in terms of existing Internet protocols such as FTP, HTTP and SMTP.
SOAP messages define an operation on an object. The message body and operation name provide the signature of the message. This signature is used to identify the method to be performed on the object being accessed. Thus, SOAP can define any number of methods on an object.
A REST message is communicated using existing operations defined in the Internet protocols that it is transmitted over. HTTP is the most commonly discussed protocol with respect to REST. The PUT, GET, POST and DELETE operations of HTTP support the creation, retrieval, update and deletion of resources. These are the four basic functions on persistent storage, commonly (and not always pejoratively) referred to as CRUD.
The messages in the SOAP protocol have two components: The optional message header contains information that may be used or updated by intermediaries along the message path. The body of the message is intended for the final SOAP receiver in the message path. The header and the body appear in the SOAP envelope. All of the content in SOAP is expressed in an XML document starting with the SOAP envelope. The header can be used to control the kind of processing that SOAP intermediaries perform along the path to the receiver. Blobs can be communicated by attaching them to the SOAP message and identifying them in the XML content of the message body.
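To make that structure concrete, here is a minimal sketch of a SOAP 1.2 message. The envelope, header and body elements come from the SOAP specification; the ex:Routing header block, the ex:RetrieveDocument operation and their namespaces are hypothetical, invented only to show where intermediary-directed and receiver-directed content goes.
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Header>
    <!-- Header blocks carry information for intermediaries along the message path. -->
    <ex:Routing xmlns:ex="urn:example:headers" soap:mustUnderstand="true">
      <ex:Destination>urn:example:receiver</ex:Destination>
    </ex:Routing>
  </soap:Header>
  <soap:Body>
    <!-- The body is intended for the final SOAP receiver; it names the operation and its inputs. -->
    <ex:RetrieveDocument xmlns:ex="urn:example:operations">
      <ex:DocumentId>Document01</ex:DocumentId>
    </ex:RetrieveDocument>
  </soap:Body>
</soap:Envelope>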
The messages in REST are blobs that are represented in any MIME type that can be communicated over the existing Internet protocols. RESTful services using HTTP often use XML or JSON as the message content being communicated, but this is not a requirement of a RESTful service.
The messages in SOAP describe operations being performed on an object (resource). These operations can change the state of the object. Most often, SOAP senders and receivers represent the two end-points in a client-server relationship, where the server maintains the state of the object being accessed by the client. Operations can be thought of as the methods of the object, and SOAP as a mechanism to call these object methods remotely. Thus, SOAP most often resembles a “remote procedure call” on an object. The operation being performed on the object is identified in the SOAP message exchange in the MIME and/or SOAP header communicated during the exchange. The SOAP body helps to identify the “method signature” so that the right procedure is called on the server based upon the message inputs. The same service endpoint may be used to perform different operations. Service endpoints are Internet addresses (URLs).
The messages in REST result in the transfer of the state of a resource (object) from the server to the client. At the conclusion of the exchange the resource (object) is “at rest”. Resources are identified by Internet addresses (URLs). The server is not required to maintain the state of an object across service calls.
SOAP requires specialized intermediaries to provide additional services, and the kinds of additional services being provided are limited only to the creativity of developers providing those intermediaries. REST does not require specialized intermediaries, but the kinds of additional services provided to a RESTful client are often limited by services already offered along the communications path between the client and the server (e.g., caching, routing or gateways).
The existence of the SOAP header has allowed the development of specialized profiles of the SOAP standard to define specific header elements that support specific services such as addressing and routing, access control, authentication, message encryption, reliable messaging, et cetera. Thus, the SOAP header allows for new services to be created by composition of intermediaries and receivers. For example: A header element containing a SAML assertion appearing in a SOAP message can be used to communicate user identity information. An intermediary service can perform appropriate authentication and access control checks before passing the operation on to its final destination, logging the result.
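As a sketch of that example (assuming WS-Security as the header profile and a SAML 2.0 assertion, with the assertion's contents and required attributes elided), the header might look like this:
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Header>
    <wsse:Security
        xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <!-- The SAML assertion communicates the identity of the user making the request. -->
      <saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
        <!-- Issuer, Subject and AuthnStatement elided -->
      </saml:Assertion>
    </wsse:Security>
  </soap:Header>
  <soap:Body>
    <!-- operation payload, as in the earlier envelope sketch -->
  </soap:Body>
</soap:Envelope>
An intermediary that understands this header can authenticate the request and log the result without ever needing to understand the operation carried in the body.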
Similar capabilities can also be provided through REST, but there is no defined mechanism in REST to enhance a service by composition through intermediaries. In fact, the whole notion that an intermediary is present is hidden from the end user of a RESTful service. Other services can implement their capabilities by using the services of other RESTful services. For example, the Twitter message that many received notifying them of this post was generated by a server that drew on the capabilities of two RESTful services (the Atom feed from this blog, and the RESTful interface of Twitter).
SOAP and REST operate at two different layers of the network stack. SOAP tells you how a message is wrapped and can be extended. REST says nothing about the message, but talks about how the resource is identified and communicated with.
SOAP is a standard. REST is more like a philosophy. They both provide a remarkably similar set of capabilities.
Coming Soon: WSDL and WADL
P.S. Thanks to Brian Ahier for the topic suggestion.
Tuesday, April 27, 2010
What to look for in a CDA Developer
What skills make for a good CDA developer?
The key skill that a CDA engineer will have is a great deal of familiarity with structured documentation. That doesn't necessarily mean "clinical documentation". My own background includes experience with SGML, XML and HTML well before I ever encountered the healthcare domain and CDA.
Expert use of CDA requires expert skills in structured documentation and some knowledge of the healthcare space. In my own experience, the former is harder to impart to an engineer than the latter. If I were to rank the importance of familiarity with certain technologies, I'd make HTML the least important, XML in the middle, and SGML the most important indications of experience with structured documentation.
Engineers with SGML experience will usually have a pretty good understanding of "transformational" and validation technologies, which are essential for effective generation and use of CDA. Those with XML experience will be pretty familiar with XSLT and schema languages [W3C XML Schema, RelaxNG, or Schematron], which are specific examples of those technologies. I use a great deal of XSLT in CDA development (in fact, you might call it my preferred programming language).
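As a small illustration of the transformational side, here is a sketch of an XSLT 1.0 stylesheet that pulls the title and document-type display name out of a CDA R2 instance; it assumes only the standard urn:hl7-org:v3 namespace and nothing else about the document.
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:cda="urn:hl7-org:v3">
  <xsl:output method="text"/>
  <!-- Emit the document title followed by the display name of its type code. -->
  <xsl:template match="/cda:ClinicalDocument">
    <xsl:value-of select="cda:title"/>
    <xsl:text> (</xsl:text>
    <xsl:value-of select="cda:code/@displayName"/>
    <xsl:text>)&#10;</xsl:text>
  </xsl:template>
</xsl:stylesheet>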
Most people familiar with the disciplines behind structured documents will also have a good grounding in information retrieval and terminology (controlled vocabularies). These are two other key technologies that are associated with the use of the CDA standard.
I know one CDA consultancy that has hired a number of people with this background and that has been quite successful.
They got the memo...
Not too long ago, I helped HL7 to provide feedback on the Meaningful Use Certification Rule. I also made some comments on that proposed rule here.
One of the key points I helped to make in the HL7 feedback starts:
Finally, HL7 comments that the proposed rule does not make any provision for consultation with authoritative bodies with respect to interpretation of standards and implementation guides when such questions arise...
I recently received the following e-mail from a representative of NIST:
NIST is requesting HL7’s input on several 2011 ARRA Meaningful Use draft test procedures which reference HL7 standards. To orient you to the draft test procedures and the details of the request, NIST will host a webinar for you ________, with a follow-up discussion during the HL7 meeting in Rio. You are receiving this because you are either an HL7 WG Co-chair of a relevant HL7 WG or recognized as an SME for the relevant focus area (or both).
There are days when I want to hug government employees. Today is one of them. Way to go NIST!
Thursday, April 22, 2010
HL7 Ambassador Presentations on CDA and CCD
One of my roles in HL7 is to give "Ambassador Presentations" on some of the standards that I have had a direct role in developing. The HL7 Ambassador Presentations are short talks on various HL7 activities. The talks are at a high level and last about 20 minutes, leaving plenty of time for questions. The two presentations I give are on CDA and CCD (and can be combined into a one-hour talk). The presentations describe what these publications are, the business need they fill, how they work at a VERY high level, where and how they are being used nationally and internationally, and how organizations can learn more about them.
Recently I gave the CDA/CCD presentation to a packed room for the New England Chapter of HIMSS, and have already been asked to give it in a few other places in the region. I'll post dates when I know more.
HL7 offers these talks to organizations who are interested in having this information presented to their members. Other talks are also available on:
- An Executive Overview of HL7
- The HL7 Electronic Health Record Functional Model
- The HL7 Personal Health Record Functional Model
- HL7 and Service Oriented Architectures
- HL7 Clinical Genomics Pedigree Model
- Public Health and Emergency Response
If you happen to be hosting an event in the metro-Boston area (that's where I reside), I'm willing to travel short distances to present, schedule permitting.
Wednesday, April 21, 2010
A Quick Overview of the ebXML RIM objects in XDS Metadata
On NHIN Direct Content Packaging Workgroup calls and wiki page discussions, there have been several questions about XDS metadata. I'm posting a response to some of the issues raised here because it is of general interest to all users of IHE XDS and related profiles.
The IHE Cross Enterprise Document Sharing (XDS) profile is a protocol, used widely around the world, for sharing clinical documents in health information exchanges. This profile, along with its sister profiles in the ITI Technical Framework, Cross Enterprise Document Sharing using Reliable Messaging (XDR) and Cross Enterprise Document Sharing using Media (XDM), uses the same metadata to facilitate exchange.
This metadata is described using the ebXML RIM Standard. The original XDS.a protocol used ebRIM 2.1, but the newer version (XDS.b) uses the ebRIM 3.0 standard. The ebRIM standard was adopted by ISO as ISO Standard 15000-3: ebXML Registry Information Model.
The ebRIM object model (pdf) defines a number of fundamental data types and approximately 25 classes. Nine of these classes are used in an XDS Registry, but only seven are needed to communicate the metadata in the various IHE transactions. This metadata is all documented in volume 3 of the ITI Technical Framework (pdf). Of these seven classes, six derive from the ebXML Registry Object, and the seventh is the Slot class used to contain extensible metadata associated with Registry Objects.
A Registry Object has a few fundamental attributes including a name, description, status and a local identifier, and may be composed of additional slots and sets of classifications and external identifiers.
Slots are simply named string lists that can provide additional metadata for an object in a name and value list pair. A typical representation of a Slot in XDS metadata is:
<Slot name='XDSMetadataObject.name'>
  <ValueList><Value>Metadata Value</Value></ValueList>
</Slot>
External Identifiers in the ebRIM allow additional identifiers to be associated with a Registry Object. An External Identifier has a UUID that indicates the identification scheme being used. The XDS metadata includes a human readable name in the External Identifier to show which metadata element is represented. A typical representation of an External Identifier in XDS metadata is:
<ExternalIdentifier
    identificationScheme='urn:uuid:2e82c1f6-a085-4c72-9da3-8640a32e42ab'
    value='someExternallyMeaningfulIdentifier'>
  <Name>
    <LocalizedString value="XDSDocumentEntry.uniqueId"/>
  </Name>
</ExternalIdentifier>
Classifications allow Registry Objects to be organized in various ways. Classifications are most commonly associated with a controlled terminology. The ebXML registry notion of Classifications makes use of Classification Nodes that appear within a hierarchical (tree-structured) Classification Scheme. The Classification Node and Classification Scheme ebXML RIM objects are used internally by the XDS Registry but not by edge systems.
Most healthcare standards require that both the code and an identifier for the coded terminology be represented, and some also allow for human readable forms of the concepts to be exchanged. In XDS metadata, all three of these components are required in classifications. A typical representation of a Classification in XDS metadata is:
<Classification
    classificationScheme='urn:uuid:cccf5598-8b07-4b77-a05e-ae952c785ead'
    classifiedObject='objectID'
    classificationNode='codeValue'>
  <Name value='Display name for codeValue'/>
  <Slot name='codeSystem'>
    <ValueList><Value>identifier for code system</Value></ValueList>
  </Slot>
</Classification>
Documents and submissions are also organized (classified) by who created or submitted them. In this case, a controlled terminology is insufficient, since there are four different axes which can be used in this organization. These are the author's name, organization, specialty, and role with respect to the patient. These are simply represented as slots in a classification. These slots are named authorPerson, authorRole, authorInstitution and authorSpecialty. When the author Classification is present, at least one of these Slots must be included.
<Classification
    classificationScheme='urn:uuid:93606bcf-9494-43ec-9b4e-a7748d1a838d'
    classifiedObject='Document01' nodeRepresentation=''>
  <Slot name='authorPerson'>
    <ValueList><Value>AuthorName</Value></ValueList>
  </Slot>
</Classification>
The XDS family of profiles recognizes three kinds of objects that need to be registered: Documents, Folders and Submissions. Documents are external objects that have associated metadata. The Extrinsic Object is designed for this purpose in the ebRIM, and is how Document metadata is stored in XDS. Folders and Submissions are collections of objects that may have been submitted by multiple parties and also have associated metadata. The Registry Package object is designed for that function, and XDS uses it for the metadata of these objects.
These three XDS metadata objects have an optional Name and Description that provide the ‘title’ and descriptive text. While optional according to the IHE profiles, these are very strongly recommended. They are found in the <Name> and <Description> elements of the <ExtrinsicObject> or <RegistryPackage>.
These objects also carry an availability status that is managed by the registry and is reported during query operations. This is found in the status attribute of the <ExtrinsicObject> or <RegistryPackage>. Each metadata object within a registry has a (universally) unique identifier that identifies the metadata element. This identifier is found in the id attribute of the <ExtrinsicObject> or <RegistryPackage>. Finally, the mimeType attribute of the <ExtrinsicObject> element provides the MIME type of the content associated with this metadata.
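Pulling those attributes together, here is a sketch of what a document entry's Extrinsic Object might look like. The id, MIME type and Name/Description values are purely illustrative, and the status value shown is the ebRIM "Approved" status that a registry would report in a query response rather than something a source would submit.
<ExtrinsicObject id="Document01" mimeType="text/xml"
    status="urn:oasis:names:tc:ebxml-regrep:StatusType:Approved">
  <Name>
    <LocalizedString value="Discharge Summary"/>
  </Name>
  <Description>
    <LocalizedString value="Discharge summary for the most recent admission"/>
  </Description>
  <!-- Slots, Classifications and ExternalIdentifiers like the examples above appear here. -->
</ExtrinsicObject>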
To complete the XDS Metadata, you must associate Registry Objects with the Registry Packages they are members of, and classify the Registry Packages to describe them as Submissions or folders. The necessary classification for a submission set appears below, and assumes that a Registry Package exists in the submission with the identifier SubmissionSet01.
<Classification
    classificationNode="urn:uuid:a54d6aa5-d40d-43f9-88c5-b4633d873bdd"
    classifiedObject="SubmissionSet01"/>
The necessary association between that submission set and the Extrinsic Object representing Document01 follows:
<Association associationType="HasMember" sourceObject="SubmissionSet01"
    targetObject="Document01">
  <Slot name="SubmissionSetStatus">
    <ValueList><Value>Original</Value></ValueList>
  </Slot>
</Association>
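To show how these pieces hang together when they are actually sent to a registry, here is a rough sketch of the ebXML SubmitObjectsRequest used by the XDS.b Register transaction. It follows the simplified examples above: the rim and lcm namespaces are the ebRIM/ebRS 3.0 ones, the object ids are the illustrative ones used throughout this post, and most required attributes and metadata details are elided; this is not a complete, valid submission.
<lcm:SubmitObjectsRequest
    xmlns:lcm="urn:oasis:names:tc:ebxml-regrep:xsd:lcm:3.0"
    xmlns:rim="urn:oasis:names:tc:ebxml-regrep:xsd:rim:3.0">
  <rim:RegistryObjectList>
    <!-- One ExtrinsicObject per document being registered -->
    <rim:ExtrinsicObject id="Document01" mimeType="text/xml">
      <!-- document Slots, Classifications and ExternalIdentifiers -->
    </rim:ExtrinsicObject>
    <!-- The RegistryPackage describing the submission set -->
    <rim:RegistryPackage id="SubmissionSet01">
      <!-- submission set Slots, Classifications and ExternalIdentifiers -->
    </rim:RegistryPackage>
    <!-- Classify the RegistryPackage as a submission set -->
    <rim:Classification id="cl01"
        classificationNode="urn:uuid:a54d6aa5-d40d-43f9-88c5-b4633d873bdd"
        classifiedObject="SubmissionSet01"/>
    <!-- Tie the document to the submission set -->
    <rim:Association id="as01" associationType="HasMember"
        sourceObject="SubmissionSet01" targetObject="Document01">
      <rim:Slot name="SubmissionSetStatus">
        <rim:ValueList><rim:Value>Original</rim:Value></rim:ValueList>
      </rim:Slot>
    </rim:Association>
  </rim:RegistryObjectList>
</lcm:SubmitObjectsRequest>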
The table below is a simplified list of the XDS Metadata requirements for a registry submission (the same requirements are used for XDM). I have not included XDS Metadata attributes required by the ebXML RIM (e.g., mimeType, id or scheme identifiers), and have removed display names from the list, since every code in the metadata requires a display name; I treat a code and its display name as parts of the same object.
XDS Metadata Attribute | ebXML Type | Classification or Identification Scheme | Optionality | Data Type |
XDSDocumentEntry | ||||
author | Classification | a7058bb9-b4e4-4307-ba5b-e3f0ab85e12d | R2 | |
authorInstitution | Slot | R2 | XON | |
authorPerson | Slot | R2 | XCN | |
authorRole | Slot | R2 | ||
authorSpecialty | Slot | R2 | ||
classCode | Classification | 41a5887f-8865-4c09-adf7-e362475b143a | R | |
comments | Description | O | ||
confidentialityCode | Classification | f4f85eac-e6cb-4883-b524-f2705394840f | R | |
creationTime | Slot | R | DTM | |
eventCodeList | Classification | 2c6b8cb7-8b2a-4051-b291-b1ae6a575ef4 | O | |
formatCode | Classification | a09d5840-386c-46f2-b5ad-9c3699a4309d | R | |
hash | Slot | R | SHA1 hash | |
healthcareFacilityTypeCode | Classification | f33fb8ac-18af-42cc-ae0e-ed0b0bdb91e1 | R | |
languageCode | Slot | R | ||
legalAuthenticator | Slot | O | XCN | |
mimeType | ExtrinsicObject.mimeType | R | ||
patientId | ExternalIdentifier | 6b5aea1a-874d-4603-a4bc-96a0a7b38446 | R | CX |
practiceSettingCode | Classification | cccf5598-8b07-4b77-a05e-ae952c785ead | R | |
serviceStartTime | Slot | R2 | DTM | |
serviceStopTime | Slot | R2 | DTM | |
size | Slot | R | Integer | |
sourcePatientId | Slot | R | CX | |
sourcePatientInfo | Slot | O | ||
title | Name | O | ||
typeCode | Classification | 41a5887f-8865-4c09-adf7-e362475b143a | R | |
uniqueId | ExternalIdentifier | 2e82c1f6-a085-4c72-9da3-8640a32e42ab | R | OID or OID^identifier |
URI | Slot | R | URI | |
XDSSubmissionSet | ||||
author | Classification | a7058bb9-b4e4-4307-ba5b-e3f0ab85e12d | R2 | |
authorInstitution | Slot | R2 | XON | |
authorPerson | Slot | O | XCN | |
authorRole | Slot | R2 | ||
authorSpecialty | Slot | R2 | ||
comments | Description | O | ||
contentTypeCode | Classification | aa543740-bdda-424e-8c96-df4873be8500 | R | |
intendedRecipient | Slot | O | XON or XCN | |
patientId | ExternalIdentifier | | R | CX |
sourceId | ExternalIdentifier | 554ac39e-e3fe-47fe-b233-965d2a147832 | R | OID |
submissionTime | Slot | R | DTM | |
title | Name | O | ||
uniqueId | ExternalIdentifier | 96fdda7c-d067-4183-912e-bf5ee74998a8 | R | OID |
XDSFolder | ||||
codeList | Classification | 1ba97051-7806-41a8-a48b-8fce7af683c5 | R | |
comments | Description | O | ||
patientId | ExternalIdentifier | f64ffdf0-4b97-4e06-b79f-a52b38ec2f8a | R | CX |
title | Name | O | ||
uniqueId | ExternalIdentifier | 75df8f67-9973-4fbe-a900-df66cefecc5a | R | OID |
This table with the overview above should be sufficient for an engineer with some knowledge of XML and HL7 Version 2 data types to put together an XDS Submission.
Tuesday, April 20, 2010
What will happen to HITSP Specifications after April 30?
There are three different answers to this question. The short answer is that the HITSP specifications will not be disappearing from the web. There are two slightly longer answers:
The first longer answer is that work is under way in ANSI to determine what will be happening to this material from HITSP. A communication will be made to HITSP members before the expiration of the HITSP contract with ONC. I expect to hear something soon from ANSI/HITSP leadership on the status of the web site. Given that there are current activities supporting meaningful use making use of these specifications, including the NHIN Connect project, I do not expect them to disappear from the web.
The second answer is supported by the copyright notice appearing in all HITSP specifications and appearing below:
COPYRIGHT NOTICE
© 2010 ANSI. This material may be copied without permission from ANSI only if and to the extent that the text is not altered in any fashion and ANSI’s copyright is clearly noted.
This indicates that the specifications can be posted in their entirety on another web site. I have already downloaded all current specifications, and many prior versions, from the HITSP web site. If nothing official happens through ANSI, I promise they will become available from another location.
Monday, April 19, 2010
Policy Challenges & Infrastructure Requirements to Facilitate Patient/Consumers’ Meaningful Use of HIT
ONC is requesting feedback on Stages 2 and 3 for meaningful use. The HIT Policy Committee's Meaningful Use workgroup will be holding hearings on April 20th to discuss patient and family engagement.
Last week I posted my feedback on the questions to the first and second panels. Today I'm posting my feedback on the questions for the third panel.
Panel 3: Policy Challenges & Infrastructure Requirements to Facilitate Patient/Consumers’ Meaningful Use of HIT
a. What is required for vendors to be able to export data from EHRs in such a way that consumers and patients can use the data meaningfully?
A constrained set of standards that allows for unambiguous exchange of personal health data. The current set of standards provides too much flexibility in exchange, and does not cover the various kinds of exchanges (e.g., discharge summary, history and physical, consult) needed beyond the exchange of simple summary records.
b. What is the role of providers in making data available to patients in a meaningful way?
Providers need to 1) make the data available to patients, 2) educate patients on what is contained in the records, 3) and provide resources that help patients make better use of this information.
c. What are the meaningful uses of that data once exported? What evidence of measurable benefits exists?
Tracking health conditions over time, understanding treatment options and costs, understanding treatment effects, allowing for better care coordination between providers at different locations. With regard to evidence, I expect that studies are needed.
d. What are the privacy and trust issues that might affect this from happening?
This is often offered as a challenge to sharing information, but I think the opposite is true. More information sharing and transparency will create greater trust between patients and providers.
Friday, April 16, 2010
Incorporating Patient-Generated Data in Meaningful Use of HIT
ONC is requesting feedback on Stages 2 and 3 for meaningful use. The HIT Policy Committee's Meaningful Use workgroup will be holding hearings on April 20th to discuss patient and family engagement.
Yesterday I posted my feedback on the questions to the first panel. Today I'm posting my feedback on the questions for the second panel.
Panel 2: Incorporating Patient-Generated Data in Meaningful Use of HIT
a. What is the role of patient-generated data in improving health of individuals? What is the evidence?
If you broaden the question, I think you will find the evidence you seek. How does self-monitoring, data gathering, trending, and review of available information on event W by individuals increase their ability to respond to event Z related to W? The key factor here is understanding what impact knowledge of the relationship between W and Z has on the individual monitoring the event.
b. How can patient-reported data be integrated into EHRs and the clinicians’ workflow to improve care management?
Read the use case for the Medication / Registration Summary and the HITSP response to it. That would be a huge first step. If a provider can give me the data, and I have the tools to manage and update it and give it back to them, they've just saved me a great deal of time in obtaining healthcare, saved themselves the effort of entering it accurately, and eliminated a key "new patient" frustration. If that can be done ahead of time through integration with HIT, so that I can send the data to my provider the same way that I routinely communicate with my bank (using an application of MY choosing), then you've provided me with tools that will allow me to make better use of both my and my provider's time.
c. How can future conceptions of personal health information platforms and information tools facilitate patient-centered care, including transparency, coordinated care, patient activation, while protecting patient privacy?
This question assumes that patient centeredness, transparency, coordinated care, patient activation and privacy are all key requirements of a future personal health information platform, and I almost want to ask what your evidence for that is.
Experience tells us that killer applications come from unexpected directions. One unexpected direction might be the move towards "less privacy". If you look at services like 23andMe, or healthcare-related social networking sites, you'll see that activated patients who have information and share it with others are getting around barriers to transparency, and becoming better coordinators of their own care. Often that activation occurs through the possession of an important piece of information (e.g., a life-threatening diagnosis). I think an important way forward is to enable more safe spaces where people can discuss healthcare issues publicly without stigma, so that "invasions of privacy" due to knowledge of a person's health conditions stop being threatening. One advance in this direction in the policy sphere was the Genetic Information Nondiscrimination Act.
d. What is the role of the patient in ensuring data in EHRs is accurate?
The patient's role (perhaps better said, responsibility) in verifying their clinical data is just like the consumer's role in ensuring that their credit score is accurate. Consumers who want good care will do what they can to make sure their providers have the right data, but to start with, they need to have the data their providers have, and appropriate resources to help them interpret it. That means that a) consumers need to become more educated about the topic, and b) providers and our social structures (e.g., education) need to educate consumers. If you want to make a lasting change in the behavior of consumers in general, you need to start young.
e. What are your recommendations for meaningful use criteria for 2013 and 2015 that are achievable by a broad spectrum of providers?
It is way too early to answer this question fully, since the broad spectrum of healthcare providers is still struggling to understand what meaningful use for 2011 means.
Some early thoughts: Reduce optionality presented in the 2011 criteria to establish a single set of standards. Look at health information exchange functions such as lab and diagnostic imaging result delivery, referrals, web-enabled appointment scheduling and personal health records. Align other incentives on the use of technology (outside of those incentives already specified within HITECH).
Thursday, April 15, 2010
Meaningful Use of HIT in the Real Lives of Patients and Families
ONC is requesting feedback on Stages 2 and 3 for meaningful use. The HIT Policy Committee's Meaningful Use workgroup will be holding hearings on April 20th to discuss patient and family engagement.
There are three panels. The full list of questions for each panel can be found at the link above. The questions addressed to the first panel and some of my own responses appear below.
Panel 1: Meaningful Use of HIT in the Real Lives of Patients & Families
a. What are consumers’ health information needs in the context of their real lives?
What is the impact of my diagnosis on my health? What are effective treatments for it? How much will they cost me, considering all the different treatment options? What changes can I make in my life that will improve the outcomes? Who is best at treating it in my area? What does this lab result mean? How does it compare to previous tests I've had? Are they getting better or worse? What can I do to make it "better"?
I or a family member needs to be able to access detailed records on a prior health event when emergency care is being received in the evening or on the weekend. The provider with that chart is 500 miles away and it is after hours, so nobody is available who can go get that chart and send it to the attending provider.
I'm on vacation out of town and ran out of (or forgot to bring) my medications; how can I easily get replacements without having to make three calls?
My daughter has a sore throat. Should I worry or not? Should we do more than treat the symptoms? This week no, last week yes, depending on what's going around, but I'm not in the loop for that information.
Given a standardized basket of healthcare goods that approximates my needs, which providers have better outcomes, and how much will my care cost? Same question for insurers.
b. How do results of ethnographic studies of individuals with chronic health conditions inform our understanding of how HIT can improve their use of health information and connectivity with their providers to improve their health?
Frankly, I don't know the answer to this one. This question is so specific that it begs an answer that may already be known or suspected. If I were a lawyer, I might object on the basis that counsel is leading the witness. I'd be interested in more information on what the real question is here.
c. What is the evidence base for patient benefit from their direct use of PHRs and other HIT that interacts with EHRs?
I think the real question should be "what does the evidence show about the effects of patient involvement and education with respect to the effectiveness of their healthcare." Technology is one way to enable involvement and education. Technology solutions are still evolving, and I don't think we have yet seen the "killer app".
d. What is the role of mobile applications in improving health of individuals? Is there a specific role for underserved populations?
Today, mobile applications are already used by patients to keep track of significant health events, including monitoring (glucose, blood pressure), diet, and exercise. In terms of specific roles for underserved populations, HIT is presently used in my state to support and provide health information to homeless persons. These roles are still evolving. The "killer apps" haven't been developed yet; the first step is to make the data available so that they can be created.
e. How can we use HIT to make information and knowledge actionable for patients?
By enabling patients to obtain the answers to the questions listed under item 1a above.
f. How does HIT enhance collaboration between patients and their providers and change how the patient’s health is managed?
Patients today can already remotely review their health information, including diagnoses, medications and lab results; schedule appointments; request prescription renewals; communicate directly with their healthcare providers; or obtain more information on specific diagnoses, medications, et cetera. They can also update their health history information, and complete questionnaires and assessment instruments online rather than having to have an in-person encounter. Providers can respond to these requests when they have time available without having to interrupt existing workflows, at dramatically reduced costs for the patient and provider.
Tuesday, April 13, 2010
Next steps for Patient Engagement
I've been reading a number of interesting articles recently about patient engagement. This one talks about how HIT can complicate healthcare conversations. A report from Brian Ahier talks about how patients are starting to use personal health records. Another post talks about how Health Data is Useful… if it Informs Conversations. Finally, a commenter on a blog (that I can no longer find) reported that his provider's office said: "We don't like our patients to use the Internet."
The theme apparent in all of these articles is that providers and healthcare IT will need to learn to work with a new breed of patient. We will all have some technological needs that are as yet unmet and even as yet undiscovered. But underlying all of that, the theme is really that change is coming. I've talked before about how generational change may be necessary when I discussed Healthcare Reform.
As consumers, we can spend hours researching the best TV brand or automobile to purchase. Why wouldn't we, and why shouldn't we, do the same for our own health? We need to teach people how to be better patients, and give them the tools that they need to do that. This is one of my favorite public service ads on that topic (from AHRQ):
I want to encourage patient engagement. So now I have to think about what tools patients like my mother, me, or my children need to have to be engaged with healthcare providers. It starts, I think, with providing the skills and data to make good decisions. We need to teach our youth the skills to be a good patient the same way that we teach them other survival skills for our society, such as how to use a checkbook, keyboard, or steering wheel. My kids could already do more with computers at the age of 3 than I could at the age of 13. Surely, if we succeed, they will be better patients than I am today.
As patients, I hope that they will be able to come to their doctor with their own copy of their electronic medical records and the set of questions that they have researched from the Internet, their friends, and their family. If their grandmother can do it, and I can do it, they can learn it. But right now, this is NOT a skill that is taught to them in school.
At the same time, we need to start exploring the technology they will ultimately use. The PHR is only the first step in meeting their technology needs. The final answer is clearly some cool technology yet to be invented. But to figure out what that technology is, we have to start getting the information out there, and connecting it to the patient and provider.
What does this mean for healthcare IT? After that first step, I don't know, but we all need to be thinking about it. I don't know where e-patients are on the hype curve, but I do know that my own personal chasm of disillusionment will be deep and bitter if we don't start looking at this trend and start trying to address it.
Monday, April 12, 2010
An Education Question
I've come to the conclusion that I need more education. I deal with a lot of professionals in this space, many of whom have credentials like M.D., M.Sc., RN, or Ph.D. after their name. I have none of those, and cannot even claim a B.S. ;-)
Additional credentials would help, but what's really more important to me is the shared understanding one has with others who have been similarly educated in a particular field.
My education in technology and software development, while not including all the traditional coursework, is not all that ad hoc, but my education in healthcare is quite a bit more chaotic. I often find myself at the second order of ignorance in the healthcare space, as I don't know what I don't know. It can sometimes make it difficult to understand others, and even more so to teach, because I don't know what students have already learned so that I can build from it.
One of my goals this year is to get enrolled in a Master's program in Health Informatics. I want to learn what I don't know that others in this space do. My challenge is that lack of a first credential. I've found it rather difficult to locate a program that will have me. I've presented at symposia and been a guest instructor for informatics classes at institutions that would not be able to enroll me in a masters program because of my lack of a bachelor's degree. That makes me both giggle and cringe.
An optimal outcome for me would be to find a Master's program that would accept me without a bachelor's degree. I have no interest in completing a bachelor's outside my chosen field, and not a lot of interest in spending the money to obtain that particular credential in it. So, what are the other options?
If you think you have an answer, feel free to leave a comment here or contact me directly; I personally know others in this field who are similarly situated and could also benefit.
Thursday, April 8, 2010
Validating CDA Documents
One of the benefits of using Schematron to validate CDA documents is that the assertion of conformance to a template in a CDA element can be used to trigger the testing rules in a schematron. A question that recently came up is: What schematrons do I need to use to validate a document of type X (where X is an HL7 Implementation Guide, IHE Profile, HITSP specification or other guide)?
The best answer that I can give is in fact: ALL of them. The reason for this is that most CDA implementation guides permit inclusion of sections or entries not otherwise prohibited. To verify an instance, you want to be able to check as much of what is in it as possible. If the instance uses a template that isn't required of it, you'd still want to be sure that it used that template correctly. So, when testing a CDA instance to see that it is valid, what you need is:
1. For each template that you recognize, verify that the instance conforms to the template.
2. For each template that you do not recognize, issue a warning that the test tool cannot verify that the instance conforms to the requirements of that template.
If you just want to verify that an instance is valid against a specific implementation guide, that's a slightly different story. At that point, you can just apply all the rules of that guide and any other rules that it also requires you to apply.
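To make the template-triggered approach concrete, here is a minimal Schematron sketch of the two numbered steps above. The section template OID and LOINC code shown are the ones commonly cited for the CCD Medications section, but treat the whole thing as an illustrative placeholder rather than any published rule set:
<sch:schema xmlns:sch='http://purl.oclc.org/dsdl/schematron' queryBinding='xslt'>
  <sch:ns prefix='cda' uri='urn:hl7-org:v3'/>
  <!-- Step 1: when a recognized templateId is asserted, test its requirements -->
  <sch:pattern id='known-templates'>
    <sch:rule context="cda:section[cda:templateId/@root='2.16.840.1.113883.10.20.1.8']">
      <sch:assert test="cda:code[@code='10160-0']">
        A Medications section SHALL carry the LOINC code 10160-0.
      </sch:assert>
    </sch:rule>
  </sch:pattern>
  <!-- Step 2: warn when a templateId is asserted that this tool has no rules for -->
  <sch:pattern id='unknown-templates'>
    <sch:rule context="cda:templateId[not(@root='2.16.840.1.113883.10.20.1.8')]">
      <sch:report test='true()' role='warning'>
        No rules are available for templateId <sch:value-of select='@root'/>;
        conformance to that template has not been verified.
      </sch:report>
    </sch:rule>
  </sch:pattern>
</sch:schema>
In practice the list of recognized template roots is much longer, and you would aggregate the published schematrons from each implementation guide rather than hand-writing rules like these.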
Two related issues also crop up with respect to CDA schema validation:
1. A CDA Document must conform to the CDA Schema
2. A CDA Document may contain extensions.
These two statements would appear to conflict with each other. However, CDA is clearly extensible according to the standard. Extensions are permitted, but must be declared in a namespace that is distinct from the urn:hl7-org:v3 namespace.
So the first statement really needs to say:
1. A CDA Document, minus any extension attributes or elements (and their content), must conform to the CDA Schema.
There are some attributes in a CDA document that are not defined in the CDA schema, but which are not "extension" attributes. The two most common non-extension attributes appearing in CDA instances are xsi:type and the namespace declarations (attributes matching the pattern: xmlns and xmlns:*).
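For concreteness, here is an abbreviated and purely illustrative fragment showing the difference; the sdtc:raceCode element and the codes in it are examples I've chosen, not requirements of any particular guide:
<ClinicalDocument xmlns='urn:hl7-org:v3'
                  xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
                  xmlns:sdtc='urn:hl7-org:sdtc'>
  <!-- xmlns and xmlns:* above are namespace declarations, not extensions -->
  ...
  <patient>
    <!-- an extension element: it lives in urn:hl7-org:sdtc, not urn:hl7-org:v3 -->
    <sdtc:raceCode code='2106-3' codeSystem='2.16.840.1.113883.6.238'/>
  </patient>
  ...
  <!-- somewhere inside an entry: xsi:type is not an extension either;
       it selects the data type flavor for the value -->
  <value xsi:type='PQ' value='120' unit='mm[Hg]'/>
  ...
</ClinicalDocument>
A transform that strips the extension content but keeps xsi:type and the namespace declarations is sketched below.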
A very simple XSLT transform will allow you to generate a CDA document minus the extension elements and attributes. The following template will copy all elements in a tree that are in the CDA namespace, and remove all that are not. You will need to add templates to match and copy attributes from the HL7 and other appropriate namespaces (an exercise I leave for the reader; one possible completion is sketched after the template below).
<xsl:template match='cda:*'>
  <xsl:copy>
    <!-- attributes are handled by templates you add (see the sketch below) -->
    <xsl:apply-templates select='@*'/>
    <!-- recurse only into child elements in the CDA namespace -->
    <xsl:apply-templates select='cda:*'/>
  </xsl:copy>
</xsl:template>
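If you would rather not do the exercise yourself, here is a sketch of one possible completed stylesheet. It assumes that attributes in no namespace and xsi:type should be kept and everything else dropped, and it adds text() to the recursion so that the narrative content survives the transform; neither choice comes from the original template above, so adjust to taste:
<?xml version='1.0' encoding='UTF-8'?>
<xsl:stylesheet version='1.0'
    xmlns:xsl='http://www.w3.org/1999/XSL/Transform'
    xmlns:cda='urn:hl7-org:v3'>

  <!-- copy elements in the CDA namespace; elements in any other namespace are dropped -->
  <xsl:template match='cda:*'>
    <xsl:copy>
      <xsl:apply-templates select='@*'/>
      <xsl:apply-templates select='cda:*|text()'/>
    </xsl:copy>
  </xsl:template>

  <!-- keep un-namespaced attributes (classCode, moodCode, code, ...) and xsi:type -->
  <xsl:template match="@*[namespace-uri()='' or
      namespace-uri()='http://www.w3.org/2001/XMLSchema-instance']">
    <xsl:copy/>
  </xsl:template>

  <!-- drop attributes from any other (extension) namespace -->
  <xsl:template match='@*'/>

</xsl:stylesheet>
Because xsl:copy also copies the namespace nodes of each element, the extension namespace declarations may still appear in the output, which is harmless for schema validation.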
Having produced an extension-free CDA instance, you can validate it against the CDA Schema.
Validating extension elements themselves is often done through Schematron, but can also be done by modifying the base CDA Schema to insert these elements at the appropriate locations. This has been done for the NIST Validator when testing against the HITSP C32 specification.
Finally, there are several notes appearing in the CDA model that are not verified by Schema or available schematrons. Some of these are checked by the Eclipse Instance Editor. Many of these could be checked by Schematron, a project for you to work on if you are interested.
Wednesday, April 7, 2010
Advance Directives
One question that often comes up is why the CCD, and other specifications based upon it, only specify what kind of advance directive exists and where to find it, and do not constrain the content further. But providers do want to know what's in these things so that they can act appropriately.
When this topic was discussed (in three different venues, I might add) three years ago, a problem was identified with classifying the details of an advance directive. The classification of advance directives depends upon the legal jurisdiction where it is interpreted. To say any more than that "an advance directive with respect to resuscitation status" exists could have patient safety and legal implications that could not be resolved using any existing controlled vocabulary, and still cannot be today.
Many advance directives are legal documents, and their content is often unconstrained. An advance directive can say just about anything you want it to in words, and even when documents have the same name (a classification such as a DNR Order), they may have different meanings depending upon the state in which you live or are receiving care.
As a consequence, the CCD and the specifications from IHE and HITSP which derive from it do not specify what appears within an advance directive. The advice given to the developers of CCD, based on policies and procedures in force at many institutions, was to alert the provider to the existence of these documents, and to provide references to where the content could be obtained, so that care complies with what the advance directive itself says, and not with a shorthand assessment and classification of its content.