I'm going to focus on the test procedures that address standards-based interoperability, rather than those covering specific EHR functionality.
The first test I'm going to look at is maintenance of the problem list, and that's because it has a vocabulary standard associated with it:
§170.314(a)(5) Problem list
The relevant content from the test appears below:
IN170.314.a.5 – 4.02: Tester shall verify that the patient problem list data entered during the test are associated with the required standard terminology. Verification methods include, but are not limited to:
- verifying that the appropriate terminology code is displayed along with the patient problem description when the user is recording patient problems; or
- verifying that the EHR includes the capability to cross-reference (map) the user-displayed problem descriptions to the appropriate vocabulary codes; or
- verifying that the patient problem list data stored in the EHR contains the appropriate vocabulary codes
It's going to be critical for folks to be able to show that they either record the SNOMED CT code, or can map to it. I'd like to see an additional verification method included that says something like:
- verifying that the problem list in a CCDA document produced by the EHR contains the appropriate problem descriptions and codes.
This is something that could be automated with a conformance test tool. You'll note that this procedure has no tools listed.
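To make that concrete, here's a minimal sketch (not a real conformance tool) of the kind of automated check I mean, run against a CCDA file produced by the EHR under test. It only looks at coded observation values and the SNOMED CT code system OID; a real tool would bind the HL7 namespace, restrict itself to the Problem Section templates, and validate the codes against the value set.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class ProblemCodeCheck {
    // SNOMED CT code system OID
    static final String SNOMED_CT = "2.16.840.1.113883.6.96";

    public static void main(String[] args) throws Exception {
        // args[0]: path to a CCDA XML document produced by the EHR under test
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(args[0]);

        // Grab every coded observation value; namespaces and templateIds are
        // deliberately ignored to keep the sketch short.
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList values = (NodeList) xpath.evaluate(
                "//*[local-name()='observation']/*[local-name()='value'][@code]",
                doc, XPathConstants.NODESET);

        for (int i = 0; i < values.getLength(); i++) {
            Element value = (Element) values.item(i);
            String system = value.getAttribute("codeSystem");
            System.out.println((SNOMED_CT.equals(system) ? "OK   " : "CHECK")
                    + " " + value.getAttribute("code")
                    + " " + value.getAttribute("displayName")
                    + " (" + system + ")");
        }
    }
}
```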
The next two procedures (Medication List and Medication Allergy List) are quite similar to the problem list test, save that they do not include vocabulary tests (because the rule doesn't require direct support of the RxNorm vocabulary terms in the EHR, even though that would likely be a good idea).
I'm also skipping the Drug Formulary test, because you are no longer required to use a specific standard in a Certified EHR. Note, however, that providers conforming to Medicare Part D will still need that capability; it just won't be tested under the 2014 criteria.
Next up is the test procedure for smoking status:
§170.314(a)(11) Smoking status
IN170.314.a.11 – 1.02: Tester shall verify that the patient smoking status data and the SNOMED CT® codes are captured and stored in the patient’s record ...
This one is interesting, because they haven't taken the same approach here as they did for the problem list. In this case, they ask for the specific SNOMED CT code to be recorded, and don't mention verification via display, mapping, or storage. I'd make the two consistent, following the problem list approach (which could include a CCDA verification test procedure).
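If they do want to keep a storage-based verification, the check itself is trivial. Here's a sketch; the three codes below are only a sample of the SNOMED CT smoking status value set, quoted from memory, so verify them against the published value set before relying on them.

```java
import java.util.Map;

public class SmokingStatusCheck {
    // A sample of the SNOMED CT smoking status codes -- NOT the complete value set.
    static final Map<String, String> SAMPLE_CODES = Map.of(
            "449868002", "Current every day smoker",
            "8517006",   "Former smoker",
            "266919005", "Never smoker");

    static boolean inValueSet(String storedCode) {
        return SAMPLE_CODES.containsKey(storedCode);
    }

    public static void main(String[] args) {
        // Example: the code captured and stored in the patient record under test
        String stored = "8517006";
        System.out.println(stored + " -> "
                + (inValueSet(stored) ? SAMPLE_CODES.get(stored) : "not in value set"));
    }
}
```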
Next we get to Patient Education Resources
This one is wrong on several levels.
- Two InfoButton guides are named, each using a different style of messaging, but there is no verification procedure showing that the messages are actually being sent.
- There are no test tools to verify the message content. Come on, we've got an XML Schema for one guide and a RESTful interface for the other. There ought to be simple one-page JSP files that validate the message structure and coded content of the outbound query (I sketch the idea at the end of this post).
- It allows "manual re-entry" of EHR data. See below for details:
Here's the relevant portion of the rule [emphasis mine]:
EHR technology must be able to electronically identify for a user patient-specific education resources based on data included in the patient's problem list, medication list, and laboratory tests and values/results
Here's the relevant portion of the test [emphasis also mine]:
The Tester electronically identifies patient-specific education resources using the HL7 Context-Aware Knowledge Retrieval Standard (with the applicable implementation guide) and at least one other vendor identified EHR function for two or more patients based on, at a minimum, the data included in the patients’ problem lists, medication lists, and laboratory tests and values/results (these queries do not need to be derived automatically from the patient data stored in the EHR; these queries may be accomplished by the Tester entering the patient data stored in the EHR into an education resources module that is accessed by the Tester through the EHR).
Say what?! Queries do not need to be derived "automatically" from patient data, but may be accomplished by (re)entering the data in another module?
Something here isn't right. I don't think that's what the rule intended when it referenced "data included in the patient's ...", because if the patient's chart wasn't meant to be the source of the data, they could have just said: education resources based on problems, medications, laboratory tests and/or results. But they didn't say that. They said "the patient's ..."
No provider should EVER have to re-enter data in this fashion. At the very least, it's a huge usability issue, and it could even be a patient safety issue (hypotension vs. hypertension is an easy spelling error). At the very most, providers might be asked to identify the relevant data in the chart, which could then be used in the InfoButton query.
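Here's a sketch of what I'm arguing for: if the problem list entry is already coded (which the problem list criterion above takes care of), the EHR can assemble the URL-based InfoButton query straight from the chart, with no re-keying. The mainSearchCriteria parameter names come from the HL7 URL-based implementation guide; the base URL is a placeholder, and the guide's other context parameters (task, patient demographics, and so on) are omitted here.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class EducationLink {
    // SNOMED CT code system OID, as carried on the problem list entry
    static final String SNOMED_CT = "2.16.840.1.113883.6.96";

    // Builds a URL-based InfoButton query from a coded problem list entry.
    static String forProblem(String baseUrl, String code, String displayName) {
        return baseUrl
                + "?mainSearchCriteria.v.c=" + URLEncoder.encode(code, StandardCharsets.UTF_8)
                + "&mainSearchCriteria.v.cs=" + SNOMED_CT
                + "&mainSearchCriteria.v.dn=" + URLEncoder.encode(displayName, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Placeholder knowledge resource URL; the code is SNOMED CT "Essential hypertension"
        System.out.println(forProblem("https://knowledge.example.org/infobutton",
                "59621000", "Essential hypertension"));
    }
}
```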
Another sad point is that neither the guides mentioned in the rule nor the rule itself specifies any requirement for coded content in the InfoButton messages. That's a loss that should be addressed in the 2016 criteria. Of course, by then, my hope is that everyone will be using the IHE Retrieve Clinical Knowledge profile of InfoButton, and they can name that guide the next time around.
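Until then, here's roughly the one-page check I had in mind above, written as plain Java rather than a JSP. It decodes the outbound query string and verifies that the search criterion is actually coded (code plus code system), not just free text. The parameter names follow the URL-based guide; the "required" list is my own illustration, not anything the rule mandates.

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class QueryContentCheck {
    public static void main(String[] args) {
        // The query string the EHR under test actually sent (sample value shown here)
        String query = args.length > 0 ? args[0]
                : "mainSearchCriteria.v.c=59621000"
                + "&mainSearchCriteria.v.cs=2.16.840.1.113883.6.96"
                + "&mainSearchCriteria.v.dn=Essential+hypertension";

        // Parse the name=value pairs
        Map<String, String> params = new HashMap<>();
        for (String pair : query.split("&")) {
            String[] kv = pair.split("=", 2);
            params.put(kv[0], kv.length > 1
                    ? URLDecoder.decode(kv[1], StandardCharsets.UTF_8) : "");
        }

        // Coded-content check: code, code system, and display name must all be present
        for (String name : new String[] { "mainSearchCriteria.v.c",
                "mainSearchCriteria.v.cs", "mainSearchCriteria.v.dn" }) {
            String value = params.get(name);
            System.out.println((value == null || value.isEmpty() ? "MISSING " : "OK      ")
                    + name + (value == null ? "" : " = " + value));
        }
    }
}
```

A real check would also cover the guide's other context parameters and, for the other guide, validate the message against its XML Schema, but even this much would catch an uncoded, free-text-only query.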