Thursday, April 21, 2011

Not Fully Baked

My daughters like Pineapple Upside-Down Cake, especially for breakfast.  The first time my wife made it, she improvised because she didn't have all the ingredients her cookbook called for to make it from scratch.  She used yellow-cake mix, fresh pineapple, brown sugar, and maraschino cherries in a square cake pan.  It took about two hours to finish, instead of the 45 minutes that simply mixing and baking the yellow cake would have taken.  We found out it wasn't done when we went to cut and serve it and discovered that the inside was quite runny.  Time was short, so we decided on something else for breakfast.  The cake still had promise, though, so we put it back in the oven instead of throwing it out.  We wound up having it with lunch.  It was yummy.

We aren't sure why it needed 1.5 hours in the oven, but there was a pretty good test for doneness: if a toothpick came out clean, it was done.  The next time my wife made it, we planned for 1.5 hours of baking time, and that's about what it took.  We used our experience from the first attempt to estimate how long it would take the next time.

We need new standards.  That is a key message behind the ONC Standards and Interoperability Framework initiatives.  We need mature standards.  That is a key message from Doug Fridsma to the HIT Standards FACA, found in John Halamka's blog post of yesterday.  I'm reminded of a cartoon that John Moehrke tweeted yesterday in which two parts of an organization are not in sync.

The Direct Project was planned to take six months and instead took nearly a year.  It's maturing quite rapidly compared to other efforts, but still, maturity takes time.  That's an obvious truism that seems to have been lost somewhat in the aggressive development schedule of the Standards and Interoperability Framework.  Project planning 101:  If it's something new that's never been done before, it needs even more time than you think.

The CDA Consolidation project got three months to complete development of model-driven tools and to use them to produce an implementation guide.  The project delivered the required document on time, but as I dig into the results, it's still quite runny inside.  If this were a green field, the document would be great.  But it isn't, and the document doesn't outline what is different from C32 Version 2.5 and C83 Version 2.0.  Until I analyze those changes, I won't know what must change in an implementation, or what can be reused.  That analysis of the new requirements is a struggle for me, and if I'm having problems with it, I'm certain that others are either equally struggling or just plain stymied.  If implementers treat this guide as a brand-new set of requirements, it's simply back to the drawing board and time to start over.  It's only when we can reuse what we did before that it becomes better.  The current specification also did not meet many of the requirements that the Documentation Workgroup outlined at the start.  Nor, in my opinion, will it meet the success metrics outlined on the project page as it stands today.  The model data isn't delivered, nor are the Schematrons, UML models, or tooling to support import, creation, and validation.
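Those missing artifacts matter because they are what make conformance machine-checkable.  As a rough sketch (not any official rule — the templateId OID below is made up purely for illustration), here is the kind of check that a delivered Schematron would encode, written in Python against the real CDA R2 namespace:

```python
# Illustrative only: verify that a CDA instance asserts conformance to a
# given templateId -- the sort of rule a published Schematron would automate.
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"  # the CDA R2 XML namespace

# A minimal, hypothetical CDA fragment; "1.2.3.4.5" is a placeholder OID.
SAMPLE = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <templateId root="1.2.3.4.5"/>
  <code code="34133-9" codeSystem="2.16.840.1.113883.6.1"/>
</ClinicalDocument>"""

def declares_template(xml_text: str, template_oid: str) -> bool:
    """Return True if the document declares the given templateId root."""
    root = ET.fromstring(xml_text)
    return any(
        t.get("root") == template_oid
        for t in root.iter(f"{{{CDA_NS}}}templateId")
    )

print(declares_template(SAMPLE, "1.2.3.4.5"))  # True
print(declares_template(SAMPLE, "9.9.9"))      # False
```

A real validator would of course check far more than template declarations, but even this trivial check shows why shipping the models and Schematrons alongside the prose guide is what lets implementers test reuse rather than guess at it.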

As critical as I am, the CDA Consolidation project still has great promise.  I think we are going to need to put it back into the oven before it will be ready to serve.  It may or may not be ready in time for stage 2, but if we wait, it should still be yummy.

5 comments:

  1. What we seem to be missing is a roadmap and timing of how these pieces fit together. The S&I Framework is interesting, but the CDA consolidation initiative is one piece in that picture. So when I look at the S&I Framework in relationship to the CDA consolidation initiative, what is the roadmap/timeline for the reference implementation, pilot demonstration projects, certification and testing?

    This is not an issue with just the CDA consolidation initiative, but the other efforts including the Direct Project.

  2. I updated links today to point to the new Wiki instead of the old one.

  3. > The model data isn't delivered, nor are schematrons, UML models, nor tooling to support import, creation and validation.

    MDHT did deliver a consolidated model (flattened from the CCD, IHE PCC, and HITSP stack in a semi-automated fashion).  The model is available via the MDHT project web site.  MDHT also delivered a Consolidated Implementation Guide generated from the UML.  Unfortunately, the HL7 balloted spec contained only one small portion of the MDHT-generated guide (the Problems Section), due to some perceived weaknesses in meeting publication formatting requirements.  In addition to the generated guide, MDHT has a Java API and a suite of Eclipse-based tools that can be used for validation and testing.  So from the MDHT project team's perspective, the original objectives of the initiative were met.  The question is how to get HL7 to move off the published-guide mentality and adopt the computable MDHT UML models and other generated artifacts as part of the normative spec.

  4. Your comments are on MDHT, mine are on the Consolidation Project. Unfortunately, the two are NOT equivalent.

  5. ONC has made an investment in MDHT solely for the purpose of delivering said artifacts to the various initiatives. Why the artifacts did not make it into the consolidation ballot package is a good question...
