When I worked on it, it was already old code, somewhat hard to read, and rather crufty, but it worked. Of course, I hated it. Nobody should ever have to deal with legacy this old. A couple of years later I had to update it again to support a new search enhancement product. I had more leeway then, so I rewrote that code and incorporated the memory-mapped IO technology that was by then included in the Windows 32 APIs. Overall, I more than doubled the speed of that module, and it worked better, was easier to read, and all that. I no longer hated it, because I had rewritten it.
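For readers unfamiliar with the technique: memory-mapped IO lets the operating system page a file directly into your address space, so you scan it without explicit read() calls and buffer copies. The original work used the Win32 file-mapping APIs; here is a minimal, hypothetical sketch of the same idea using Python's portable `mmap` module (the dictionary-scanning task is my invented example, not the actual spelling-correction code):

```python
import mmap
import os
import tempfile

def count_lines_mmap(path):
    """Count newline-delimited entries by scanning a memory-mapped view.

    The OS pages the file in on demand; there are no read() buffer copies.
    (On Windows, the equivalent underlying calls are CreateFileMapping
    and MapViewOfFile.)
    """
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            return mm[:].count(b"\n")

# Tiny demonstration with a throwaway two-entry "dictionary" file.
with tempfile.NamedTemporaryFile("wb", delete=False) as tmp:
    tmp.write(b"aardvark\nzebra\n")
n = count_lines_mmap(tmp.name)
os.unlink(tmp.name)
print(n)  # prints 2
```

For large read-mostly files like a spelling dictionary, the win comes from avoiding copies and letting the page cache do the work, which is consistent with the doubling described above.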
I later profiled the whole spelling correction system, only to find out that my work had improved the total speed of the system by only twenty percent. With Moore's law still in effect, my improvement was outpaced by the hardware in a little more than three months, which is about how long the rewrite took from start to finish.
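The gap between the module-level and system-level numbers is Amdahl's law at work: doubling one part only helps in proportion to that part's share of total runtime. A quick sketch (the one-third figure is my back-calculation from the numbers in the story, not something stated in the post):

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: overall gain from speeding up one part of a system.

    fraction      -- share of total runtime spent in the improved part
    local_speedup -- how much faster that part became
    """
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Doubling a module that accounts for about a third of total runtime
# yields only about a 20% overall improvement.
print(round(overall_speedup(1 / 3, 2.0), 2))  # prints 1.2
```

This is why profiling the whole system first matters: the module's share of runtime caps the payoff of any rewrite before you write a line of code.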
A friend tells a similar, though probably apocryphal, story about optimizing code in a Fortran compiler. Ten years later a bug was reported and traced back to his optimization, which, it was later discovered, could never have worked. In ten years of deployment, the code he optimized had been used by only one customer.
The moral of this story is to put your efforts where they will have value, and understand what that value is. The value of reworking something that already works comes from several places:
- More maintainable. This is a big plus, especially for "legacy code" that needs a lot of maintenance.
- More extensible. This can be valuable, but only if you have some idea where those extensions are going to come into play.
- Faster, better use of resources. Look out for Moore's law: runtime or resource reductions of an order of magnitude or more are needed to be really useful.
So, how does this relate to standards? It takes quite a few years for a standard to go from idea to working product, and a few years more before it is "widely available". One author I happen to respect estimates about 3-5 years in this paper on the Standards Value Chain. I've worked with some pretty cool standards, and with the people who helped to write them, including HTML, XML, XSLT, XPath and XML Schema. My own experiences with these standards, which now "run the net", reflect that judgement. I also see it holding true with healthcare standards, in more or less the same time span, depending upon the novelty of the technology and the entrenchment of that which it attempts to replace. (One of the things holding back HL7 Version 3 is that it attempts to replace working "legacy" technology without enough return on investment.)
Now, on to health information exchange, the standards and implementation guides for it, and the purpose of this rather long post. I find myself dreadfully annoyed at a poster who comments on the "legacy" of implementation guides like XDS, XDR and XDM. These guides are about where I'd expect them to be in an adoption trend that needs about 3-5 years before you see them available in products.
Usually, I don't respond to "attacks" like this one, but because I'm really annoyed at people who don't read specifications they are responsible for today, I feel like cutting loose. I'll probably be sorry later. Here's the text I found annoying:
The NHIN-Direct project has given us the opportunity to step back from legacy technologies and consider a greenfield solution to allow physicians to actually talk to each other for the benefit of their patients. It has also proven that the HIT community is shackled to its bloated legacy constructs and has become incapable of admitting its missteps or daring to think outside the box. We wouldn't want to lower market entry barriers and put pressure on the incumbent vendors to actually deliver value in a truly competitive market, would we? I'm just thankful that we have seen the user cost of HIT solutions significantly decrease over the past 5 years. (Oops)
First of all, I'm involved and participating in NHIN Direct, and contributing code and solutions, so please don't consider me to be "shackled" to anything. Yes, I'm building off of earlier work. I find that to be helpful, not harmful. Reuse, according to one source, is five times more effective than rework.
XDS, XDR and XDM were "invented" for the purpose of allowing physicians to actually talk to each other for the benefit of their patients. XDS resulted purely by accident, but the idea was right, and it was purely "thinking out of the box". That was six years ago, and back then it was a greenfield solution. It took two years to get that solution completed to the point where it was ready for implementation, and another two years to see it become readily available in products. These days it is available not only from vendors in the HIT community and in open source, but also from major IT vendors. That's about on, or even slightly ahead of, schedule. Over that time, IHE went back and corrected some "mistakes", producing XDS.b (and retaining some others), and all that is available too.
Lower market entry barriers? How about 10 open source projects supporting these standards, ready for implementation in healthcare environments, with real world implementations using them? Free code is a pretty low entry point. Code that's been tested by somebody else is also a big plus. I have to compete against open source, as well as every other major IT vendor out there who also supports that set of standards.
Now, what will NHIN Direct deliver? Specifications and working code that supports features already available through other solutions. Translating this into my own words, I believe these will be prototypes that will take some time to develop into hardened products, available on the market about 1-2 years after completion of this first phase, if there is demand for them. I expect there will be, and I also expect NHIN Direct to break into the adoption curve early. What is the value here? The value is connecting the more than 400,000 physicians out there in small practices, which is why I'm participating in these activities. If I can make XDS, XDR or XDM simple enough for use in NHIN Direct, I will, and I would be stupid not to try.
Now, back to my original point. What is the value of starting completely over? I'll admit that something is to be gained, and I likely know where many of the problems are in the existing specifications. With relatively few exceptions, there's very little technology we use today that wouldn't benefit from reinvention (including the automobile, my biggest pet peeve). But what is the cost? What will happen to the dozens of implementations of that technology within a few hours of where I live (including the hospital I would use should I need to)?
Richard Soley, CEO of the Object Management Group, talks about the N+1 standards problem. According to Richard, a new standard rarely, if ever, reduces the number of standards used to accomplish a task. Instead, it creates yet another choice that needs to be implemented. From my own perspective, I have to acknowledge the truth of that. XML hasn't completely replaced SGML yet, even though it's clearly better (and in fact, you are using SGML technology right now, I'll bet).
So, back to the value statement. What else could I be doing more productively with my time? Personally, I think there's a lot to be said for figuring out how to deal with the "Clinical Decision Support" integration problem, and not a lot of traction in this area. I've been working on that for the past couple of years. Of course, by the time that problem gets solved, I'll have to deal with someone else telling me that's a legacy solution and that they have a better one.
Please, beat me to it. Then I can go back to playing leapfrog.