How long should it take to deploy a new measure? In the ideal world, where everyone uses a common information model, and supports APIs, the answer should be: Almost no time at all, the time should be measured in units of hours or smaller.
GIVEN a measure has been defined
AND it is ready for deployment
WHEN that measure is available
THEN it is deployed to a system
AND is available to be reported on within an hour of use.
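To make "deployed to a system" concrete, here's a rough sketch of what that step could look like if the reporting system exposes a FHIR REST API. The endpoint and file name are placeholders I made up for illustration, not anything defined by SANER:

# Minimal sketch: "deploy" a measure by POSTing its definition to the reporting
# system's FHIR endpoint. Assumes the server is FHIR-capable; the base URL and
# file name are hypothetical.
import json
import requests

FHIR_BASE = "https://reporting.example.org/fhir"  # hypothetical endpoint

def deploy_measure(path: str) -> str:
    """POST a Measure definition and return the id the server assigned."""
    with open(path) as f:
        measure = json.load(f)
    resp = requests.post(
        f"{FHIR_BASE}/Measure",
        json=measure,
        headers={
            "Content-Type": "application/fhir+json",
            "Prefer": "return=representation",  # ask the server to echo the created resource
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]

if __name__ == "__main__":
    measure_id = deploy_measure("bed-availability-measure.json")  # hypothetical file
    print(f"Deployed Measure/{measure_id}")

That's the whole deployment step: no code changes, just a new definition landing on a server that already knows how to evaluate measures, which is why the clock here should read in hours, not weeks.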
The next issue has to do with validating that the measure works as expected. You don't just want to install new software and have it start reporting garbage data. Someone needs to verify that it works, and approve it for reporting. So now you have to schedule a person to deal with that, and they have to fit it into their schedule. This should take on the order of days or less.
GIVEN a measure has been deployed,
WHEN that measure has been validated for use,
THEN reporting on it begins.
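One simple way to picture that validation gate (my illustration, not a mandated mechanism) is to keep a newly deployed measure in draft status and have the reporting job only pick up measures a reviewer has promoted to active:

# Sketch of a human validation gate, assuming measures carry the standard FHIR
# Measure.status field (draft | active | retired). The endpoint is hypothetical.
import requests

FHIR_BASE = "https://reporting.example.org/fhir"  # hypothetical endpoint

def approve_measure(measure_id: str) -> None:
    """Mark a deployed measure as validated so the reporting job will run it."""
    url = f"{FHIR_BASE}/Measure/{measure_id}"
    measure = requests.get(url, timeout=30).json()
    measure["status"] = "active"  # the reviewer has signed off
    requests.put(
        url,
        json=measure,
        headers={"Content-Type": "application/fhir+json"},
        timeout=30,
    ).raise_for_status()

def measures_to_report() -> list:
    """Report only on measures that have been validated (status = active)."""
    bundle = requests.get(f"{FHIR_BASE}/Measure?status=active", timeout=30).json()
    return [entry["resource"] for entry in bundle.get("entry", [])]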
But wait, somebody has to define this measure and do so clearly. How long does that take? Realistically, if you actually KNOW ALL of the detail of what you are doing, AND the data is available, a competent analyst can probably work out an initial draft in a week or so.
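What does that initial draft look like in practice? In FHIR terms, a computable measure is a Measure resource whose population criteria are expressions a machine can evaluate. Here's a rough sketch; the canonical URL, codes, and FHIRPath expression are mine, for illustration only, not actual SANER content:

# Sketch of a computable measure definition, shown as a Python dict that serializes
# to a FHIR Measure resource. Everything here is an illustrative placeholder.
occupied_icu_beds_measure = {
    "resourceType": "Measure",
    "url": "https://example.org/Measure/occupied-icu-beds",  # hypothetical canonical URL
    "name": "OccupiedICUBeds",
    "status": "draft",  # stays draft until a human validates it
    "scoring": {"coding": [{"code": "continuous-variable"}]},
    "group": [{
        "code": {"text": "ICU bed occupancy"},
        "population": [{
            "code": {"coding": [{"code": "measure-population"}]},
            "criteria": {
                "language": "text/fhirpath",
                # Hypothetical criteria: count Location resources whose operational
                # status says they are occupied. A real measure would pin down the
                # bed-type and status codes precisely.
                "expression": "Location.where(operationalStatus.code = 'O')",
            },
        }],
    }],
}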
GIVEN that the information needed to be reported is available in the reporting system
WHEN the measure is defined computably
THEN it can be deployed.
But wait, you actually have to test this out. There's that validation step that involves a human, and that can produce errors in interpretation. Written (or spoken) language is NOT precise. It has ambiguity, which can result in different interpretations of the same definition. So you have to check for that, which means we need to change that last statement to:
THEN it can be deployed for testing.
Now you have to involve some test subjects (people and systems), and work that through. You might add some pre-check time to verify that the requirements as written match the automation as developed. And you have to add some time to deal with the issues that come back from this. With all the interactions involved, your week just became several weeks, perhaps even a month.
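That pre-check doesn't have to be elaborate. One way I'd sketch it: load a handful of synthetic records your reviewers have already counted by hand, run the measure, and compare the numbers. This assumes the test server supports the standard $evaluate-measure operation; the measure id, period, and expected counts are placeholders:

# Sketch of checking the automation against hand-worked expectations, assuming the
# test server supports Measure/$evaluate-measure. All identifiers are hypothetical.
import requests

FHIR_BASE = "https://test.example.org/fhir"  # hypothetical test server with synthetic data

def evaluate(measure_id: str, start: str, end: str) -> dict:
    """Run $evaluate-measure for a period and return the resulting MeasureReport."""
    resp = requests.get(
        f"{FHIR_BASE}/Measure/{measure_id}/$evaluate-measure",
        params={"periodStart": start, "periodEnd": end},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

def check_against_hand_count(measure_id: str, expected: dict) -> None:
    """Compare computed population counts with what the reviewers counted by hand."""
    report = evaluate(measure_id, "2020-07-01", "2020-07-01")
    for group in report.get("group", []):
        for pop in group.get("population", []):
            code = pop["code"]["coding"][0]["code"]
            if code in expected and pop.get("count") != expected[code]:
                raise AssertionError(
                    f"{code}: automation says {pop.get('count')}, reviewers counted {expected[code]}"
                )

check_against_hand_count("occupied-icu-beds", {"measure-population": 7})  # made-up expectation

Every mismatch is either a bug in the automation or an ambiguity in the written definition, and both send you back around the loop. That loop is where the extra weeks come from.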
So, how long should it take to report on a new measure, starting from scratch? Maybe a month. If you have to get agreement from a lot of people and organizations on the measure, you have to factor that process in as well, adding time for review and evaluation. Now you are talking about a quarter, or perhaps two, depending on the volume of input sources and the feedback that comes back from them.
So, the fact that it might take a month to create a new measure with enough detail to support computing is not a surprise, at least to me or anyone else who has done this before. It beats the hell out of throwing a spreadsheet over a wall and asking someone to populate it from some ideal view that they think should exist in the world.
The real issue, for a lot of this, is not "How long does it take to deploy a new measure?" but rather "How ready are we to deal with this kind of emergency?" The time to prepare for a disaster is before it happens. You may not know exactly what you will need, but you can make some pretty good guesses. In the SANER Project, we often wrote requirements well in advance of them being released as official measures. The project, which started on March 20th, had identified just about everything released to date (now July) by April 15th. We've stopped adding requirements for new measures because they've served their purpose in proving out that the system we've been building will support the measures we need, but here are a few additional measures we know are necessary (one of them is sketched out after the list):
- Ambulatory provider staffing and availability.
- Immunization supplies.
- Immunization reporting capacity (how many health systems can report immunizations to an Immunization Registry).
- Drug supplies for critical drugs.
- Other measures might include aspects of SDOH (social determinants of health), such as the number of individuals with food, housing, or income challenges due to COVID-19, by county or by smaller regional subdivisions such as census tract (basically, neighborhood).
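None of these are published measures yet, but nothing stops us from drafting them now. Here's a purely hypothetical sketch of how the immunization supplies measure might start out; none of the names or codes below are real SANER content:

# Purely hypothetical draft of a future measure (immunization supplies). Everything
# here is a placeholder; a real definition would nail down the vaccine codes,
# supply item codes, and reporting period.
immunization_supply_measure = {
    "resourceType": "Measure",
    "url": "https://example.org/Measure/immunization-supplies",  # placeholder
    "name": "ImmunizationSupplies",
    "status": "draft",
    "scoring": {"coding": [{"code": "continuous-variable"}]},
    "group": [
        {
            "code": {"text": "Vaccine doses on hand"},
            "population": [{
                "code": {"coding": [{"code": "measure-population"}]},
                "criteria": {
                    "language": "text/fhirpath",
                    # Hypothetical: count completed supply deliveries of vaccine.
                    "expression": "SupplyDelivery.where(status = 'completed')",
                },
            }],
        },
        {
            "code": {"text": "Doses administered"},
            "population": [{
                "code": {"coding": [{"code": "measure-population"}]},
                "criteria": {
                    "language": "text/fhirpath",
                    # Hypothetical: count completed immunizations in the period.
                    "expression": "Immunization.where(status = 'completed')",
                },
            }],
        },
    ],
}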
You can basically figure out what to measure by thinking through the disease and pandemic process. It's not like we haven't seen pandemics before (just not recently at THIS scale), or other emergencies for that matter.
The point is, complaining about how long it takes to put together a reasonable, accurate, and automatable measure SHOULD be done beforehand. And putting together a system to handle it should also have been done beforehand.
My wife and I have a short meme about SHOULD HAVE:
Should'a, would'a, could'a ... DIDN'T.
We didn't. So we have to do it now. And it will take what it takes. Bite the bullet, do it right, so it will be ready for the next time, or the next flu season. So, that's what I'm doing.