A subcommittee of Structured Documents was formed to propose how this would be implemented, and we had our first meeting yesterday. I put together a list of questions based on our discussions, which I shared with the list. Other members of the committee chimed in with additions (thanks to Robert, Thom, Lisa and Brian), and Brian Zvi Weiss put together a consolidated list of all of the input, which you see below. He proposed several additions to the list, and some rewording to turn the questions into declarative statements about what we are doing.
Over the next few weeks, I see this project making significant progress. With regard to tools, HL7 already has GForge, which could support issue input, tracking and publication. There are other tools out there, but unless and until we obtain a budget for anything, we'll have to use what is freely available.
Here's the current state of things. I welcome your input here, or on the HL7 Structured Documents mailing list.
Mission Statement
Establish a process for managing and responding to implementer questions about the C-CDA standard.
- Rapid start-up, and results, validated by implementer feedback
- Increase consistency of C-CDA implementations
- Reduce need for pre-negotiation in the processing of discrete data
- Provide input and experience to relevant HL7 groups for possible application outside of SDWG and C-CDA
- Establish foundations for “examples library” for C-CDA
- C-CDA focuses on template definition, not implementation guidance
- There isn’t a proven operational model for supporting a standard for rapid, large-scale adoption
- Prioritize support for MU stage 2 data elements
- Develop common terminology for triaging issues to determine which process each goes through
- Create and manage a work list of issues, ensuring both immediate/tactical/interim resolution and longer-term follow-through: monitoring and validating each interim solution, and/or pursuing alternatives through the standards evolution process
- Work two tracks in parallel:
- Formulation of the proposed process
- Experimentation on execution of elements of the proposed process with current active questions from listserv and C-CDA DSTU 1.1 errata items marked “in process/review”
- Success Criteria
- To be determined
- Question asked/posted/surfaced as per agreed mechanisms
- Manage the set of issues
- Ensure question meets guidelines in terms of scope and quality (specific, clear, relevant, proposed sample XML, etc.)
- Classify each issue with respect to the kind of guidance necessary [Brian: proposed addition] – classification impacts handling and the SLA (committed time for each phase of handling)
- Assign the issue to relevant parties for resolution
- Develop resolution
- Seek required levels of approval as per the nature of the question and the proposed resolution (and their classification)
- Report to SDWG
- SDWG is the ultimate “approving authority,” either explicitly or via delegation to the support sub-group through agreed mechanisms for each established classification level of question/proposed response (bulk reviews, individual topic discussions, post-response review, etc.)
- Rejected resolutions go back to earlier states, as appropriate
- Finalize and publish those types of guidance requiring publication (more generally, work each question-guidance situation to the appropriate closure state)
- Special focus on building out the examples database, populated directly and indirectly in the context of resolutions/guidance/responses, or proactively
- Maintain previously generated guidance artifacts
- Provide input to resources responsible for current standards maintenance activities (errata processing/publishing, re-balloted versions, consideration requirements for new versions of standards, etc.)
Key Questions to Address
- In general: details of process outlined above with associated tables/flow-charts and clarity of each step (inputs, owners, outputs, SLA, etc.)
- What point of entry should questions/interpretations be channeled through?
- Single point of entry managed by HL7 and routed to appropriate WG/responsible person or entity for resolution (includes ANY HL7 Standard)?
- Single point of entry managed by SDWG (limited only to questions/interpretations related to C-CDA)?
- Roles/Responsibilities. Defining the roles in the process – for example:
- Managing inputs – who collects questions from the various sources and works with those asking them to ensure the quality and scope of each question
- Managing the flow – who manages the list and the items in it as they move from step to step in the process
- Classification – who determines the classification of the issues which in turn determines the handling flow, the required approval level, etc.
- Prioritization and resource assignment – who manages the resource pool of answerers and prioritizes the issues as per the required resources and their bandwidth
- Developing resolution – who is authorized to formulate the answers (for each classification-based path)
- In the context of the roles/responsibilities above, what are the respective roles of “ad-hoc volunteers” who are HL7 members, “ad-hoc volunteers” who are not HL7 members, SDWG, the support sub-group of SDWG, other HL7 groups, etc.?
- What should the turnaround be on each type of issue?
- What types of requests are there?
- A question that can be answered directly using material from the C-CDA DSTU or previously developed official guidance artifacts (generated through this process).
- A question that requires the material from the C-CDA DSTU and prior guidance artifacts to be further interpreted.
- A question that points to errata in prior guidance artifacts
- A question that points to errata in the C-CDA DSTU
- A question that identifies an issue that requires prior guidance artifacts to be reconsidered.
- A question that identifies an issue that requires the C-CDA DSTU to be reconsidered.
- In the context of the “types of guidance requests” above, we need clear guidelines on what is in scope and what is out of scope for this process, as well as guidance on quality requirements for questions.
- What is the relationship between Questions and Issues?
- Might several different questions relate to or be consolidated into a single issue before determining the type of guidance required?
- What kinds of guidance are there?
- Errata (a typo or mistake in the drafting of the standard or a prior guidance artifact)
- Clarifications (can be explained using just the standard or prior guidance artifacts)
- Interim Guidance (requires some interpretation of the standard)
- Change Proposals (affects either the standard or the prior guidance artifacts)
- New Feature Proposals (affects the standard)
- Implementation Assistance (What am I doing wrong?)
- External issues (The NIST Validator …)
- How is each type of guidance developed/handled?
- What makes guidance authoritative?
- A resource chartered to supply pointers to the existing standard or existing guidance artifacts delivers information that addresses the question
- Subcommittee Approval
- Committee Vote
- TSC Approval
- What tool is used to manage the items and monitor their progress through the workflow?
- What tool is used to manage the examples database?
- How is guidance distributed: mailing list, newsletter, wiki, other infrastructure (e.g., GForge), etc.?
- How are guidance artifacts managed (stored, accessed, maintained, tracked for usage)?
- Resourcing. What level of resourcing, what source of funding, mix of paid and volunteer work, etc.
- Usage rights/cost. Who gets to use this support mechanism (everyone? HL7 members only?), what does it cost, and is there a single answer to those questions or a tiered answer (e.g., some level of usage free to everyone, some for HL7 members only, etc.)?