Tales of failure often involve a sequence of multiple errors, each of which by itself is not fatal (or in this case, vaguely comedic), but which together produce a totality that is hard to fathom. I was reminded of this today when I went to register for the last class I need to complete my degree program, only to find out that it is offered on campus only.
- I've been checking for the last three terms to make sure I knew what I needed, AND
- Knowing also that scheduling can sometimes change, I did my best to get as much out of the way as I could up front, AND
- Seeing that I had one dependency I couldn't meet last term, I checked with the instructor to be sure that the class would be offered this term, AND
- Received confirmation ...
- I failed to check that it was going to be available online, AND
- The syllabus I read was out of date, AND
- I failed to notice that the syllabus was out of date, AND
- I missed the wee bit in the course catalog noting it was offered online only in odd years.
I have two hopes left ... that I'm wrong, or that I can do something else. And no matter which happens, no babies die if I don't graduate the same year as my eldest, so I'll live either way, and still graduate, just a little bit later than I wanted to.
Gah. Pilot error. Or in other words, the person most responsible for making sure things lined up the way they were supposed to failed to do so, in part by failing to note other errors or discrepancies in the available data. In this case, I'm the pilot, and guess what, I'm human too.
In healthcare, we often rely on the physician to be the pilot ... but she or he is only human as well, and on a bad day is likely only to perform as well as the systems and people that surround him or her. Design for humanity ... design for error, and the world will be a better place.