February 20, 2009

This Pilot Didn’t Follow Standardized Work?

After “Sully” and his crew were lauded as heroes for following checklists, keeping calm, and landing that plane in the Hudson River, it’s sad and interesting that the pilot of the commuter plane that crashed in Buffalo is being accused of “pilot error.”
“The commuter plane that crashed near Buffalo was on autopilot until just before it went down in icy weather, indicating that the pilot may have ignored federal safety recommendations and violated the airline’s own policy for flying in such conditions, an investigator said Sunday. Federal guidelines and the airline’s own instructions suggest a pilot should not engage the autopilot when flying through ice. If the ice is severe, the company that operated Continental Flight 3407 requires pilots to shut off the autopilot.

“You may be able in a manual mode to sense something sooner than the autopilot can sense it,” said Steve Chealander of the National Transportation Safety Board, which also recommends that pilots disengage the autopilot in icy conditions.

Automatic safety devices returned the aircraft to manual control just before it fell from the sky, Chealander said.”

Is this a case of “should not” or “must not”? If the airline “requires” the autopilot to be shut off in “severe” ice, what is the definition of “severe,” I wonder?

If policies and guidelines were not being followed, this seems like less of an “accident.” If the airline required pilots to follow certain rules, what are they doing on an ongoing basis to make sure the rules are being followed? Are they doing anything to “audit” the process or do they just wait for something bad to happen?

This has to be more systemic than just one pilot making an unfortunate fatal error this one time. Some news reports have pointed to problems in the training processes… which is a management responsibility.

There is some speculation that the pilot reacted incorrectly when an automated “stick pusher” system kicked in (ironically, this automation is supposed to HELP, but reports say that pilots often react to it the wrong way… again, this seems systemic and not just this one time).

This article (and the full WSJ version) says, in part:

The safety board, among other issues, is looking into why Colgan’s training programs apparently stop short of allowing pilots in simulators to feel the stick-pusher activate, according to people familiar with the issue. The device is intended to automatically prevent the plane from going into a stall by pointing the nose down to regain speed. Safety experts worry that unless pilots understand and feel what happens when the stick-pusher goes into action in a simulator, they may not react properly when it activates during an in-flight emergency.

In a statement, Colgan said its training programs “meet or exceed the regulatory requirements for all major airlines,” adding that in the wake of the Buffalo crash, it has “specifically re-examined our procedures for this aircraft.”

I don’t think it’s right to simply blame this individual pilot. In this case, as with other situations (including medical mistakes), it’s important to look at the system and ask:
  1. Do we really understand why this problem happened?
  2. What can we do to prevent it from happening again?
  3. What could have prevented this the last time?

The RSS feed content you are reading is copyrighted by the author, Mark Graban.




Now Available – The updated, expanded, and revised 3rd Edition of Mark Graban’s Shingo Research Award-Winning Book Lean Hospitals: Improving Quality, Patient Safety, and Employee Engagement. You can buy the book today, including signed copies from the author.


Mark Graban's passion is creating a better, safer, more cost effective healthcare system for patients and better workplaces for all. Mark is a consultant, author, and speaker in the "Lean healthcare" methodology. He is author of the Shingo Award-winning books Lean Hospitals and Healthcare Kaizen, as well as The Executive Guide to Healthcare Kaizen. His most recent project is an eBook titled Practicing Lean that benefits the Louise H. Batz Patient Safety Foundation, where Mark is a board member. Mark is also the VP of Improvement & Innovation Services for the technology company KaiNexus.

Posted in: Uncategorized

6 Comments on "This Pilot Didn’t Follow Standardized Work?"


  1. Lee Stacey says:

    Very good post, Mark. The trouble is that the pilot will probably be blamed for this.

    Is it his fault? No.

    “Should not” roughly translates to: We don’t think it’s a good idea but it’s up to you.

    Maybe this isn’t actually an error? By leaving things open like this, blame can easily be placed on the pilot, which protects the company’s bottom line.

  2. Quarterman Lee says:

    There is another aspect to this problem. When flying through clouds, the pilot has no outside visual reference and must depend upon instruments for orientation. On autopilot, it is unlikely that the pilot would be attending to the airspeed indicator, artificial horizon, altimeter, and vertical speed indicator. When the autopilot shuts off, especially if the stick has just moved forward, it would take several seconds for the pilot to check the instruments, get reoriented, and move the controls in the proper directions. The likelihood of panic is very high.

    This may also be a case of over-automation. There is no way to program automation for every conceivable situation. This is why pilots exist. It may be that the stick shaker and stick pusher created more problems than they solved.

    The automation can induce complacency as well and that probably contributed. As a former pilot of light aircraft, I find it hard to imagine putting my life in the hands of an autopilot under such conditions. However, I can understand how continual flying with more sophisticated autopilot would induce a complacency that one would not have with more primitive systems.

    A lot of this applies to factories as well. More factories are ruined by over-automation than by insufficient automation.

  3. David says:

    Federal Aviation Regulations:

    Sec. 91.3 – Responsibility and authority of the pilot in command.

    (a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.

    (b) In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency.

    (c) Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.

    Paragraph (b) recognizes that it is not possible to write a set of rules to cover every imaginable situation. Checklists and procedures are very important, but so is the use of human judgment, which must sometimes override them.

  4. curiouscat says:

    As soon as the “cause” is said to be personal error due to not following guidelines, the very next question I want answered is: how often is that guideline followed? And how often are other similar guidelines followed? Are they just dead guidelines, sitting on paper to assign blame once something goes wrong, or are they part of an active strategy to manage how work gets done?

  5. Sandeep Chatterjee says:

    In one of my earlier blogs, I mentioned the pitfalls of using “pull production” for a remanufactured product with a mix of new and salvaged components. Even with new-product manufacturing and pull production, you may not have optimized your global supply chain.

    http://www.infosysblogs.com/oracle/2009/02/pull_production_have_you_achie.html

  6. LeanJeff says:

    Mark,

    Another great post. As a former private pilot myself, I think back to the NOTAMS (notices to airmen) that highlight air accidents. The vast majority of accident investigations cite “pilot error” as the primary cause. I have many thoughts on this, but, relative to your post, much of flying is about judgment. In some cases, too much latitude is afforded the pilot because the standard is intentionally ambiguous to cover a range of conditions.

    This post struck a chord. It makes me want to go through every ISO document in my clients’ factories and highlight every instance of “should” as a judgment land mine that demands great scrutiny, since it signals poorly defined standard work.
