This Pilot Didn’t Follow Standardized Work?


    After “Sully” and his crew were lauded as heroes for following checklists, keeping calm, and landing that plane in the Hudson River, it's sad and interesting that the pilot of the commuter plane that crashed in Buffalo is being accused of “pilot error.”

    “The commuter plane that crashed near Buffalo was on autopilot until just before it went down in icy weather, indicating that the pilot may have ignored federal safety recommendations and violated the airline's own policy for flying in such conditions, an investigator said Sunday. Federal guidelines and the airline's own instructions suggest a pilot should not engage the autopilot when flying through ice. If the ice is severe, the company that operated Continental Flight 3407 requires pilots to shut off the autopilot.

    “You may be able in a manual mode to sense something sooner than the autopilot can sense it,” said Steve Chealander of the National Transportation Safety Board, which also recommends that pilots disengage the autopilot in icy conditions.

    Automatic safety devices returned the aircraft to manual control just before it fell from the sky, Chealander said.”

    Is this a case of “should not” or “must not”? If the airline “requires” the autopilot to be shut off in “severe” ice, what is the definition of “severe,” I wonder?

    If policies and guidelines were not being followed, this seems like less of an “accident.” If the airline required pilots to follow certain rules, what are they doing on an ongoing basis to make sure the rules are being followed? Are they doing anything to “audit” the process or do they just wait for something bad to happen?

    This has to be more systemic than just one pilot making an unfortunate fatal error this one time. Some news reports have pointed to problems in the training processes… which is a management responsibility.

    There is some speculation that the pilot reacted incorrectly when an automated “stick pusher” system kicked in (ironically, this automation is supposed to HELP, but reports say that pilots often react to it the wrong way… again, this seems systemic and not just this one time).

    This article (and the full WSJ version) says, in part:

    The safety board, among other issues, is looking into why Colgan's training programs apparently stop short of allowing pilots in simulators to feel the stick-pusher activate, according to people familiar with the issue. The device is intended to automatically prevent the plane from going into a stall by pointing the nose down to regain speed. Safety experts worry that unless pilots understand and feel what happens when the stick-pusher goes into action in a simulator, they may not react properly when it activates during an in-flight emergency.

    In a statement, Colgan said its training programs “meet or exceed the regulatory requirements for all major airlines,” adding that in the wake of the Buffalo crash, it has “specifically re-examined our procedures for this aircraft.”

    I don't think it's right to just blame the individual pilot in this case. Here, as in other situations (including medical mistakes), it's important to look at the system and ask:
    1. Do we really understand why this problem happened?
    2. What can we do to prevent it from happening again?
    3. What could have prevented it this time?


    Mark Graban
    Mark Graban is an internationally-recognized consultant, author, and professional speaker, and podcaster with experience in healthcare, manufacturing, and startups. Mark's new book is The Mistakes That Make Us: Cultivating a Culture of Learning and Innovation. He is also the author of Measures of Success: React Less, Lead Better, Improve More, the Shingo Award-winning books Lean Hospitals and Healthcare Kaizen, and the anthology Practicing Lean. Mark is also a Senior Advisor to the technology company KaiNexus.


    1. Very good post, Mark. The trouble is that the pilot will probably be blamed for this.

      Is it his fault? No.

      “Should not” roughly translates to: We don’t think it’s a good idea but it’s up to you.

      Maybe this isn’t actually an error? By leaving things open like this, blame can easily be placed on the pilot, which protects the airline’s bottom line.

    2. There is another aspect to this problem. When flying through clouds, the pilot has no outside visual reference and must depend upon instruments for orientation. On autopilot, it is unlikely that the pilot would be attending to the airspeed indicator, artificial horizon, altimeter, and vertical speed indicator. When the autopilot shuts off, especially if the stick has just moved forward, it would take several seconds for the pilot to check the instruments, get reoriented, and move the controls in the proper directions. The likelihood of panic is very high.

      This may also be a case of over-automation. There is no way to program automation for every conceivable situation. This is why pilots exist. It may be that the stick shaker and stick pusher created more problems than they solved.

      The automation can induce complacency as well and that probably contributed. As a former pilot of light aircraft, I find it hard to imagine putting my life in the hands of an autopilot under such conditions. However, I can understand how continual flying with more sophisticated autopilot would induce a complacency that one would not have with more primitive systems.

      A lot of this applies to factories as well. More factories are ruined by over-automation than by insufficient automation.

    3. Federal Aviation Regulations:

      Sec. 91.3 – Responsibility and authority of the pilot in command.

      (a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.

      (b) In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency.

      (c) Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.

      Paragraph (b) recognizes that it is not possible to write a set of rules to cover every imaginable situation. Checklists and procedures are very important, but so is the use of human judgment, which must sometimes override them.

    4. As soon as the “cause” is said to be personal error due to not following guidelines, the very next question I want answered is: how often is that guideline actually followed? And how often are other, similar guidelines followed? Are they just dead guidelines sitting on paper, there to assign blame once something goes wrong, or are they part of an active strategy to manage how work gets done?

    5. Mark,

      Another great post. As a former private pilot myself, I think back to the NOTAMS (notices to airmen) that highlight air accidents. The vast majority of accident investigations cite “pilot error” as the primary cause. I have many thoughts on this, but, relative to your post, much of flying is about judgment. In some cases, too much latitude is afforded the pilot because the standard is intentionally ambiguous to cover a range of conditions.

      This post struck a chord. It makes me want to go to every ISO document in my clients’ factories and highlight every instance of “should” as a judgment land mine toward which great scrutiny must be applied, since standard work is poorly defined.

