
Three Implications of the 737 Max Crashes


Previously on this topic: “Is It Time to Worry About the Boeing 737 Max?”, “A Shorter Guide to the Ethiopian Tragedy and the 737 Max,” “What Was On the Record About Problems With the 737 Max,” and “‘Don’t Ground the Planes, Ground the Pilots.’” Despite the nightmare and tragedy of the situation, I am grateful to the many informed readers who have written in.

The dispatches below contain no “news,” in the strict sense—information that would give us new certainty about what has happened in the Indonesian or Ethiopian crashes. But they illustrate themes that have come through a lot of the responses I have seen. I’ll quote three samples:

1) An asset of the air-travel culture: its striking safety-consciousness, compared even with the medical realm. A reader in the Northeast writes:

There is a larger concept that you brought up that I think bears looking at.

I don’t like to harp on it, but the latest estimates are that large numbers of Americans die each year from medical errors. [JF note: the estimates of how many people die from outright error or other “iatrogenic” causes—maladies that come from simply being in a hospital—vary widely, but some estimates are very large. An article that quotes a Johns Hopkins study saying it could be 250,000 fatalities per year, or more, is here. If that were true, it would be equivalent to several fully loaded airliners crashing every day. For the moment, details of these estimates don’t matter. The reader’s point is the contrast between the hoped-for safety standards of the aerospace system and most other realms of life.]

My intention is not to say how pilots are so existentially “good” and doctors are so existentially “bad,” but to look at the functional reasons for the discrepancy between medicine and aviation. Although I already understood what those functional differences were, it was driven home quite forcefully last year when I took a friend for an appointment at a major hospital …

When I took my friend to the hospital appointment, it was a non-stop parade of procedural errors from start to finish. Thankfully no medical procedures were involved, so it was more of a comedy than a tragedy.

The point, though, is that looking at it from an aerospace perspective, what was essentially 100% absent was the entire “flight operations” layer of organization. Lots of doctors and nurses, and lots of people in the billing department, but it was like sending a rocket up, or taking a flight, when the only people involved were the pilots and the booking agents.

I have always been fascinated by the Apollo program. There, it wasn’t just the actual flight-control level; there was also an entire operational management layer. [People were in charge of] “what do you do with a Saturn V rocket from the time it shows up on a barge on the Banana River to the time it takes off?” At NASA Kennedy there was an entire organization devoted to addressing that exact question.

At the hospital there was a similar question as to what to do when a patient shows up. The problem being, at the hospital absolutely no one was in charge of that question. My friend was shuttled around from desk to desk, no one knowing what she was doing there or what she had already done or where she was to go next … This was at a very prestigious college teaching hospital, which had a new building that looked more like Disneyland than a hospital, but there was a total absence of that entire operational-management level.

When the doctor finally showed up almost by accident, she knew absolutely nothing of what my friend had already done or how she was to be managed throughout her stay. I am sure the doctor knew a lot about medicine, and I am sure the billing department would be quick to get their money, but no one involved knew what was happening to the patient.

Not only did they not know what was happening to the patient, they didn’t even know they didn’t know. They simply assumed that all of their agents were existentially infallible, and that that alone would ensure the patient’s successful outcome.

Your discussion in your latest article about the NASA-run ASRS is important. The point being that there is nothing even remotely like that in the medical system. For that matter, there is nothing even remotely like that in the political system (except for very rare instances of good reporting). The Founding Fathers set up a specific mechanism of checks and balances, but that is more “honored in the breach” these days than not, as the courts get packed with ideological toadies and the legislature abandons its inherent powers in favor of hiding in the shadows.

I believe it does come down to a basic philosophical approach to life. If one assumes that we are all imperfect actors, yet we are tasked not to fail, then there are ways of addressing that. That is how the entire aerospace industry is organized. It is just assumed that everyone involved has a 5% error rate, yet the operation as a whole requires a 0.0001% error rate. You do that with an enormous and professional flight-operations layer of the organization. Interestingly enough, the body does the same thing. A good 10% of the energy you spend every day is just in maintenance and repair …

Doctors and politicians, however, think they are gods who are never wrong. So the medical industry kills 1,000 Americans every day through errors, and the country as a whole invades the wrong country and kills 3,000 of the best American kids it can find, to the tune of a trillion dollars down the hole and the destabilization of a major part of the world.

If you believe you are always right you are headed for a fall. If you believe that you and everyone else are imperfect actors, then you can get to the moon, and fly over 700 million passengers a year with only a handful of fatalities….
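
The reader’s arithmetic is worth spelling out. Below is a minimal sketch, in Python, of how layered checking can carry an individual 5 percent error rate toward a system-level 0.0001 percent requirement. It assumes, purely for illustration, that each added review layer is independent and misses the same 5 percent of whatever reaches it; real flight-operations departments are obviously not that tidy.

    # Illustrative arithmetic only: the 5% and 0.0001% figures are the reader's;
    # the assumption that each checking layer is independent is mine.
    individual_error_rate = 0.05    # assumed chance any one person's work contains an error
    target_system_rate = 0.000001   # 0.0001%, the reader's system-level requirement

    residual_rate = individual_error_rate
    layers = 0
    while residual_rate > target_system_rate:
        # Each added independent review layer catches 95% of the errors that reach it.
        residual_rate *= individual_error_rate
        layers += 1

    print(f"{layers} independent checking layers leave a residual error rate of about {residual_rate:.5%}")
    # With these numbers: 4 layers, roughly 0.00003%

The exact numbers matter less than the structure: the improvement comes not from any individual being better, but from the organization assuming that everyone will sometimes be wrong.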


2) A related, notable trait of the aerospace culture: its awareness that its members, especially but not only pilots, will need to work around predictable, inevitable errors and failures in other parts of the system.

A reader on the West Coast, who has taught within the University of California system and who has an immediate family member who flies 737s for an airline, writes:

I had a long conversation across the kitchen table with my spouse about the design of the 737 MAX. We have been in the world of design for over 40 years, 30-some years of that together. We both were at [a major university].

I was struck by the taken-for-granted assumption that this airplane was not really a new plane for pilots because the technology was the same. Yet the interactional space—the cockpit design, the gauges, the additional representational media that were added, etc.—was considerably different. I have designed user experiences for a living for decades, and this is not the same airplane, as the technology has changed considerably.

Further evidence of this difference in the plane is in your quote [from longtime pilot Wally Magathan]:

-“Boeing’s design deficiency [JF note: having the MCAS rely on a single data source, the “angle of attack” indicator, without backup or comparative sensors] sets up the need for pilot training on how to overcome it.”

-“Boeing’s failure to highlight the change resulted in no specific MCAS pilot training.”

I believe the implications, for folks outside of technology development, are not generally understood.

One new aspect of this plane was that it had a design deficiency. Knowing not only what a technology does and doesn't do, but most importantly, what it is poorly designed to do and not do, is essential to a user's successful interaction with a technical system.

You may be aware of the concept of articulation work … the know-how that humans bring to operating technology, including workarounds. I note that this pilot, and I guess pilots in general (as you do not remark on it), routinely take for granted that their performance as a pilot must include learning about design flaws and training to work around them. [JF note: Yes indeed. A huge part of training, for various certificates and ratings, and for regular proficiency checks, is devoted to “What if this goes wrong?” scenarios. What if the engine fails right after takeoff? What if you have to land when all the plane’s electricity, which powers its instruments, has gone out? What if you see this oil-pressure warning light? What if a bird smashes into your windshield? What if the autopilot goes nuts? Etc.—and this is just for private pilots like me.]

This was part of the work I was raised on in my academic career, and I believe anyone dependent upon a complex technical system knows this, but I suspect many readers may not ….

I am surprised that some issues recur yet fail to be seen as foundational. Technical systems that we rely upon daily, for extremely essential functions, have known design flaws that cannot be fixed but must be worked around.


3) A number of pilots and aerospace engineers have written in about one mystery from the ASRS reports I quoted here. This involves the difference between the plane’s autopilot, which can handle nearly all dimensions of the plane’s speed, climb rate, point-to-point navigation, and descent along an instrument approach path, and the new MCAS software, which is specifically designed to push the nose of the 737 Max downward if it senses that the plane’s nose-up pitch is becoming dangerously high.

The mystery is that MCAS is supposed to work only when the plane is hand-flown—that is, under the pilot’s own stick-and-rudder command, with no autopilot. Yet in the ASRS reports, airline pilots coping with automated-pitch problems (presumably caused by MCAS) say they switched off the autopilot, which in theory is a way to keep MCAS operating, rather than switching it on, which theoretically is a way to disable MCAS.
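
For readers trying to keep the two systems straight, here is a minimal sketch, in Python, of the MCAS activation conditions as they have been publicly described: the system is meant to act only in hand-flown, flaps-up flight, and it takes its cue from a single angle-of-attack sensor. The names, structure, and threshold value below are invented for illustration and are not Boeing’s implementation; the point is simply that, as described, an engaged autopilot should keep MCAS quiet, which is exactly what makes the ASRS accounts puzzling.

    # Illustrative sketch only; names and the threshold are hypothetical, not Boeing's code.
    # It encodes the publicly described conditions: MCAS acts in manual, flaps-up flight,
    # based on a single angle-of-attack reading.
    from dataclasses import dataclass

    @dataclass
    class FlightState:
        autopilot_engaged: bool
        flaps_up: bool
        angle_of_attack_deg: float  # in this sketch, read from a single AoA vane

    AOA_LIMIT_DEG = 14.0  # hypothetical "dangerously nose-high" threshold

    def mcas_commands_nose_down(state: FlightState) -> bool:
        if state.autopilot_engaged:
            return False   # as described, autopilot on should inhibit MCAS
        if not state.flaps_up:
            return False   # flaps extended should also inhibit it
        # A single faulty sensor feeding a spuriously high value satisfies this
        # last test, which is the design deficiency the pilots' notes describe.
        return state.angle_of_attack_deg > AOA_LIMIT_DEG

If the reports of pitch-down with the autopilot engaged hold up, then either the first condition does not behave as described, or something other than MCAS was moving the nose.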

What is going on here? One reader explains why this is an important question:

What caught my eye was that if MCAS only operates in manual mode, why did the problem stop when the pilots switched off the autopilot?

The pilots who wrote Report 1) and Report 4) [in the ASRS post] might be wrong about MCAS operation, but that seems extremely unlikely, because the reports are independent.

From Report 1):

“I mentioned I would engage autopilot sooner than usual (I generally hand fly to at least above 10,000 ft.) to remove the possible MCAS threat.”

From Report 4):

“MCAS (Maneuvering Characteristics Augmentation System) is implemented on the 737 MAX … and only operates in manual, flaps up flight.”

If design engineering did not think that MCAS operated during autopilot mode, the Flight Manual could not have included switching the autopilot to OFF as the remedy for an MCAS malfunction.

It is possible that the Lion Air and Ethiopian pilots did not react correctly because they did read and learn from the Flight Manual, and concluded that the autopilot was not the problem.

And, to similar effect:

One thing that these ASRS pilot reports allude to, but which I fear hasn't received enough scrutiny (or at least, disambiguation) in media reports is the distinction between the MCAS system and the “normal” automated flight-control systems (which I'll refer to generically as “autopilot”).

Many reports seem to conflate these two—and indeed some of the initial reporting around the ASRS reports seized on the word “autopilot” as a sort of smoking gun implicating MCAS.

As I’m sure you know from the documents that Boeing has shared with the FAA, MCAS is “supposed” to function only during manual flight phases, with a clean configuration and autopilot off. And it’s clear that many pilots are now aware of this as well—viz: the pilot in the first ASRS report you cited, who stated, “I mentioned I would engage autopilot sooner than usual (I generally hand fly to at least above 10,000 ft.) to remove the possible MCAS threat.” Clearly this pilot was aware that engaging AP should, allegedly at least, disable MCAS.

What, then, are we to make of the reports that state rapid pitch-down upon engaging autopilot? What the ASRS reports don’t seem to tell us is whether a “runaway trim” situation was observed, or if the pitch-down was instead attributable to the autopilot manipulating control surfaces as in normal AP operation. But one would reasonably assume that either of the pilots reporting departures from normal flight upon engaging the AP would have mentioned if the stab trim were in a runaway condition during the pitch-down.

So where does that leave us?

Is MCAS activating even when AP is engaged?

Is there another unknown flaw in the autopilot software that could be responsible for the ASRS reports (which do not seem to be MCAS related) but not Lion Air?

Or is there another system unique to the MAX that could be responsible?

I only add that third possibility because the 737 MAX does include one more system unique to the MAX that can alter vertical speed, and that’s the Elevator Jam Landing Assist feature, which deploys spoilers to assist in vertical-speed control if the elevator were to become jammed. That’s something that’s easily disengaged from the overhead panel—but if it were activated improperly, I’m not sure it’s the first thing a pilot would think of.


“Where does this leave us?” indeed. I quote messages like these both to indicate the depth and sophistication of the process the aviation world is going through right now—not to mention the seriousness and sophistication of professional pilots—and also to indicate that it may be a long time until these questions are understood and resolved.

Bonus: an additional short email has just arrived, which I think is a useful complement to the argument quoted yesterday: “Don’t Ground the Airplanes. Ground the Pilots.” A reader in the Northwest writes:

Mr. Magathan (the C-5 flight instructor)’s notes about “grounding the pilot and not the planes” ring true to me.

I moved to Spokane, Washington, in the past ten years and during that time have started flying [on an airliner] once a month.

Spokane is like Brigadoon and generally fogged in between mid-December and the end of February. Yet I have never had a plane delayed because of fog.

Based on this experience, I was surprised to find out that fog still delays flights in other places; I thought that the problem had been universally resolved.

It turns out that, while all airliners are equipped to land in fog, not all pilots are certified to do so.

It sounds like a very analogous situation to the way Mr. Magathan sees things relating to the MCAS on Boeing 737 Max.

His solution sounds like the right one, in my opinion. Learning scientists have realized that “experts” are experts because they know what to tune out and what is important to focus on. His description of a week in the simulator sounds like exactly the right way to understand why these alarms and sounds are going off, how to tune them out, and how to focus on taking the necessary steps. Practice makes perfect.

An Air Canada Boeing 737 MAX 8 aircraft on the ground on March 13, 2019 (Chris Helgren / Reuters)
