Mish: Boeing 737 Max Unsafe To Fly, New Scathing Report By Pilot, Software Designer

Authored by Mike Shedlock via MishTalk,

A pilot with 30 years of flying experience and 40 years of design experience rips decisions made by Boeing and the FAA.

Gregory Travis, a software developer and a pilot of 30 years, wrote a scathing report on the limitations of the 737 and the arrogance of software developers unfit to write airplane code.

Travis provides easy-to-understand explanations, including a test you can do by sticking your hand out a car window to demonstrate aerodynamic stall.

Design shortcuts meant to make a new plane seem like an old, familiar one are to blame.

This was all about saving money. Boeing and the FAA pretend the 737 Max is the same aircraft as the original 737 that flew in 1967, over 50 years ago.

Travis was 3 years old at the time. Back then, the 737 was a smallish aircraft with smallish engines and relatively simple systems. The new 737 is large and complicated.

Boeing cut corners to save money. Cutting corners works until it fails spectacularly.

Aerodynamic and Software Malpractice

Please consider How the Boeing 737 Max Disaster Looks to a Software Developer. Emphasis is mine.

The original 737 had (by today’s standards) tiny little engines, which easily cleared the ground beneath the wings. As the 737 grew and was fitted with bigger engines, the clearance between the engines and the ground started to get a little…um, tight.

With the 737 Max, the situation became critical. The engines on the original 737 had a fan diameter (that of the intake blades on the engine) of just 100 centimeters (40 inches); those planned for the 737 Max have 176 cm. That’s a centerline difference of well over 30 cm (a foot), and you couldn’t “ovalize” the intake enough to hang the new engines beneath the wing without scraping the ground.

The solution was to extend the engine up and well in front of the wing. However, doing so also meant that the centerline of the engine’s thrust changed. Now, when the pilots applied power to the engine, the aircraft would have a significant propensity to “pitch up,” or raise its nose. This propensity to pitch up with power application thereby increased the risk that the airplane could stall when the pilots “punched it.”

Worse still, because the engine nacelles were so far in front of the wing and so large, a power increase will cause them to actually produce lift, particularly at high angles of attack. So the nacelles make a bad problem worse.

I’ll say it again: In the 737 Max, the engine nacelles themselves can, at high angles of attack, work as a wing and produce lift. And the lift they produce is well ahead of the wing’s center of lift, meaning the nacelles will cause the 737 Max at a high angle of attack to go to a higher angle of attack. This is aerodynamic malpractice of the worst kind.

It violated that most ancient of aviation canons and probably violated the certification criteria of the U.S. Federal Aviation Administration. But instead of going back to the drawing board and getting the airframe hardware right, Boeing relied on something called the “Maneuvering Characteristics Augmentation System,” or MCAS.

It all comes down to money, and in this case, MCAS was the way for both Boeing and its customers to keep the money flowing in the right direction. The necessity to insist that the 737 Max was no different in flying characteristics, no different in systems, from any other 737 was the key to the 737 Max’s fleet fungibility. That’s probably also the reason why the documentation about the MCAS system was kept on the down-low.

Put in a change with too much visibility, particularly a change to the aircraft’s operating handbook or to pilot training, and someone—probably a pilot—would have piped up and said, “Hey. This doesn’t look like a 737 anymore.” And then the money would flow the wrong way.

When the flight computer trims the airplane to descend, because the MCAS system thinks it’s about to stall, a set of motors and jacks push the pilot’s control columns forward. It turns out that the Elevator Feel Computer can put a lot of force into that column—indeed, so much force that a human pilot can quickly become exhausted trying to pull the column back, trying to tell the computer that this really, really should not be happening.

MCAS is implemented in the flight management computer, even at times when the autopilot is turned off, when the pilots think they are flying the plane. In a fight between the flight management computer and human pilots over who is in charge, the computer will bite humans until they give up and (literally) die. Finally, there’s the need to keep the very existence of the MCAS system on the hush-hush lest someone say, “Hey, this isn’t your father’s 737,” and bank accounts start to suffer.

Those lines of code were no doubt created by people at the direction of managers.

In a pinch, a human pilot could just look out the windshield to confirm visually and directly that, no, the aircraft is not pitched up dangerously. That’s the ultimate check and should go directly to the pilot’s ultimate sovereignty. Unfortunately, the current implementation of MCAS denies that sovereignty. It denies the pilots the ability to respond to what’s before their own eyes.

In the MCAS system, the flight management computer is blind to any other evidence that it is wrong, including what the pilot sees with his own eyes and what he does when he desperately tries to pull back on the robotic control columns that are biting him, and his passengers, to death.

The people who wrote the code for the original MCAS system were obviously terribly far out of their league and did not know it. How can they implement a software fix, much less give us any comfort that the rest of the flight management software is reliable?

So Boeing produced a dynamically unstable airframe, the 737 Max. That is big strike No. 1. Boeing then tried to mask the 737’s dynamic instability with a software system. Big strike No. 2. Finally, the software relied on systems known for their propensity to fail (angle-of-attack indicators) and did not appear to include even rudimentary provisions to cross-check the outputs of the angle-of-attack sensor against other sensors, or even the other angle-of-attack sensor. Big strike No. 3.
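The rudimentary cross-check Travis says was missing is conceptually simple. Here is a minimal sketch in Python of what comparing the two angle-of-attack (AoA) sensors before acting on them might look like; the function names, thresholds, and voting logic are all my own illustrative assumptions, not Boeing's actual implementation.

```python
# Illustrative sketch of a basic AoA sensor cross-check. All names and
# thresholds here are hypothetical; real flight-control software is far
# more rigorous than this toy example.

AOA_DISAGREE_THRESHOLD_DEG = 5.5  # hypothetical disagreement limit


def aoa_reading(left_deg: float, right_deg: float):
    """Return a trusted AoA value, or None if the two sensors disagree."""
    if abs(left_deg - right_deg) > AOA_DISAGREE_THRESHOLD_DEG:
        # Sensors disagree: trust neither, inhibit automatic trim based
        # on this data, and alert the crew instead.
        return None
    return (left_deg + right_deg) / 2.0


def mcas_should_activate(left_deg: float, right_deg: float,
                         stall_margin_deg: float = 14.0) -> bool:
    """Activate only when both sensors agree AND indicate a high AoA."""
    aoa = aoa_reading(left_deg, right_deg)
    return aoa is not None and aoa > stall_margin_deg
```

The design point is the failure mode: when the sensors disagree, the system does nothing automatic, rather than trimming the nose down on the word of a single, possibly broken, vane.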

None of the above should have passed muster. It is likely that MCAS, originally added in the spirit of increasing safety, has now killed more people than it could have ever saved. It doesn’t need to be “fixed” with more complexity, more software. It needs to be removed altogether.

Numerous Bad Decisions at Every Stage

Ultimately 346 people are dead because of really bad decisions, software engineer arrogance, and Boeing’s pretense that the 737 Max is the same aircraft as 50 years ago.

It is incredible that the plane has two angle-of-attack sensors but the system uses only one. A look out the window was enough to confirm the sensor was wrong.

Boeing also offered “cheap” versions of the aircraft that lacked some warning indicators. Both of the crashed flights involved the cheaper configuration.

An experienced pilot with adequate training could have disengaged MCAS, but on one of the crashed flights the pilots were desperately reading a manual trying to figure out how to do that.

Flight Stall Test

If you stick your hand out the window of a moving car and hold it level to the ground, you have a low angle of attack and there is no lift. Tilt your hand up a bit and you have lift; your arm will rise.

When the angle of attack on an aircraft’s wing is too great, the aircraft enters an aerodynamic stall. The same thing happens with your hand out a car window.

At a steep enough angle your arm wants to flop down on the car door.
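The hand-out-the-window behavior can be captured in a toy model: lift grows roughly linearly with angle of attack until a critical angle, then collapses. This Python sketch uses the 2π-per-radian slope from thin-airfoil theory and a generic 15-degree stall angle; both are textbook approximations for illustration, not 737 figures.

```python
import math

# Toy model of lift vs. angle of attack (AoA), mirroring the hand-out-
# the-window experiment. The 2*pi slope comes from thin-airfoil theory;
# the 15-degree stall angle is a typical rough figure, not a 737 value.

STALL_ANGLE_DEG = 15.0


def lift_coefficient(aoa_deg: float) -> float:
    """Lift rises roughly linearly with AoA, then collapses at stall."""
    if aoa_deg <= STALL_ANGLE_DEG:
        return 2 * math.pi * math.radians(aoa_deg)
    # Past the critical angle the airflow separates from the surface
    # and lift drops sharply -- the arm flops down on the car door.
    return 0.6 * 2 * math.pi * math.radians(STALL_ANGLE_DEG)


# Level hand: almost no lift. Tilted hand: lift grows. Too steep: stall.
for angle in (0, 5, 10, 20):
    print(f"{angle:2d} deg -> CL = {lift_coefficient(angle):.2f}")
```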

The MCAS software overrides what a pilot can see by looking out the window.

Useless Manuals

If you need a manual to stop a plane from crashing mid-flight, the manual is useless. It’s already too late. The pilots had seconds in which to react. Yet, instead of requiring additional training and alerting pilots to the dangers, Boeing put this stuff in a manual.

This was necessary as part of the pretense that a 737 is a 737 is a 737.