In the last few moments, the Federal Aviation Administration (FAA) has ordered the temporary grounding of 171 Boeing 737 MAX 9 aircraft operated by U.S. airlines or in U.S. territory.
The whole history of the decision-making that led to the MCAS system made it clear to anyone who’s ever worked in an engineering organization that more failures were coming. The engineers saying “This is a problem, don’t do it this way” and the management saying “STFU, I’m in charge, do as I say” never, ever leads anywhere good.
The Boeing MCAS story, and the fact that they were not held accountable at all, terrifies me. Not the idea of the augmentation itself; I kinda understand they needed to fit bigger engines onto their existing frame until they could make and certify a new one. It’s not a good solution, but I can understand the business thinking behind it.
Here’s where it goes wrong for me.
First, not documenting the MCAS system in order to cheat the certification process so the plane wouldn’t require recertification. Adding a system that can make trim changes without informing the pilots, and with no documented way to override it, was an accident waiting to happen.
Worse to me is the fact that while the aircraft has two AoA sensors, the MCAS system only takes input from one of them. This is terrifying: there’s no way the software can know its input could be wrong, so it would effectively try to kill people all the while thinking it’s actually doing you a favour.
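To make that failure mode concrete, here’s a minimal sketch, with made-up names and thresholds rather than anything from Boeing’s actual code, of why a controller fed by a single sensor can’t tell a failed vane from a real stall:

    # Illustrative sketch only -- hypothetical logic, not Boeing's actual MCAS code.
    # With a single AoA source, a stuck vane is indistinguishable from a real
    # high-AoA condition, so the controller keeps commanding nose-down trim.

    AOA_STALL_THRESHOLD_DEG = 15.0  # made-up activation threshold

    def mcas_style_update(aoa_deg: float) -> float:
        """Nose-down trim command computed from ONE sensor reading."""
        if aoa_deg > AOA_STALL_THRESHOLD_DEG:
            return -0.5  # keep trimming nose down while AoA reads high
        return 0.0

    # A vane stuck at 70 degrees: the software "correctly" trims nose down forever.
    for _ in range(3):
        print(mcas_style_update(70.0))  # -0.5 each cycle, no cross-check possible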
It was a debacle that should have been investigated further. Now, it’s not fair (although it probably is) to compare Boeing dipping their toes into more flight automation against Airbus. But modern Airbus jets use multiple sensor sources, and when there is a disagreement, they reduce flight protections and inform the pilots, pilots who are trained on the various flight modes that can result. Using just one sensor was a crazy decision, and I bet it was based on cost.
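For contrast, here’s a minimal sketch of that disagreement-handling idea, again with hypothetical names and thresholds rather than Airbus’s actual implementation:

    # Illustrative sketch of the multi-source idea -- hypothetical, not Airbus's code.
    DISAGREE_TOLERANCE_DEG = 5.0  # made-up threshold

    def alert_crew(message: str) -> None:
        # Stand-in for a real cockpit annunciation.
        print(f"ECAM message: {message}")

    def fused_aoa(aoa_left: float, aoa_right: float):
        """Return (aoa, degraded). On disagreement, degrade protections and alert."""
        if abs(aoa_left - aoa_right) > DISAGREE_TOLERANCE_DEG:
            alert_crew("AOA DISAGREE")
            return None, True  # caller drops automatic protections; pilots take over
        return (aoa_left + aoa_right) / 2.0, False

    print(fused_aoa(14.0, 15.0))  # (14.5, False) -- sensors agree
    print(fused_aoa(70.0, 15.0))  # alerts, then (None, True) -- degraded mode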
What’s going on now, though, is more of a general QC/QA situation. Where I think it overlaps with the MCAS situation is that both the lack of redundancy in MCAS sensor input and the lack of QC in general reek of ruthless cost-cutting.
Yeah. Normally there’s a Swiss cheese model, where multiple failures have to line up together to cause a problem. MCAS is the only non-flight-crew-related accident I’ve ever heard of where it was just one thing: one thing failed, the plane flew itself inexorably into the ground, everyone’s dead, the end.
Yeah, it’s a race to the bottom. But we have strict aviation rules across the west for this very reason.
The crash in Japan is actually an example of a failure that fits the Swiss cheese model. I think ultimately most of the blame will fall on the surviving coastguard captain, but everyone involved had a chance to stop that crash. The coastguard captain messed up and entered the runway when he shouldn’t have. Mistake 1. ATC didn’t notice the warning on the monitor that would have drawn attention to this. Mistake 2. The pilots didn’t see the coastguard plane on the runway. Mistake 3. Now, this one is a tough one: with all the bigger planes’ beacon/nav/interior lights, the runway lights, and the airport lighting, it may well have been hard to spot the small plane on the runway. But it had its beacon lights on, and they had the opportunity to see it and abort the landing.
So essentially there were three chances to stop that accident and all three were missed.
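To put rough, made-up numbers on why those layers matter: with independent defences, an accident needs every hole to line up, so the combined probability is the product of the individual failure probabilities, whereas a single-point-of-failure design like MCAS is exposed to the first failure alone.

    # Illustrative Swiss-cheese arithmetic; the probabilities are invented.
    p_incursion   = 1e-3  # crew enters the runway without clearance
    p_atc_miss    = 1e-2  # controller misses the incursion warning on the monitor
    p_pilots_miss = 1e-1  # landing crew fails to spot the aircraft and go around

    # With independent layers, the accident needs all three failures at once.
    p_accident = p_incursion * p_atc_miss * p_pilots_miss
    print(p_accident)  # ~1e-06 -- versus ~1e-03 if one failure alone were enough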
I completely agree: designing a feature on a plane that doesn’t respect this way of thinking is not the behaviour of a responsible aviation company.
And the engineers involved tried to raise the alarm that it was a problem.