I wonder how often things like that happen.
The launch could have gone right, and no one would have known anything about the decision process besides a few insiders. I am sure that on a project as complex and as risky as the Space Shuttle, there is always an engineer who is not satisfied with some aspect, for some valid reason. But at some point, one needs to launch the thing, despite the complaints. How many projects luckily succeeded after a reckless decision?
In many accidents, we can point at an engineer who foreshadowed it, as is the case here. Usually followed by blaming those who proceeded anyway. But these decision makers are in a difficult position. Saying "no" is easy and safe, but at some point, one needs to say "yes" and take risks, otherwise nothing would get done. So, whose "no" do you ignore? Not Allan's, apparently.
Often.
I used to run the nuclear power plant on a US Navy submarine. Back around 2006, we were sailing somewhere and Sonar reported that the propulsion plant was much, much louder than normal. A few days later we didn't need Sonar to report it, we could hear it ourselves. The whole rear half of the ship was vibrating. We pulled into our destination port, and the topside watch reported that oil pools were appearing in the water near the rear end of the ship. The ship's Engineering Officer and Engineering Department Master Chief shrugged it off and said there was no need for it to "affect ship's schedule".
I was in charge of the engineering library. I had a hunch and I went and read a manual that leadership had probably never heard of. The propeller that drives the ship is enormous. It's held in place with a giant nut, but in between the nut and the propeller is a hydraulic tire, a toroidal balloon filled with hydraulic fluid. Clearly it had ruptured. The manual said the ship was supposed to immediately sail to the nearest port and the ship was not allowed to go back out to sea until the tire was replaced. I showed it to the Engineer. Several officers called me in to explain it to them.
And then, nothing. Ship's Schedule was not affected, and we continued on the next several-week trip. Before we got to the next port, we had to limit the ship's top speed to avoid major damage to the entire propulsion plant. We weren't able to conduct the mission we had planned because the ship was too loud. And the multiple times I asked what the hell was going on, management literally just talked over me. When we got to the next port, we had to stay there while the propeller was removed and remachined.
Management doesn't give a shit as long as it doesn't affect their next promotion.
Don't even get me started on the nuclear safety problems.
The correct answer in that case is to go to the Inspector General. That's what they're there for. Leaders sweeping shit under the rug that ends up crippling a fleet asset and preventing tasking from higher is precisely the kind of negligence and incompetence the IG is designed to root out.
And I say that as a retired officer.
How long retired? Things have gone in what can only be described as an incomprehensible, unfathomable direction in the last decade or so. The parent post is not surprising in the least.
Politics is seeping where it doesn't belong.
I am very worried.
Tell us more... what has happened?
To a first approximation: https://www.youtube.com/watch?v=KZB7xEonjsc
Less funny in real life. Sometimes the jizzless thing falls off with impeccably bad timing. Right when things go boom. People get injured (no deaths yet). Limp home early. Allies let down. Shipping routes elongate by a sad multiple. And it even affects you directly as you pay extra for that Dragon silicon toy you ordered from China.
Just google the Red Hill failure.
The Navy's careerist, bureaucratic incompetence is staggering. No better than Putin's generals who looted the military budget and crippled his army so they couldn't even beat a military a fraction of their size.
Honest question: what are the plausible outcomes for an engineer who reports this kind of issue to the IG?
I'm guessing there's a real possibility of it ending his career, at least as a member of the military.
I want to be pro-nuclear energy, but I just don't think I can trust the majority of human institutions to handle nuclear plants.
What do you think about the idea of replacing all global power production with nuclear, given that it would require many hundreds of thousands of loosely-supervised people running nuclear plants?
There's also the issue of force majeure - war, terrorism, natural disasters, and so on. Increase the number of plants and not only can you not really maintain the same level of diligence, but you also increase the odds of one ending up in an unfortunate location or event.
There's also the issue of the uranium. Breeder reactors can help increase efficiency, but they bump up all the complexities/risks greatly. Relatively affordable uranium is a limited resource. We have vast quantities of it in the ocean, but it's not really feasible to extract. It's at something like 3.3 parts per billion by mass. So you'd need to filter a billion kg of ocean water to get 3.3kg of uranium. Outside of cost/complexity, you also run into ecological issues at that scale.
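To put rough numbers on that scale (a back-of-the-envelope sketch: the 3.3 ppb figure is from above, but the ~200 tonnes of natural uranium per GWe-year is an assumed ballpark for a conventional reactor, not something from this thread):

    # Back-of-the-envelope: seawater needed per reactor-year of uranium.
    # Assumptions: ~3.3 ppb uranium by mass (from above) and roughly
    # 200 tonnes of natural uranium per GWe-year (typical LWR ballpark).
    URANIUM_FRACTION = 3.3e-9      # kg of uranium per kg of seawater
    U_PER_GWE_YEAR_KG = 200_000    # assumed annual demand, kg

    seawater_kg = U_PER_GWE_YEAR_KG / URANIUM_FRACTION
    seawater_km3 = seawater_kg / 1_000 / 1e9  # ~1000 kg per m^3, 1e9 m^3 per km^3

    print(f"{seawater_kg:.1e} kg of seawater, about {seawater_km3:.0f} km^3")
    # -> 6.1e+13 kg of seawater, about 61 km^3

Even with perfect extraction, that's tens of cubic kilometers of ocean water per reactor per year, which is where the ecological issue comes in.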
Considering that one Chernobyl-scale accident per year would kill fewer people than global coal power does, I personally would be for it.
Is this a different phenomenon though? It seems that there's a difference between an informed risk assessment and not giving a fuck or letting the bureaucratic gears turn and not feeling responsible. Like there's a difference between Challenger and Chernobyl.
But, maybe someone can make a case that it's fundamentally the same thing?
I would make the case that it's fundamentally the same thing.
In both cases, there were people who cared primarily about the technical truth, and those people were overruled by people who cared primarily about their own lifestyle (social status, reputation, career, opportunities, loyalties, personal obligations, etc.). In Allan McDonald's book "Truth, Lies, and O-Rings" he outlines how Morton Thiokol was having a contract renewal held over their head while NASA Marshall tried to maneuver the Solid Rocket Booster production contract to a second source, which would have seriously affected MT's bottom line and profit margins. There's a strong implication that Morton Thiokol was not able to adhere to proper technical rationale and push back on their customer (NASA) because if they had they would have given too much ammunition to NASA to argue for a second source for the SRB contracts. (In short: "you guys delayed launches over issues in your hardware, so we're only going to buy 30 SRB flight sets from you over the next 5 years instead of 60 as we initially promised.")
I have worked as a NASA contractor on similar issues, although much less directly impacting the crews than the SRBs. You are not free to pursue the smartest, most technically accurate, quickest method for fixing problems; if you introduce delays that your NASA contacts and managers don't like, they will likely ding your contract and redirect some of your company's work to your direct competitors, who you're often working with on your projects.
If you're EB, why replace a hydraulic bushing when you can wait, and replace it but also have to repair a bunch of damage and make yourself a nice big extra chunk of change off Uncle Sam?
If you're ship's captain...why not help secure a nice 'consulting' 'job' at EB after retiring from the navy by helping EB make millions, and count on your officers to not say a peep to fleet command that the mess was preventable?
Not in my experience. Saying no to something major when others don’t see a problem can easily be career-ending.
Easily be career ending? That's a bit dramatic, don't you think? Someone who continuously says no to things will surely not thrive and will probably eventually leave the organization, one way or the other; that's probably right.
Ask Snowden.
Saying no isn't what ended his career.
Within NatSec, saying No to embarrassing the government is implied. Ceaselessly.
Equally implied: The brutality of the consequences for not saying no.
Not even slightly dramatic. I have seen someone be utterly destroyed for trying to speak out on something deeply unethical a state was doing, and is probably still doing.
He was dragged by the head of state in the press and televised announcements, became untouchable overnight - lost his career, his wife died a few days later while at work at her government job in an “accident”. This isn’t in some tinpot dictatorship, rather a liberal western democracy.
So - no. Career-ending is an understatement. You piss the wrong people off, they will absolutely fuck you up.
I have long thought that there ought to be an independently funded International Association for the Protection of Whistleblowers. However, it would quickly become a primary target of national intelligence agencies, so I don't know how long it would last.
Everyone seems to be reading this too simply. In fact, stupidly.
The conceptually easiest answer to the risk of asserting that you are certain is simply not to assert that you are certain.
They aren't saying it's easy to face your bosses with anything they don't want to hear.
Isn't the definition of "easy" or "hard" that includes the external human pressures the less simple/stupid one? What is the utility of a definition of "easy" that assumes that you work in complete isolation?
Context.
The context to this conversation is the launch of a space shuttle that's supposed to carry a teacher to space. It has both enormous stakes and enormous political pressure to not delay/cancel. I'm unsure why that context makes the spherical cow version of "easy" a sensible one.
Do they? Even if risks are not mitigated and, say, the risk of catastrophe can't be pushed below 15%? This ain't some app startup world where failure will lose a bit of money and time, and everybody moves on.
I get the political forces behind it; nobody at NASA was/is probably happy with those, and most politicians are basically clueless clowns (or worse) chasing popularity polls and often wielding massive decision-making power over matters they barely understand at a surface level.
But you can't cheat reality and facts, any more than you can in, say, a casino.
Maybe it's a bad analogy given the complexity of a rocket launch, but I always think about European exploration of the North Atlantic. Huge risk and loss of life, but the winners built empires on those achievements.
So yes, I agree that at some point you need to launch the thing.
This sounds like you are saying colonialism was a success story?
For the ones doing the colonizing? Overwhelmingly yes. A good portion of the issues with colonizing is about how the colonizing nations end up extracting massive amounts of resources for their own benefit.
In context, it sounds like you think that the genocide of indigenous peoples was totally worth it for European nations and that callous lack of concern for human life and suffering is an example to be followed by modern space programs.
I'd like to give you the benefit of the doubt and assume that's not what you meant; if that's the case, please clarify.
You are not reading the context correctly. The original point was that establishing colonies was very risky, to which whyever implied that colonialism was not a success story. But in fact it was extremely successful from a risk analysis point of view. Some nations chose to risk lives and it paid off quite well for them. The nuance of how the natives were treated is frankly irrelevant to this analysis, because we're asking "did the risk pay off", not "did they do anything wrong".
This comment sounds an awful lot like you think the genocide of indigenous peoples is justified by the fact that the winners built empires, but I'd like to assume you intended to say something better. If you did intend to say something better, please clarify.
I would somewhat agree with the first launch, first moon mission and so on, but the N-th in a row ain't building no new empires. It's business as usual.
I think ultimately the problem is one of accountability.
If the risks are high and there are a lot of warning signs, there needs to be strong punishment for pushing ahead anyway and ignoring the risk.
People in powerful positions are far too often cavalier with the lives or livelihoods of the many people they are supposed to be responsible for, and we let them get away with being reckless.
A lot of people are taking issue with the fact that you need to say yes for progress. I don’t know how one could always say no and expect to have anything done.
Every kind of meaningful success involves negotiating risk instead of seizing up in the presence of it.
The shuttle probably could have failed in 1,000 different ways and eventually, it would have. But they still went to space with it.
Some risk is acceptable. If I were to go to the moon, let’s say, I would accept a 50% risk of death. I would be happy to do it. Other people would accept a risk of investment and work hour loss. It’s not so black or white that you wouldn’t go if there’s any risk.
No offense, but this sounds like the sayings of someone who has never faced a 50% chance of death.
It's a little different 3 to 4 months out. It's way different the night before and the morning of. Stepping "in the arena" with odds like those, I'd say the vast, vast majority will back out and/or break down sobbing if forced.
There’s a small percent who will go forward but admit the fact that they were completely afraid- and rightly so.
Then you have that tiny percentage that are completely calm and you’d swear had a tiny smile creeping in…
I’ve never been an astronaut.
But I did spend three years in and out of Bosnia with a special operations task force.
Honestly? I have a 1% rule. The things that might have a 20-30% chance of death are clearly stupid, and no one wants to do them. Things with a one-in-a-million probability aren't gonna catch ya. But I figure that if something does, it's gonna be an activity that I do often but that has a 1% chance of going horribly wrong, and that I'm ignoring.
The space program pilots saw it. And no, I would not have flown on those rockets. After all, NASA would "man rate" a new rocket design with only one successful launch.
Doing something that has a one percent chance of killing you 69 times will kill you about 50% of the time (0.99^69 ≈ 0.5).
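A quick sanity check of that arithmetic in Python (nothing assumed beyond the 1% per-attempt figure):

    # Chance of dying at least once over n repeats of a 1%-lethal activity.
    p_die_once = 0.01
    n = 69
    p_die_overall = 1 - (1 - p_die_once) ** n
    print(f"{p_die_overall:.3f}")  # -> 0.500, roughly a coin flip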
The key thing with Challenger is that the engineers working on the project estimated the risk to be extremely high and refused to budge, eventually being overruled by the executives of their company.
That's different than the engineers calculating the risk of failure at some previously-defined-as-acceptable level and giving the go-ahead.
It's possible you're just suicidal, but I'm reading this more as false internet bravado. A 50% risk of death on a mission to space is totally unacceptable. It's not like anyone will die if you don't go now; you can afford to take the time to eliminate all known risks of this magnitude.
Do they though? If the Challenger launch had been pushed back, what major effects would there have been?
I do get your general point but in this specific example it seems the urgency to launch wasn’t particularly warranted.
An administrator would’ve missed a promotion.
I think it’s not even a missed promotion but a perceived risk of one- which may or may not be accurate.
You need to establish which complaints can delay a launch. The parent comment is arguing that you need to set some kind of threshold on that. In practice, airplanes fly a little bit broken all the time. We have excellent data and theory and failsafes which allow that to be the case, but it's written in blood.
The point is it's not just the Challenger launch. It's every launch.
I'm wondering how the two astronauts on the ISS feel about that while Boeing decides if/when it is safe to return them to Earth.
https://www.cnn.com/2024/06/18/science/boeing-starliner-astr...
Presumably about the same as they did prior to their first launch. Space travel is not like commercial air travel. This is part of the deal.
Hard disagree. The idea that the machinery your life will depend on might be made with half-assed safety in mind is definitely not part of the deal.
Astronauts (and anyone intelligent who intentionally puts themselves in a life-threatening situation) have a more nuanced understanding of risk than can be represented by a single % risk of death number. "I'm going to space with the best technology humanity has to offer keeping me safe" is a very different risk proposition from "I'm going to space in a ship with known high-risk safety issues".
It's definitely built in. The Apollo LM's skin was 0.15 mm-thick aluminum, meaning almost any tiny object could've killed them.
The Space Shuttle flew with SRBs that were solid-fuel and unstoppable when lit.
Columbia had 2 ejection seats, which were eventually taken out and not installed on any other shuttle.
Huge risk is inherently the deal with space travel, at least from its inception until now.
Nobody can afford the best technology humanity has to offer. As one adds more 9's to the odds of success, the cost increases exponentially. There is no end to it.
My understanding of the Space Shuttle program is that there were a lot of times they knew they probably shouldn't fly, or try to land, and they lucked out and didn't lose the orbiter. It is shocking they only lost two ships out of the 135 Space Shuttle missions.
The safety posture of that whole program, for a US human space program, seemed bad. That they chose to use solid rocket motors shows that they were willing to compromise on human safety from the get-go. There are reasons there hasn't ever been even one other human-rated craft to use solid rocket motors.
Except SLS?
Not that I think it's a good thing, but...
I forgot about the SLS until after I wrote that. SLS makes most of the same mistakes, plus plenty of new expensive ones, from the Space Shuttle program. SLS has yet to carry a human passenger though.
It's mind-boggling that SLS still exists at all. At least $1B-$2B in costs whether you launch or not. A launch cadence measured in years. $2B-$4B if you actually launch it. And it doesn't even lift more than Starship, which is already launching almost quarterly. This is before we even talk about reusability, or that a reusable Starship + Super Heavy launch would only use about $2M of propellant.
It happens extremely frequently because there is almost no downside for management to override the engineers' decision.
Even in the case of the Challenger, no single article says WHO the executive was that finally approved the launch. Nobody was jailed for gross negligence. Even Richard Feynman felt that the investigative commission was biased from the start.
So, since there is no "price to pay" for making these bad calls, they are continuously made.
Jailing people means you'll have a hard time finding people willing to make hard decisions, and when you do, you may find they're not the right people for the job.
Punishing people for making mistakes means very few will be willing to take responsibility.
It will also mean that people will desperately cover up mistakes rather than being open about them, meaning the mistakes do not get corrected. We see this in play where manufacturers won't fix problems, because fixing a problem is treated as an admission of liability for the consequences of those problems, and invites punishment.
Even the best, most conscientious people make mistakes. Jailing them is not going to be helpful, it will just make things worse.
Jailing people means you'll have a hard time finding people willing to make hard decisions,
Why do you think you want it? You don't want it.
That's the thing I always wonder about these things.
It's fun and easy to provide visibility into whoever called out an issue early when it does go on to cause a big failure. It gives a nice smug feeling to whoever called it out internally, the reporters who report it, and the readers in the general public who read the resulting story.
The actual important thing that we hardly ever get much visibility into is - how many potential failures were called out by how many people how many times. How many of those things went on to cause a big, or even small, failure, and how many were nothingburgers in the end. Without that, it's hard to say whether leaders were appropriately downplaying "chicken little" warnings to satisfy a market or political need, and got caught by one actually being a big deal, or whether they really did recklessly ignore a called-out legitimate risk. It's easy to say you should take everything seriously and over-analyze everything, but at some point you have to make a move, or you lose. You don't get nearly as much second-guessing when you spend too much time analyzing phantom risks and end up losing to your competitors.
I'm not sure that's important at all. Every issue raised needs to be evaluated independently. If there is strong evidence that a critical part of a space shuttle is going to fail there should be zero discussion about how many times in the past other people thought other things might go wrong when in the end nothing did. What matters is the likelihood that this current thing will cause a disaster this time based on the current evidence, not on historical statistics
The point where you "have to make a move" should only come after you can be reasonably sure that you aren't needlessly sending people to their deaths.
What makes you say it "could have gone right"? From what came out about the o-rings behavior at cold temperatures, it seems they were taking a pretty big risk. Your perspective seems to be that it's always a coin toss no matter what, and I don't think that is true. Were there engineers speaking up in this way at every successful launch too?
I think what they were saying, especially given the phrasing “How many projects luckily succeeded after a reckless decision?”, is that if things hadn't failed we would never have known, and thus we can't tell how many other failures of procedure/ethics we've simply never seen because the worst case failed to occur.
True, but that is for cases where you take the risk yourself. If the Challenger crew had known the risk and decided "fuck it, it's worth it", that would have been different than a bureaucrat chasing a promotion.
Especially when that bureaucrat probably suffered no consequences for making the wrong call. Essentially letting other people take all of the risk while accepting none. No demotion, no firing, and even if they did get fired they probably got some kind of comfy pension or whatever
It's a joke
I've always thought the same, that something like space travel is inherently incredibly dangerous. I mean surely someone during the Apollo program spoke out about something. Like landing on the moon with an untested engine being the only way back for instance.
Nixon even had an 'if they died' speech prepared, so someone had to have put the odds of success at less than 100%.
Neil Armstrong figured that he only had a 50% chance of making it back from the moon alive.
Saying "no" is easy and safe in a world where there are absolutely no external pressures to get stuff done. Unfortunately, that world doesn't exist, and the decision makers in these kinds of situations face far more pressure to say "yes" than they do to say "no".
For example, see the article:
Can't we apply the same logic to the current Starliner situation? There's no way it should have launched, but someone browbeat others into saying it was an acceptable risk to go ahead with the launch given the known issues. Okay, so the launch was successful, but other known and suspect issues then caused problems after launch, to the point that they are not positive it can return. So, should it have launched? Luckily, at least to this point, nobody has been hurt/killed, and the vehicle is somewhat still intact.