The context of “failure is not an option” is important. In the situation where it was uttered, failing to try everything to save the lives in danger was not an option, even if the odds were long. Granted, that mindset is not universal and doesn’t need to be applied to every new undertaking, but is it really acceptable not to try to save lives?
We crash a lot of cars to test them.
It’s not quite the same thing with rockets. You crash cars to find ways of improving them, and you test rockets to find ways of improving them. And part of testing rockets is launching them, with dummy or cheap payloads, a number of times before launching very costly satellites or people.
It also applies to airplanes: they are first flown by test pilots, who gradually fly them in more dangerous environments and situations, either to improve the plane or to learn how to fly it in those situations, so that other pilots can then train to do the same.
Kranz came up with that line in the 1990s, not during the mission. Although “failure is not an option” was certainly the mindset in the MOCR at the time, and rightly so.
I don’t think anyone involved in human spaceflight is of the opinion that it is “acceptable to fail to try to save lives.” That’s a big misunderstanding of the argument, which is, should we accept technological stasis because if we don’t, someone, inevitably, is going to die as a result? The obvious answer is no, we shouldn’t, and so far we haven’t. But the Precautionary Principle is out there, advocated by many. It’s alive and well at the FDA, for example–drugs that could save thousands of lives are withheld for years or decades because of hypothetical risks to a tiny fraction of that number. Meanwhile, those thousands of (not hypothetical) lives are lost as a direct result of that withholding.