[HN Gopher] A simplified analysis of the Chernobyl accident (2021)
___________________________________________________________________
A simplified analysis of the Chernobyl accident (2021)
Author : _vk_
Score : 15 points
Date : 2025-01-22 11:13 UTC (2 days ago)
(HTM) web link (www.epj-n.org)
(TXT) w3m dump (www.epj-n.org)
| colechristensen wrote:
| The technical reasons why Chernobyl exploded when AZ-5 was
| pressed were only half of the reasons for the disaster.
|
| The other half of the reasons Chernobyl failed were human
| reasons.
|
| It is very important when analyzing large systemic failures not
| to leave these things out.
| SoftTalker wrote:
| Yes, the RBMK was not an inherently safe design (in fact it was
| deeply flawed), but it could be operated safely if you didn't
| deviate from procedures. Which they did, because they were
| never told about the design flaws (it was a state secret).
| timewizard wrote:
| The local engineers were effectively scapegoated by the KGB.
| It was disappointing to see HBO follow the same lead.
| hencq wrote:
| Huh, did you stop watching after episode 1? The HBO series
| was all about how it was the system that caused the
| accident to happen. The fact that it was a known failure
| mode was a major plot point.
|
| It can hardly get more explicit than this scene:
| https://www.youtube.com/watch?v=jBwSuSuGhyk
| kqr wrote:
| Procedures are written under the assumption that the actual
| system behaves like the theoretical system the engineer has
| in their head. It never quite does. There's always a gap, and
| this gap nearly always requires deviating from procedures to
| ensure safe operation.
|
| Deviating from procedures prevents as many accidents as it
| causes. Safety cannot be based on adherence to procedure.
| Safe systems must be designed to take advantage of (and be
| protected against, I suppose) human ingenuity.
| colechristensen wrote:
| This is nonsense, especially this line.
|
| >Deviating from procedures prevents as many accidents as it
| causes.
|
| And they weren't doing some small deviation from procedure.
| They were doing something that was expressly forbidden and
| they knew why it was forbidden. In the time leading up to
| the accident it would be difficult to distinguish between
| what they were doing and trying to intentionally cause a
| meltdown. In a reactor experiencing xenon poisoning, instead of
| shutting down for 24 hours (the procedure), they removed every
| mechanism restraining the chain reaction that they could.
|
| This isn't a smart little deviation, it's pouring gasoline
| on a fire and hoping for a good outcome. It is hard to
| describe how stupid this was.
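|
| For a sense of why the 24-hour wait matters: after a shutdown,
| iodine-135 keeps decaying into xenon-135 faster than the xenon
| itself decays away, so the poison actually gets worse for
| several hours before it clears. A minimal sketch of that well-
| known decay chain in Python (normalized, illustrative numbers,
| not taken from the article):
|
|   import math
|
|   # decay constants in 1/hour (half-lives ~6.6 h and ~9.2 h)
|   lam_i = math.log(2) / 6.6   # iodine-135
|   lam_x = math.log(2) / 9.2   # xenon-135
|
|   # assumed inventories at shutdown, in normalized units
|   iodine, xenon = 1.0, 0.3
|   dt = 0.1                    # time step, hours
|
|   for step in range(int(48 / dt) + 1):
|       if step % 60 == 0:      # print every 6 hours
|           print(f"t = {step * dt:4.1f} h   Xe-135 = {xenon:.3f}")
|       # explicit Euler step with the flux ~0 after shutdown:
|       #   dI/dt = -lam_i * I
|       #   dX/dt =  lam_i * I - lam_x * X
|       iodine, xenon = (iodine - lam_i * iodine * dt,
|                        xenon + (lam_i * iodine - lam_x * xenon) * dt)
|
| Xenon rises for roughly eight to ten hours after shutdown,
| peaks, and then decays; after a day or two it is mostly gone,
| which is what the waiting period buys you.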
| Retric wrote:
| They are referring to a wider range of situations than
| this very specific incident. There are many accidents
| that have been avoided by modifying procedures on the
| fly. Pilots discovering that letting go of the yoke/stick
| resolved some very specific issues is one example.
|
| It generally requires operators to know in great detail how the
| systems work, which is exactly the knowledge that would have
| avoided the specific Chernobyl incident.
| dylan604 wrote:
| To me, Chernobyl is an example of the classic conundrum that
| engineers cannot possibly think of every single weird thing
| that could happen, so nothing can be made perfectly safe. It
| applies to software design just as much as to a nuclear
| reactor. Sometimes it takes an actual failure before something
| can be made safer. Some things are just more consequential when
| they fail, making the learning from failure much more expensive.
| colechristensen wrote:
| Not really. Corners were cut, the ultimate issue which caused
| the explosion was known beforehand, and the operators
| violated several points of standard procedure as well as
| doing several basic unwise things. It was not at all a case
| of unknown edge cases but stupid piled on top of stupid until
| the damn thing exploded.
|
| The biggest, stupidest action was trying to operate a reactor
| very clearly experiencing xenon poisoning, including the many
| unsafe things they did to try to overcome the poisoning. I'm
| pretty sure modern reactors still shut down for 24 hours to
| avoid the xenon issue. This was well known; even without the
| design flaws it was a huge risk, and anyone with an ounce of
| sense would have known not to do what they did leading up to
| attempting to scram the reactor.
| orbital-decay wrote:
| There were two high-level causes, basically:
|
| 1. The failure to scale up education quickly enough. Nation-
| scale nuclear energy was new when the RBMK line was introduced.
| The demand for nuclear engineers skyrocketed, and it was
| impossible to train the required number of professionals to the
| same standards as nuclear scientists in just a few years.
| Meanwhile, the RBMK assumed deeper knowledge of its design than
| they had.
|
| 2. The system that made academicians (the official Academy of
| Sciences title) equivalent to mid-to-large caliber politicians
| within their area of expertise. As a result, Dollezhal's pride
| ran unchecked and prevented him from addressing well-known
| design flaws (which had already caused the 1975 accident).
|
| Neither reason is unique to the USSR at all, and both can be
| learned from (something that is often ignored because "it can't
| happen here").
| Terr_ wrote:
| > When there was xenon poisoning in the upper half of the core,
| the safety rods were designed in such a way that, at least
| initially, they were increasing (and not decreasing) the core
| reactivity.
|
| I wonder if any reactor-design groups do "fuzz testing" on
| simulated models, checking that they can recover from very weird
| states even if it's not clear how the state could have been
| reached in the first place.
|
| For example, having one section just arbitrarily xenon-poisoned,
| while another is arbitrarily too hot.
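|
| Something like this toy sketch is what I have in mind: generate
| arbitrary (even physically implausible) per-section states and
| check that a simulated scram always brings net reactivity down
| without a dangerous excursion along the way. The two-section
| model and its coefficients below are invented purely for
| illustration:
|
|   import random
|
|   def section_reactivity(xenon, temp, rod):
|       # toy reactivity of one core section (arbitrary units):
|       # xenon and rod insertion remove reactivity, heat adds some
|       return 1.0 - 2.0 * xenon - 1.5 * rod + 0.8 * (temp - 1.0)
|
|   def scram_recovers(core, steps=50, prompt_limit=2.0):
|       # drive all rods fully in; fail if net reactivity ever
|       # exceeds prompt_limit on the way, or ends up above zero
|       rods = [s["rod"] for s in core]
|       for _ in range(steps):
|           rods = [min(1.0, r + 1.0 / steps) for r in rods]
|           rho = sum(section_reactivity(s["xenon"], s["temp"], r)
|                     for s, r in zip(core, rods))
|           if rho > prompt_limit:
|               return False
|       return rho < 0.0
|
|   random.seed(0)
|   bad = 0
|   for _ in range(10000):
|       # e.g. one section arbitrarily xenon-poisoned, another hot
|       core = [{"xenon": random.uniform(0.0, 1.0),
|                "temp": random.uniform(0.5, 2.0),
|                "rod": random.uniform(0.0, 1.0)}
|               for _ in range(2)]
|       if not scram_recovers(core):
|           bad += 1
|   print(bad, "of 10000 random states did not recover cleanly")
|
| The interesting output would be the failing states themselves:
| you then have to argue that each one is either unreachable in
| practice or worth a design change.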
| _n_b_ wrote:
| Yes, essentially this happens. PWRs and BWRs have operating
| limits on their power shape derived from doing those kinds of
| analyses.
|
| They tend to be more physical than "arbitrarily xenon-
| poisoned", but they represent a variety of extreme and nominal
| states that form an operating envelope, and then healthy
| margins are applied on top of that.
| xk3 wrote:
| The disaster was a result of fuzzy testing :-)
|
| https://en.wikipedia.org/wiki/Chernobyl_disaster#Safety_test
| cyberax wrote:
| Yes. Reactors are now designed to never have positive feedback
| loops that can result in uncontrollable power spikes. They do
| worst-case simulations to prove that.
|
| The Russian atomic regulator outright shut down a project to
| design a light water breeder reactor, saying that it was not
| going to be licensed. Light water breeder reactors are barely
| theoretically possible, and they require limiting the amount of
| water within the reactor core. That trade-off results in a
| positive void coefficient. It's supposed to be offset by other
| safety features, but better safe than sorry.
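|
| The sign of that coefficient is the whole game. A crude
| illustration with invented numbers (not real RBMK or breeder
| physics): power makes voids, voids feed reactivity back, and
| reactivity drives power.
|
|   def run(void_coeff, steps=30, power=1.0, external_rho=0.01):
|       for _ in range(steps):
|           void = 0.1 * power              # more power, more voids
|           rho = external_rho + void_coeff * (void - 0.1)
|           power *= 1.0 + rho              # crude power response
|       return power
|
|   for coeff in (-1.0, +1.0):
|       print(f"void coefficient {coeff:+.1f}: "
|             f"power after 30 steps = {run(coeff):.2f}")
|
| With the negative coefficient the power settles just above
| nominal; with the positive one the same small reactivity
| insertion runs away, which is roughly what the worst-case
| simulations have to rule out.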
| timewizard wrote:
| Yes, it wasn't intentional, but it happened at Ignalina.
|
| https://en.wikipedia.org/wiki/Ignalina_Nuclear_Power_Plant
___________________________________________________________________
(page generated 2025-01-24 23:01 UTC)