• Sir_Osis_of_Liver@kbin.social
    1 year ago

    I’ve spent 25 years in thermal generation, including some work on nukes, and there are a lot of issues with that video.

    Professional estimators are generally pretty accurate when it comes to thermal plants. A Class D estimate would be ±20%, a Class A ±5%. The problem arises when they have to spin unrealistic numbers in order to be competitive with other technologies. That’s when you get projects coming in at 200% of the estimate.

    Speaking of which, over the history of the industry, the average nuclear reactor in the US has come in at 207% of its initial cost estimate. That number has actually gotten worse going from 1st to 2nd to 3rd generation reactors, even though the designs were simplified to reduce things like the number of pipe runs and control valves, and made greater use of large pre-manufactured sub-assemblies. In order to become economic, reactor sizes have increased, which improves overall efficiency, the theory being that an incremental increase in cost would be offset by the higher output. The trouble was, the increase in cost never ended up being anywhere close to ‘incremental’.
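
    As a rough illustration of what “coming in at 207% of the estimate” means in practice (the 207% figure is from above; the $6B estimate is a made-up number purely for the arithmetic):

```python
# Back-of-the-envelope: "207% of the initial estimate" means the final
# cost is about 2.07x what was originally budgeted.

def final_cost(initial_estimate, pct_of_estimate):
    """Final cost given the outcome expressed as a percentage of the estimate."""
    return initial_estimate * pct_of_estimate / 100.0

# Hypothetical $6B reactor estimate (illustrative, not from the comment):
estimate = 6.0  # $ billions
actual = final_cost(estimate, 207)
overrun = actual - estimate
print(f"Estimated ${estimate:.1f}B, actual ${actual:.2f}B, overrun ${overrun:.2f}B")
```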

    Initial design costs and regulatory approval & fees should typically be in the range of 10–15% of capital costs, and for the most part that’s about where they land in practice. The bulk of the costs and cost overruns remain in construction, construction management, materials, and project financing, especially now that interest rates are rising.
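
    On the financing point, here’s a minimal sketch of interest during construction, assuming a constant annual rate and capital drawn evenly over the build (both assumptions are mine, purely illustrative; real project finance is messier). The point is the compounding: stretch the schedule and the carrying cost grows faster than linearly:

```python
# Hedged sketch: interest accrued on capital drawn in equal installments,
# where each draw compounds from the time it is made until construction ends.
# Rate, capital, and durations below are hypothetical.

def interest_during_construction(capital, rate, years, draws_per_year=1):
    """Total interest accrued over the build period."""
    n = years * draws_per_year
    draw = capital / n
    r = rate / draws_per_year
    # Draw k is made at the start of period k, so it compounds for n - k periods.
    total = sum(draw * (1 + r) ** (n - k) for k in range(n))
    return total - capital

base = interest_during_construction(10.0, 0.06, 6)   # on-schedule build
slip = interest_during_construction(10.0, 0.06, 12)  # build takes twice as long
print(f"IDC at 6 years: ${base:.2f}B; at 12 years: ${slip:.2f}B")
```

    Doubling the schedule in this toy model more than doubles the carrying cost, which is why financing dominates overruns on delayed builds.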

    The regulatory sabotage theory is BS. The French have been the nuclear industry’s biggest proponents since their big construction boom in the 1970s/80s, yet their plants are just as likely to go over budget as anywhere else. The regulations are written based on past accidents and incidents. They’re there for a reason.