Strategic Initiatives · 11,123 stories · 45 followers

Drinking Sugar Is Much Worse Than Eating It for Diabetes Risk

Brigham Young

For years, we've been told that sugar is a major culprit behind the global rise in type 2 diabetes. Now, emerging evidence from BYU researchers adds nuance to that message,...

bogorad · 6 hours ago · Barcelona, Catalonia, Spain

The bad science behind expensive nuclear - Works in Progress Magazine

On 23 May 2025, President Trump signed four executive orders on nuclear power, intended to speed up approvals of and reduce regulatory burdens on new nuclear reactors in America. Buried in one of them was a requirement that the Nuclear Regulatory Commission reconsider its use of ‘Linear No Threshold’ (or LNT). LNT is the hypothesis that the relationship between radiation dose and cancer risk to humans is linear and that there is no truly ‘safe’ level of radiation. It underpins nuclear regulation worldwide and it may be one of the most important rules that almost no one’s ever heard of.

In 2013, GE Hitachi Nuclear Energy, a joint venture between General Electric and Hitachi, applied to build three advanced boiling water reactors in Wales. Fission reactions would boil water into steam, turning a turbine, powering a generator, and producing electricity. This specific design had been employed in four Japanese reactors, which had survived earthquakes of a greater magnitude than have ever hit the UK without posing any threat to workers or the public. 
Even though the reactor had a flawless safety record, the UK’s Office for Nuclear Regulation was not satisfied. Over the course of a four-and-a-half-year process, it demanded a series of design changes. These included the installation of expensive, bulky filters on every heating, ventilation, and air conditioning duct in the reactor and turbine building, a new floorplan for the room that housed the filtration systems, and an entirely new layout for the facility’s ventilation ducts. The purpose of these changes was to reduce radiation discharges from the filters by 0.0001 millisieverts per year. This is roughly the dose a person ingests by eating a single banana.
A CT scan hits a patient with ten millisieverts all in one go. Natural background radiation in the UK or US typically exposes people to two or three millisieverts during the course of a year, and exceeds seven millisieverts per year in Iowa, North Dakota, and South Dakota. A single flight from New York to London exposes a passenger to 0.04–0.08 millisieverts; 0.0001 millisieverts is 1/400 of even the lower end of that range, or about 72 seconds in the air per year worth of radiation.
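To put that 0.0001 millisievert figure in perspective with simple arithmetic, here is a minimal sketch in Python; the reference doses are the ones quoted above, and the eight-hour flight time is an assumption made purely for illustration.

```python
# Toy arithmetic putting the 0.0001 mSv/year filter requirement in context.
# Reference doses are the approximate figures quoted in the article;
# the 8-hour flight duration is an assumption for illustration.

FILTER_SAVING_MSV = 0.0001          # annual dose avoided by the mandated filters
CT_SCAN_MSV = 10.0                  # one CT scan
BACKGROUND_MSV_PER_YEAR = 2.5       # typical UK/US natural background
FLIGHT_MSV = 0.04                   # lower end of a New York-London flight
FLIGHT_HOURS = 8                    # assumed flight time

print(f"Fraction of one CT scan:        {FILTER_SAVING_MSV / CT_SCAN_MSV:.6%}")
print(f"Fraction of annual background:  {FILTER_SAVING_MSV / BACKGROUND_MSV_PER_YEAR:.4%}")

# Seconds of flying that deliver the same dose
seconds_airborne = FILTER_SAVING_MSV / FLIGHT_MSV * FLIGHT_HOURS * 3600
print(f"Equivalent time in the air:     {seconds_airborne:.0f} seconds")  # ~72 s
```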
The regulatory ratchet that makes nuclear unaffordable can be summarized in a single acronym: ALARA. This is the internationally accepted principle that exposure to ionizing radiation – the kinds of radiation produced by x-rays, CT scans, and the radioactive isotopes of elements used in nuclear power plants – should be kept ‘as low as reasonably achievable’. ALARA has been interpreted in major economies like the US, UK, and Germany as meaning that regulators can force nuclear operators to implement any safety improvement, no matter how infinitesimal the public health benefit, provided it meets an ambiguous proportionality standard.
ALARA stems from the Linear No Threshold hypothesis, the theory about how the body responds to radiation that May’s Executive Order took on. Critically, the hypothesis holds that any amount of ionizing radiation increases cancer risk, and that the harm is cumulative, meaning that multiple small doses over time carry the same risk as a single large dose of the same total magnitude. 
In other areas of our lives, this assumption would seem obviously wrong. For example, the cumulative harm model applied to alcohol would say that drinking a glass of wine once a day for a hundred days is equivalent to drinking one hundred glasses of wine in a single day. Or that a jogger who ran a mile a day for a month was putting her body under greater strain than one who ran a marathon in a day. We recognise that the human body is capable of repairing damage and stress done to it over time. 
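To make the assumption concrete: under LNT, excess risk is simply a fixed slope multiplied by cumulative dose, so it makes no difference how the dose is spread out over time. A minimal sketch, using a purely hypothetical risk coefficient:

```python
# Toy illustration of the Linear No Threshold assumption: excess risk is a
# fixed slope times total dose, so fractionation makes no difference.
# The slope below is purely hypothetical, chosen only for illustration.

RISK_PER_SIEVERT = 0.05  # hypothetical excess lifetime cancer risk per sievert

def lnt_excess_risk(doses_msv):
    """Excess risk under LNT for a list of doses in millisieverts."""
    total_sv = sum(doses_msv) / 1000.0
    return RISK_PER_SIEVERT * total_sv

one_big_dose = lnt_excess_risk([100.0])        # a single 100 mSv exposure
many_small = lnt_excess_risk([1.0] * 100)      # 1 mSv a day for 100 days

# Under LNT the two are identical, which is exactly the assumption the
# wine-glass and jogging analogies above call into question.
assert one_big_dose == many_small
print(one_big_dose, many_small)  # 0.005 0.005
```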
But the Linear No Threshold assumption is the orthodoxy in international radiation protection, and its implications in ALARA regulations are among the most significant contributors to nuclear energy’s unaffordability in most of the developed world. Yet these assumptions are not just counterintuitive: they may be unscientific. 

The making of LNT

In 1927, Hermann Muller, a researcher at the University of Texas, published a breakthrough finding on the connection between radiation and genetic changes: fruit fly sperm cells treated with X-rays had a 15,000 percent higher mutation rate than untreated controls. These mutations were stable, heritable, and frequently lethal.
Muller became famous overnight. Researchers began to find similar results in maize, mice, and other organisms. Despite his newfound fame, the Great Depression hit his lab hard. Muller moved from the US to Germany in 1932 and then to the USSR a year later, where the government funded his lab generously. Among the friendships he made during this period was one with the Soviet biologist Nikolai Vladimirovich Timofeeff-Ressovsky.
In 1930, Muller had observed that ‘the frequency of mutations produced is exactly proportional to the energy of the dosage absorbed’, but he had not formally turned it into a dose-response model.
In 1935, Timofeeff-Ressovsky, in collaboration with the German radiobiologist Karl Zimmer and German-American physicist Max Delbrück, released research reaffirming that x-ray induced mutations in Drosophila are directly proportional to radiation dose. They extended the theory by arguing that mutations could result from a single blast of radiation, which would come to be known as ‘hit theory’.
Muller was a strong believer in the power of science to effect social change. In his case, this meant a twin passion for eugenics and socialism. In a 1936 letter to Stalin, he would describe himself as ‘a scientist with confidence in the ultimate Bolshevik triumph’, who believed that ‘human nature is not immutable, or incapable of improvement’. But his stay in the Soviet Union was not a happy one. The rise of Lysenkoism, the pseudo-scientific Soviet alternative to genetics, would result in his eventual return to the US in 1940. 
The atom bombs dropped on Hiroshima and Nagasaki catapulted radiation to the top of the agenda, and Muller was awarded the 1946 Nobel Prize in Medicine. He used his lecture to cite Timofeeff-Ressovsky approvingly and declare that there is ‘no escape from the conclusion that there is no threshold dose’. The Linear No Threshold hypothesis had been born.
Muller’s work was highly influential and would go on to play an outsized role in the regulation of radiation. 
But not everyone was as convinced by its implications as he was, even at the time. Robley D Evans had emerged in the 1930s as one of the world’s first experts on the impact of radiation on human health. Though a believer in the potential harms of radiation exposure, he rejected the LNT model that Muller was popularising. 
In 1949, Evans published a paper that attempted to extrapolate the findings from studies on fruit flies, mice, and plants to humans, accounting for the biological differences. He found that even at a radiation dose of 2.5 röntgen per day for 21 days – roughly 25 millisieverts a day, equivalent to two and a half CT scans daily – some organisms did not show any increase in mutations at all. 
Regulations at the time limited radiologists to 0.1 röntgen of exposure a day, after higher rates of cancer and illness had been observed in the profession. Since 2.5 röntgen significantly exceeded these levels, Evans concluded that it is ‘highly improbable that any detectable increase in hereditary abnormalities will result’ from the low levels of exposure they faced.
Muller was unimpressed. He sent Evans a long letter full of criticisms, which Evans derided as containing ‘a few points of scientific interest, and many matters regarding personalities and prejudices’. In Muller’s view, Evans was backed by radiologists and by figures who had a vested interest in minimising the dangers of radiation because of their association with America’s gung-ho nuclear regulator, the Atomic Energy Commission (AEC).
Initially, none of these abstruse debates about fruit flies seemed to matter. Popular attitudes to radiation were cavalier and many physicians believed that radiologists were just disproportionately weak or sickly. X-rays were used routinely in shoe-fittings and for hair removal, radium-enhanced tonic was sold for medicinal purposes, and radium was routinely infused in cosmetics. American consumers could buy radium-infused handwash with the ominous slogan of ‘takes everything off but the skin’. 
When the first nuclear power stations came online in the US in 1957, there were rules around radiation exposure for plant workers, but nothing governing background radiation around facilities. The civilian application of nuclear energy was initially uncontentious, but optimism would rapidly drain away. The US government, the technology’s biggest champion, would soon prove to be a liability. Nuclear energy would be crippled by events with only a tangential connection to the industry.

With friends like these

Over the course of the 1950s, the US conducted well in excess of 100 nuclear weapons tests, either in Nevada or in sites dotted around the Pacific Ocean. This was overseen by the AEC, which was in the odd position of both regulating civilian nuclear power and running atomic weapons testing. It was both the nuclear industry’s main promoter in the US and its regulator.
In 1953, fallout from a test in Nevada led to a number of local sheep falling ill and then dying. Then in March 1954, a test at Bikini Atoll in the Marshall Islands went seriously wrong. Castle Bravo was (and remains) the most powerful nuclear device that the US ever tested, roughly 1,000 times more powerful than the atomic bomb dropped on Hiroshima. Not only did it produce more fallout than anticipated, but a sudden shift in wind speed and direction caused the fallout to spread significantly further than intended, raining down on nearby islands. 
The small population of Rongelap, an atoll 110 miles from the test site, was the worst hit. The nuclear fallout looked like snow, leading children to play with it, while much of the population ignored it and went about their daily business. The population was hit with 2,000 millisieverts of radiation over three days, significantly more than many Hiroshima and Nagasaki survivors received. While there were no fatalities, significant numbers of people developed skin lesions and alopecia, and leukemia and thyroid cancer rates remain elevated among this population. The US Government evacuated a number of islands in the days after the blast; Rongelap remains uninhabited after a failed return effort. 
Less than a week later, the Associated Press revealed that the crew of the Japanese fishing vessel Lucky Dragon had suffered skin discoloration, hair loss, and nausea. During their voyage home from trawling 100 miles east of Bikini, their eyes and ears had leaked yellow pus. Panic ensued after it transpired that part of the crew’s cargo of tuna and shark meat had been sold before the danger was apparent. Fish prices collapsed and panic spread across the US and Japan as the authorities searched for contaminated fish. 
There would be other scares. From the late 1950s onwards, public fears rose about elevated levels of strontium-90, an isotope produced by nuclear fission, in milk. The levels were never high enough to come close to causing harm, but a panic about children developing bone cancer and leukemia nevertheless spread. In 1956, Democratic presidential candidate Adlai Stevenson proposed a unilateral ban on hydrogen bomb testing to protect Americans from the effects of fallout.
At every turn, the AEC’s instinct was to play down these incidents and to avoid discussing fallout. The full scale of livestock contamination in Nevada would not emerge for decades, after the AEC allegedly altered scientists’ reports to change the causes of death for the animals. Meanwhile, AEC Chairman Lewis Strauss wrongly claimed that Lucky Dragon had been sailing inside the restricted test area, while suggesting that the crew’s injuries ‘are thought to be due to the chemical activity of the converted material in the coral rather than to radioactivity’.
When it came to the Marshall Islanders who had been evacuated, the AEC wrongly implied in its public statements that none of them had suffered real side effects; its acknowledgement that they were exposed to ‘some radioactivity’ scarcely conveyed the levels of radiation that they had encountered.
The AEC’s evasiveness troubled the public, but more importantly, it began to radicalise a section of the scientific community. Geneticists particularly bridled against the AEC’s attempts to push news articles downplaying the health risks of radiation, as well as their attempts to steer the scientific conversation. In a move almost perfectly calibrated to drive ill-feeling in the community, the AEC used its influence to bar Muller from delivering a paper on radiation-induced mutation at the UN’s 1955 Geneva Conference on Peaceful Uses of Atomic Energy. 
Against this backdrop, the US National Academy of Sciences convened a committee to assess the Biological Effects of Atomic Radiation (BEAR) in 1955. The defining feature of BEAR I was its disharmony. The committee was split into separate panels of geneticists and pathologists, whose main activity became feuding with each other. The geneticists, led by Muller, pushed hard for LNT. The pathologists, however, were not believers. Sceptical of attempts to extrapolate to humans from fruit flies, the pathologists believed the geneticists had an overly simplistic view of how diseases developed. 
Both panels’ reports were published, along with a compromise summary and set of recommendations. The summary concluded that ‘except for some tragic accidents affecting small numbers of people, the biological damage from peacetime activities (including the testing of atomic weapons) has been essentially negligible’. However, it also noted that ‘there is no minimum amount of radiation which must be exceeded before mutations can occur’ and ‘the harm is cumulative’, meaning ‘the genetic damage done by radiation builds up as the radiation is received’. This point is critical – it implies that the body has no way to recover from radiation damage. In essence, receiving small doses gradually over time is no safer than receiving the same total amount in one sudden blast.
The report recommended reducing the maximum lifetime cumulative radiation exposure to reproductive cells from 300 down to 50 röntgen (from approximately 300 CT scans to approximately 50), and limiting the total exposure received by a member of the public up to age 30 to ten röntgen.
Media coverage of BEAR I was as nuanced as you’d expect. The front page of the New York Times on 13 June 1956 screamed ‘Scientists term radiation a peril to future of man’, with the subhead ‘even small dose can prove harmful to descendants of victim, report states’. The AEC was berated in the media for having misled the public about the existence of a safe threshold.
Things were going to get worse for nuclear power and the AEC. By the end of the decade, ionizing radiation was under political and scientific siege. 

The war on radiation

In 1957 Chet Holifield, who chaired the congressional Joint Committee on Atomic Energy, complained that he had to ‘squeeze the [fallout] information out of the Agency’ and accused the AEC of having a ‘party line’ of ‘play it down’, asking ‘is it prudent to ask the same agency to both develop bombs and evaluate the risk of fallout?’. International developments were also unhelpful. A 1958 UN report, which had drawn heavily on the work of American scientists, strongly supported LNT.
Another angle of attack opened up in medicine. By the 1950s, x-ray equipment had become widely used in hospitals and most pregnant women in the UK and US received an x-ray at least once during their pregnancy. Between the mid-1930s and mid-1950s, deaths from childhood leukemia doubled in England and Wales. Alice Stewart, an Oxford epidemiologist, doubted the prevailing view that this stemmed from a combination of industrial pollutants and better diagnosis. In 1958, she published an article in the British Medical Journal, presenting survey data showing that children who had been x-rayed in utero were twice as likely to die by the age of nine. While Stewart’s work was met with skepticism, a 1962 US study found that childhood cancer mortality was 40 percent higher among x-rayed children.
The 1950s also saw the birth of modern cytogenetics, the study of chromosomes. Thanks to improved staining and microscopy techniques, scientists finally determined the number of human chromosomes accurately at 46. Scientists established the first links between chromosomal abnormalities and conditions like Down’s syndrome, Turner syndrome, and Klinefelter syndrome. They quickly took an interest in how different radiation doses affected chromosomes. In 1957, Michael Bender of Cold Spring Harbor established that x-rays could induce chromosome aberrations in human tissue cultures. 
Five years later, along with his colleague PC Gooch, Bender took blood samples from volunteers, exposed them to different x-ray doses, and then examined the chromosomes during cell division. Not only did they find that the x-rays caused identifiable chromosome damage, they could predict the amount of damage based on the dose. They found damage at the lowest dose they measured, 50 röntgen, the radiation dose you’d expect from 50 CT scans. 
It’s around this time that the seeds of ALARA – the goal of reducing background radiation from nuclear reactors to a level ‘as low as reasonably achievable’ – were sown. The principle combines the Linear No Threshold view that all ionizing radiation causes harm to humans with the view that it is never worth trading off some health costs against other benefits. 
In a 1959 publication, the International Commission on Radiological Protection swung in a much more conservative direction. Historically, it had been believed that there was a safe threshold, while the long-term genetic effects of radiation sat outside the expertise of most of its membership. Now, however, it recommended that radiation exposure be kept at ‘as low a level as practicable, with due regard to the necessity of providing additional sources of energy to meet the demands of modern society’. 
Petrol was thrown on the fire in 1969, when John Gofman and Arthur Tamplin, two scientists at Lawrence Livermore National Laboratory, started publishing inflammatory claims about radiation exposure and cancer risk. Gofman and Tamplin claimed that if the entire US population were exposed to the Federal Radiation Council and AEC’s safe radiation limits from birth to age 30, it would result in 16,000 additional cancer cases a year. They subsequently revised this number to 32,000. As a result, they believed that man-made radiation exposure limits needed to be cut from 1.7 millisieverts a year to 0.17.
Gofman and Tamplin’s work was significant because of the radiation levels that they attacked. Much of the work discussed above, from Hermann Muller onwards, used levels of radiation a factor of tens, hundreds, or even millions of times greater than natural levels of background radiation – the sorts of levels usually seen only at nuclear weapons test sites or by x-ray technicians exposed to radiation every single day. This was understandable, given that radiation safety began in the worlds of medicine and nuclear weapons testing. It also reflected the statistical challenges of measuring the effects of very low doses. But it also tells us relatively little about nuclear power; Bender and Gooch’s ‘low’ dose is four times higher than the average dose received by the recovery staff who worked on the Chernobyl accident site. 
Gofman and Tamplin’s work was met with skepticism by their peers and was initially ignored. But after Gofman testified before the Senate Subcommittee on Air and Water Pollution and then the Joint Committee on Atomic Energy, the ensuing public fallout led Robert H Finch, the Secretary of Health, Education and Welfare, to establish the Committee on the Biological Effects of Ionizing Radiation (BEIR), which would produce its first report in 1972.
This report reaffirmed LNT, but marked an important shift. BEAR I and II had emphasised genetic risks heavily, but the descendants of Hiroshima and Nagasaki survivors were simply not displaying genetic damage at the rates the geneticists’ modelling on mice or fruit flies suggested they should. In fact, there were no statistically significant differences in birth defects, stillbirths, survival rates, or chromosomal abnormalities versus control groups, either at initial observations in the 1950s or after subsequent follow-ups. Anyone more than about 1,800 metres from the point on the ground directly below the blast did not experience heightened rates of cancer at all. 
BEIR I started a trend, followed by subsequent BEIR reports, of focusing significantly more on the risk of cancer, rather than genetic damage. BEIR I didn’t take a position on the shape of the dose-response curve, but affirmed that even very low radiation doses could have carcinogenic effects. 

The end of nuclear’s golden age

By the end of the 1960s, it was clear that the AEC was living on borrowed time and, along with it, the golden age of the US nuclear industry.
A big change was the growth of environmental consciousness. This had found an unlikely champion in Richard Nixon, who signed the National Environmental Policy Act into law in 1970. This required federal agencies to prepare environmental assessments and impact statements to evaluate their decisions. The AEC was not willing to kowtow and attempted to interpret these rules as narrowly as possible, resulting in a 1971 legal defeat over a planned nuclear plant on Chesapeake Bay. This forced the AEC to suspend new plant licensing for 18 months while it updated its rules. 
It then endured a series of brutal congressional hearings over the course of 1972–73, in which independent experts and internal whistleblowers criticised its approach to regulating the safety and reliability of emergency core cooling systems in nuclear reactors. Witness after witness took the opportunity to attack the AEC for its lack of transparency and for allegedly rushing approvals.
In 1974, the Government decided that it had seen enough and abolished the AEC through the Energy Reorganization Act. In its place, the Nuclear Regulatory Commission (NRC) was established to regulate civilian nuclear activities, while the Energy Research and Development Administration managed weapons research.
The NRC’s institutional culture was markedly different to that of its predecessor. It very much saw itself as a regulator, not an advocate or an enabler. The AEC had already started to ramp up regulation in response to public and political pressure, but the NRC accelerated this trend. It formally adopted ALARA in 1975. This meant that the NRC would not issue a construction or operating licence until the applicant showed that any further shielding or processing equipment would cost unreasonably more than it saved. Inspectors would no longer simply assess whether facilities stayed below dose limits, but also how aggressively they drove doses lower year to year.
The combination of tougher radiation safety standards and new environmental rules caused the costs of nuclear power to spiral in this period. This can clearly be seen in individual projects. New radiation shielding, extra instrumentation, and the relocation of control systems to reduce exposure risk drove up materials bills. The amount of cabling required for a nuclear project in the US jumped from 670,000 yards to 1.3 million yards between 1973 and 1980, while the concrete required increased from 90,000 to 162,000 cubic yards. The number of man-hours per kilowatt of capacity surged from 9.6 in 1972 to 28.5 in 1980. The Sequoyah Nuclear Plant in Tennessee, scheduled for completion in 1973 at a cost of $300 million, was completed for $1.7 billion in 1981, after 23 changes to structure or components were requested by the regulator. 
By 1980 the previous decade’s regulatory changes had driven a 176 percent increase in plant cost. New safety rules had resulted in greater complexity, in turn driving up the materials bill and engineering costs. 
The number of regulatory guides began to climb, and projects would take longer to complete, resulting in higher financing costs. A 1979 Congressional Budget Office study found that a one-month delay in the construction of a new reactor would cost an extra $44 million (in 2025 terms), with half this total coming from interest. The Public Service Company of New Hampshire, the builder of the prospective Seabrook Station, went bankrupt in 1988, after regulatory delays resulted in one unit being completed 14 years after its construction permit was issued and the other being cancelled. It is not surprising that a 1982 Department of Energy report found that utility companies that generated a large share of their electricity from nuclear power tended to have lower bond ratings, even after controlling for earnings and state regulatory quality.
The notorious Three Mile Island accident in 1979, when a reactor in Pennsylvania released radioactive gases and iodine into the environment after a partial meltdown, would worsen the political backlash against nuclear energy. No credible research has found evidence that the accident impacted the health of anyone in the surrounding area. However, the regulatory damage had already been done.
Thanks to its leadership position in the field, debates around radiation science in the US played an outsized role in shaping global standards. In 1977, the International Commission on Radiological Protection adopted its three fundamental pillars of radiation protection that remain in effect to this day: justification, optimisation, and dose limitation. In practice, these pillars mean that any introduction of a new source of radiation, like a new reactor, must first be shown to have a net overall benefit to society, and that all new doses of radiation received by workers and members of the public should be as low as reasonably achievable. 
Governments around the world were adopting ALARA too. Britain was one enthusiastic example. The Health and Safety at Work Act, passed in 1974, adopted a subtly modified formulation of ‘as low as reasonably practicable’ (ALARP). For exposure to be considered ALARP, the regulator can require the inclusion of any measure that it does not rule to be ‘grossly disproportionate’. To resist a requested change, the prospective licensee has to proactively challenge it, which rarely happens in practice, in part because doing so would mean suing the regulator it relies on for its licence to operate.
The European Atomic Energy Community, founded by Belgium, France, Germany, Italy, Luxembourg, and the Netherlands to create a market for nuclear power, adopted ALARA in 1980, but its application across Europe was uneven.
In the 1960s, French nuclear approvals had been determined by a small group of engineers, safety experts, and military scientists working in secrecy, in a process dubbed ‘French cooking’ by Anglo-American observers. This process of ‘technical dialogue’ relied on non-binding individual ministerial directives, safety reports, and guides. It was designed to allow rapid construction, and its flexibility served France’s ambition to become an exporter of nuclear technology. In fact, France did not have codified technical regulations for nuclear safety until the end of the 1970s. 
The system gradually became more formalised and transparent over the course of the 1980s, but the French government largely resisted the regulatory ratchet until the Fukushima disaster in 2011. While this era’s approach would fly in the face of today’s norms around transparency and conflicts of interest, the vast majority of France’s operating nuclear reactors were built under this system during the 1970s and early 1980s. Today, nuclear power generates around two-thirds of France’s electricity, making it the most nuclearized country on earth.
By contrast, West Germany pursued aggressive safety standards and designed a legal framework with significant scope for public consultation and judicial review. In 1981, experts estimated that this was delaying close to $53 billion in nuclear investment (in 2025 dollars). 
The end of the 1970s oil shock, a global collapse in coal prices, and a flatlining in energy demand in most developed countries from the early 1970s all contributed to making nuclear a significantly less attractive commercial proposition. The number of new nuclear reactor projects collapsed, and many that were under construction were cancelled.

The breaking of LNT

The science of radiation safety did not stop in the 1970s. Even as LNT was becoming the regulatory consensus, its scientific basis was beginning to unravel. 
If the human body has ways of healing itself, then lower doses over a sustained period of time seem unlikely to have the same effect as a dose of the same total size taken all at once. Between small doses, the body has time to repair the damage – time it does not get when the whole dose arrives in one go.
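A toy model makes the intuition easier to see. The sketch below assumes, purely for illustration, that some fixed fraction of outstanding damage is repaired each day; the numbers are invented and carry no biological authority.

```python
# Toy model of the intuition above: if the body repairs some fraction of
# outstanding damage each day, many small doses leave less residual damage
# than one acute dose of the same total size. All numbers are invented
# purely for illustration.

DAILY_REPAIR_FRACTION = 0.2   # hypothetical: 20% of outstanding damage repaired per day

def residual_damage(daily_doses):
    """Damage left after a sequence of daily doses, with repair in between."""
    damage = 0.0
    for dose in daily_doses:
        damage += dose                          # new damage from today's dose
        damage *= (1 - DAILY_REPAIR_FRACTION)   # overnight repair
    return damage

acute = residual_damage([100.0])        # 100 units delivered at once
chronic = residual_damage([1.0] * 100)  # 1 unit a day for 100 days

print(f"acute:   {acute:.1f}")    # 80.0 - most of the damage remains
print(f"chronic: {chronic:.1f}")  # ~4.0 - repair keeps pace with the dosing
```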
Scientists at Cold Spring Harbor Laboratory found in 1949 that bacteria could repair damage caused by ultraviolet light once they were exposed to normal light. Even before this, scientists had assumed that cells must have some way to repair damage from radiation, given the amount of naturally occurring background radiation the earth receives from the sun, cosmic rays, and elements in the ground such as radon and uranium.
Watson and Crick’s work on DNA in the 1950s showed that it had a double-helix structure. This allows it to repair one damaged strand using information encoded in the other strand. When UV light or chemicals damage DNA, special proteins locate the damaged section, cut it out, and then fill in the gap with the correct sequence. 
The 1960s and early 1970s saw a series of research breakthroughs showing processes like this at work, fixing both small-scale damage and larger, more disruptive lesions. But these discoveries did not immediately carry over to ionizing radiation, which can break both strands at once, leaving no intact template to copy from. 
The idea of double-strand break repair would not be proposed until the 1980s, after initially promising experiments in yeast led to its exploration in mammalian cells. DNA repair, including double-strand break repair, is now universally accepted. It appears in foundational molecular biology textbooks, while the 2015 Nobel Prize in Chemistry went to three researchers for their study of DNA repair.
We can test LNT at the epidemiological level as well. If there is truly no threshold, we should expect to see higher incidences of cancer among populations that have endured prolonged radiation exposure. But study after study has failed to find this. In 1991, the US Department of Energy commissioned Johns Hopkins University to study the health records of 70,000 nuclear shipyard workers, comparing workers in radiation-exposed areas with workers in other areas. It found no evidence of adverse effects. Johns Hopkins repeated the same study in 2022 with a bigger dataset, looking at over 372,000 shipyard workers from 1957 to 2011. Beyond some asbestos-related illness from early years of the study, they found no evidence of heightened cancer risk in workers working in radiation-exposed areas.
Of course, nuclear shipyard workers could well be fitter and less susceptible to illness than the average member of the public. But some unfortunate 1980s construction in Taiwan provides us with clues. Over the course of 1982–87, steel that had been contaminated with cobalt-60 was recycled into construction materials for flats. Over two decades, 10,000 people occupied these buildings, receiving an average total radiation dose of 400 millisieverts, about twenty times more than the average American would receive over that period. A study found that they suffered lower incidences of cancer death than the general population.
We can also look at regions with high natural background radiation. Thanks to thorium-rich mineral deposits, Kerala in southern India has some of the highest levels of background radiation in the world. In some areas, radiation levels reach up to 30-35 millisieverts per year, compared to the worldwide average natural background radiation of about 2.4 millisieverts per year. Again, no excess cancer risk has been found.
The scientific advisory panels that birthed LNT, like BEIR, have not modified their positions in their most recent reports, but they have acknowledged that considerable uncertainty exists at lower doses. The International Commission on Radiological Protection, for example, has underscored LNT’s total lack of predictive power, warning in 2007 that: ‘Because of this uncertainty on health effects at low doses, the Commission judges that it is not appropriate, for the purposes of public health planning, to calculate the hypothetical number of cases of cancer or heritable disease that might be associated with very small radiation doses received by large numbers of people over very long periods of time.’ Despite this, the ICRP continues to recommend its use, while acknowledging that ‘existence of a low-dose threshold does not seem to be unlikely for radiation-related cancers of certain tissues’. It now uses an adjusted model that assumes radiation delivered at a lower dose rate is half as harmful as the same total dose delivered at a higher rate. This is meant to account for the body’s natural repair processes. 
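The adjustment works by dividing the linear slope by a dose and dose-rate effectiveness factor of two for low doses delivered slowly. A minimal sketch of that arithmetic, using an illustrative risk coefficient rather than an ICRP value:

```python
# Sketch of the factor-of-two adjustment described above: the linear slope
# is halved for low doses delivered at a low dose rate. The nominal risk
# coefficient here is illustrative only, not an ICRP number.

NOMINAL_RISK_PER_SV = 0.10  # hypothetical high-dose, high-dose-rate slope
DDREF = 2.0                 # dose and dose-rate effectiveness factor of two

def excess_risk(dose_sv, low_dose_rate=True):
    slope = NOMINAL_RISK_PER_SV / DDREF if low_dose_rate else NOMINAL_RISK_PER_SV
    return slope * dose_sv

print(excess_risk(0.1, low_dose_rate=False))  # 0.01  - acute exposure
print(excess_risk(0.1, low_dose_rate=True))   # 0.005 - same dose, delivered slowly
```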
So far, the French Academy of Sciences and the National Academy of Medicine remain the only national-level scientific bodies to have recommended abandoning the orthodoxy. In a 2005 joint report, they expressed ‘doubts on the validity of using LNT for evaluating the carcinogenic risk of low doses (<100 mSv) and even more for very low doses (<10 mSv) … Decision makers confronted with problems of radioactive waste or risk of contamination, should re-examine the methodology used for evaluation of risks associated with very low doses and with doses delivered at a very low dose rate.’
LNT believers did, however, seemingly catch a break in 2015 with the publication of INWORKS, a study of cancer mortality after low-dose exposure among more than 300,000 radiation workers in France, the US, and the UK. It was updated in 2023. INWORKS concluded that there was indeed evidence of a linear association between cumulative radiation dose and the risk of developing solid cancers. It also found that the risks of radiation-induced cancer mortality may be higher than previously reported. 
The study, however, contains a number of methodological quirks that render the headline findings suspect. In the INWORKS study, background radiation is subtracted from workers’ dosimeter readings, even though for most participants, background exposure far exceeds their occupational dose. This results in misleading comparisons. For example, a Rocky Flats worker exposed to five milligrays (roughly equivalent to the same number in millisieverts) per year of background radiation is treated as equivalent to a Hanford worker receiving only one milligray per year, despite large differences in total radiation exposure. 
INWORKS uses a control group of workers who received 0–5 milligrays to avoid the healthy worker effect noted above for the Johns Hopkins studies. However, this introduces a different bias: workers in this group often hold desk jobs and tend to have higher education, income, and healthier lifestyle habits than blue-collar workers. This explains some of the bizarre results elsewhere in the study. The next dose band up from the control group, which received a negligible 5–10 milligrays (that is, less than 0.2 milligrays per year), saw a six percent increase in cancer risk. This amounts to an 850 percent increase in cancer risk for every gray of radiation. Yet, from 10 to 300 milligrays, no further increase in cancer is observed. This indicates that the sharp jump is likely due to confounding socioeconomic factors, not radiation. 
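As a back-of-the-envelope check of that slope, the sketch below divides the quoted six percent excess risk by an assumed dose gap of roughly 7 milligrays between the control band and the 5–10 milligray band; the 7 mGy figure is an assumption for illustration, not a number taken from the study.

```python
# Back-of-the-envelope check of the implied slope discussed above. The six
# percent excess risk comes from the text; the ~7 mGy dose gap between the
# control band and the 5-10 mGy band is an assumption for illustration.

excess_risk_fraction = 0.06      # six percent increase quoted for the 5-10 mGy band
assumed_dose_gap_gy = 0.007      # ~7 mGy above the control band (assumption)

implied_slope_per_gray = excess_risk_fraction / assumed_dose_gap_gy
print(f"Implied excess cancer risk per gray: {implied_slope_per_gray:.0%}")
# ~857%, in line with the roughly 850 percent figure in the text
```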

The triumph of inertia 

Throwing out decades of orthodoxy on radiation safety would be controversial and result in considerable bureaucratic inconvenience. Meanwhile, LNT defenders have certain forces on their side.
For a start, it will always be possible to label evidence about low-dose radiation as highly uncertain. While this logic should cut both ways, in practice, it creates a huge bias in the incumbents’ favour.
Scientists who believe in the existence of a safe threshold have the unenviable task of essentially proving a negative, definitively showing that no effect exists below a certain dose. Meanwhile, LNT advocates have a simple model that can always be defended using the precautionary principle. 
These practical challenges make LNT borderline unfalsifiable. Whether because of statistical limitations, the challenge of controlling for other factors, or difficulties in follow-up, it will always be possible to find a reason to dismiss any single study that contradicts it.
While LNT is very conservative, incumbents are reluctant to challenge it. The clear regulatory line in the sand allows nuclear operators and developers to constrain tort judgments in the event that workers fall ill. Many incumbents are willing to pay the price of highly conservative exposure limits. In the UK, for example, EDF restricts worker radiation exposure to 10 millisieverts a year, which is half the statutory dose limit, and public exposure to 0.3 millisieverts, a fraction of the already negligible one millisievert limit. 
Even the Trump Administration’s May 2025 Executive Order does not go beyond asking the NRC to ‘reconsider’ LNT and ALARA, describing LNT as ‘flawed’. As Jack Devanney, one of the most prolific and prominent critics of LNT and ALARA today, has pointed out, the NRC has already been asked to ‘reconsider’ LNT three times, most recently in 2019. ‘The NRC,’ he says, ‘pondered the issue for three years before proclaiming to no one’s surprise that it was sticking with LNT.’ The Administration would be on safe ground legally if it took a more assertive stance: Congress never mandated LNT or ALARA – the NRC adopted them itself.
Meanwhile, the costs of ALARA only continue to stack up. The banana-scale filter requirement at the Welsh plant is just one example. Tens of billions of dollars are added to the lifetime costs of nuclear projects to bury waste in deep geological repositories, facilities 200 to 1,000 metres below the surface of the earth, on safety grounds. There have been no fatalities in the history of nuclear waste management anywhere in the world. 
In nuclear facilities, some regulators will expect operators to prepare for double-ended guillotine breaks in piping. This is the assumption that a pipe could completely sever, causing the broken ends to ‘whip’ with intense force, causing significant damage to all of the equipment around it. This is, in fact, an unrealistic assumption. Decades of operating experience and research indicate that pipes typically develop detectable leaks long before catastrophic failure. As a result, operators have to install extensive restraint systems that add to the maintenance burden. The US has started to ease these restraint requirements, but the UK has not.
While the nuclear industry can shift national regulators on individual requirements by attrition, this seems like a bad way of incentivising long-term investment in critical infrastructure. It seems highly unlikely that if we were starting out from scratch, we would end up with a radiation safety regime built on LNT. As the urgency of the energy transition is brought into sharp relief, governments are responding with one hand tied behind their back – even an Administration not otherwise known for its reticence. 
bogorad · 7 hours ago · Barcelona, Catalonia, Spain:
Summary: Executive Order: President Trump signed executive orders to speed up nuclear reactor approvals and reduce regulatory burdens, including a review of the 'Linear No Threshold' (LNT) hypothesis.
LNT Hypothesis: LNT posits a linear relationship between radiation dose and cancer risk with no safe level, underpinning global nuclear regulation.
Regulatory Costs: The ALARA principle, derived from LNT, leads to costly safety improvements with potentially negligible public health benefits, increasing nuclear energy costs.
Historical Context: The LNT model emerged from early 20th-century studies on radiation's effects on organisms, but has been challenged by later research.
Challenges to LNT: Recent research suggests that the human body can repair radiation damage, and studies find no evidence of increased cancer risk in populations with high natural or occupational radiation exposure.

Is menopause necessary? What science says about delaying, eliminating it. | Vox

Will my generation be the last to go through menopause?
Just a few years ago, that would’ve seemed like a bizarre question — I’ve always assumed that I and every other human being with ovaries would eventually experience what my grandmother called “the change of life.” But now, researchers are calling into question what once seemed like basic facts of human existence. “What if menopause happened later?” they are asking. “What if it never happened at all?”
In recent years, patients have gained access to a wider variety of medications to treat menopausal symptoms like hot flashes and vaginal dryness. But newer treatments, one already in clinical trials, go deeper: The goal is not just to treat the symptoms, but to actually slow down ovarian aging so that the hormonal changes associated with midlife happen later — or maybe even never. “For the first time in medical history, we have the ability to potentially delay or eliminate menopause,” Kutluk Oktay, a reproductive surgeon and an ovarian biologist at Yale University, said in a release last year.
I cover reproductive health, and my inbox has been filling up for months with news of research like this. As an elder millennial barreling toward the uncertainty of perimenopause (which some research suggests can start as early as one’s 30s), I’ve received these updates with interest, sure, but also with a fair amount of trepidation.
On the one hand, the loss of estrogen that comes with menopause is associated with a host of illnesses and conditions, from cardiovascular disease to osteoporosis. Delaying the menopausal transition even five years “would result in an enormous improvement in terms of women’s health and decreased mortality,” Zev Williams, director of the Columbia University Fertility Center, told me. “It’s a really exciting opportunity.”
On the other hand, the idea of getting rid of menopause can feel like yet another way of insisting that women remain young and fertile forever. At a time when JD Vance is talking dismissively about the “purpose of the postmenopausal female,” I’m unsettled by the prospect of treating women’s aging out of their childbearing years, in particular, as something that must be cured.
If the idea of stopping menopause is a fraught one, though, it’s also an opportunity to think about what we want from our later lives, and to consider what it would look like to balance the real medical concerns of midlife and beyond with the fact that women are flesh-and-blood human beings who, like everyone else, get old. As Ashton Applewhite, author of the book This Chair Rocks: A Manifesto Against Ageism, put it to me, “you can’t stop aging, or you’re dead.”
To understand menopause, it helps to understand a little bit about ovaries, the human reproductive organs that store and release eggs. Starting in puberty, these glands ramp up their production of estrogen, a hormone that leads to breast development and a host of other changes to the body. Throughout the reproductive years, the ovaries make estrogen and other hormones according to a monthly cycle to help prepare the body for potential pregnancy.
Starting around a person’s late 30s, however, estrogen production starts to drop off. By the mid-40s, people typically enter perimenopause, which means “around menopause.” This period is characterized by unpredictable ups and downs in estrogen, though on a general downward trend (Mary Jane Minkin, an OB-GYN who teaches at the Yale School of Medicine, likens the pattern to the stock market during the Great Recession). That hormonal decline can lead to symptoms like irregular periods, hot flashes, and night sweats.
A lot of the symptoms most commonly associated with menopause actually start in perimenopause, and they can range from annoying to devastating. Perimenopause has been getting a lot of media attention lately, along with more focus from brands who may want to sell you stuff to help you manage it. Problems like hot flashes and brain fog can cause women to miss work, resulting in $1.8 billion in lost work productivity in the US per year, according to one study. The loss of estrogen can cause vaginal dryness, which can cause discomfort during sex and, in some cases, constant pain.
At some point, the ovaries stop producing eggs, and menstruation stops entirely. This is menopause, and it’s diagnosed when someone has gone without a period for a full year. It happens at an average age of 51, though Black and Latina women reach menopause earlier than white and some Asian American women. Though some symptoms, like dryness, persist after menopause, others, like hot flashes, often resolve, Minkin told me.
Hormonal changes in the body around menopause are also linked with increased risk of cardiovascular disease, Stephanie Faubion, director of the Mayo Clinic’s Center for Women’s Health and medical director of the Menopause Society, told me. Blood pressure and cholesterol tend to rise during this time, as does insulin resistance, a condition that can lead to diabetes. Bone density also falls during and after the menopause transition, increasing the risk of osteoporosis.
Some experts believe that menopause is part of the reason women spend more time than men living with chronic diseases. “Women live longer than men,” gerontology professor Bérénice Benayoun told Vox last year, “but they usually do so in a much more frail state.”
Given all this, it’s reasonable that more experts are looking at menopause and wondering, what if we could just…not?
Hormone therapy — typically estrogen taken on its own or with progesterone, sometimes in the form of birth control pills — already exists to treat the symptoms of menopause. The treatment was stigmatized for decades after a 2002 study linked it to breast cancer and other ailments, but doctors now say the benefits often outweigh the risks. Taking estrogen can dramatically reduce hot flashes, and even reduce cardiovascular risk while patients are on the medication.
When you look at illnesses and conditions like dementia, heart disease, and stroke, “there’s a much, much lower rate in women compared to men, until ovarian function stops. Then they start to catch up,” Williams said. “If we have a way of extending the ovarian lifespan in a way that’s safe, then you’re allowing the ovary to provide all these incredible health benefits” for a longer period of time.
Estrogen therapy can replace some of the estrogen a person’s ovaries are no longer producing, but it doesn’t actually stop those organs’ decline. To do that, some researchers are looking at more involved procedures.
In one recent study, Oktay, the Yale biologist, and his team used a mathematical model to predict how a technique called ovarian tissue cryopreservation might work in healthy patients. The process, typically used in cancer patients undergoing treatment that could harm their fertility, involves removing a section of ovary, freezing it, and reimplanting it at a later date. This technique, if used in healthy patients under 40, “would result in a significant delay in menopause,” according to the study.
Oktay and his team have begun preserving ovarian tissue from healthy patients, with the goal of reintroducing it when the patients are close to menopause. Since the women are still young, the team will have to wait years for real-world results, Oktay told me. But the technique does work to restore ovarian function in cancer survivors, he said.
About 10 percent of women enter menopause at age 55 or older, Oktay told me, and they tend to have longer life expectancy and less risk of osteoporosis and diabetes than people who go through the transition earlier. “We’re saying, why not make everybody that lucky?”
Williams, the Columbia Fertility Center director, and his team are working on a different, less invasive option. They’re currently in the midst of their first human trial of rapamycin, an oral medication typically used as an immunosuppressant in higher doses. Rapamycin has been found to extend lifespan in some animal studies, suggesting to some that it might help humans live longer. Williams and his team have also found that the drug can extend ovarian function and fertility in mice.
The Columbia researchers are now monitoring 50 women between the ages of 35 and 45 who have taken either rapamycin or a placebo, asking them questions about their mood, memory, and sleep quality, as well as checking their ovarian function through blood work and ultrasounds. They don’t have results yet, but they’ve seen no serious side effects so far.
The goal isn’t just to extend fertility, though “as a fertility specialist, that’s obviously something that I think about all the time,” Williams said. It’s also about extending the benefits that ovaries provide to women’s health, potentially reducing their lifetime risk of chronic illness.
Research like Williams’s has generated significant excitement, as the idea of pushing back menopause starts to move into the mainstream. Toward the end of the Biden administration, Jill Biden launched a women’s health initiative dedicated to studying the idea.
The ovaries are “the only organ in humans that we just accept will fail one day,” Renee Wegrzyn, director of the agency in charge of the first lady’s initiative, told the New York Times last year. “It’s actually kind of wild that we all just accept that.”
Some are skeptical, though, that delaying or eliminating menopause would be an entirely good idea.
For one thing, the link between menopause and illness isn’t completely clear-cut. People who go through menopause later tend to have better health outcomes, “but is it chicken or egg?” Faubion asked. “Do their ovaries last longer because they’re otherwise in better general health than the other people that go through menopause early?”
Prolonging ovarian function — and thus increasing people’s lifetime exposure to estrogen — could also come with risks of its own, like increases in breast cancer or blood clots, which have been linked to the hormone, Faubion said (going through menopause after age 55 is associated with an increased risk of breast cancer). For some, the tradeoffs might be worth it, but it’s not necessarily true that later menopause would mean better health across the board.
With more invasive treatments, there are also other questions to consider. “What are the ethics of taking out a healthy organ from a healthy person” — a surgical procedure that could fail — “all in the name of ‘delaying menopause?’” Faubion asked.
Oktay, the biologist studying ovarian tissue cryopreservation, told me the procedure is a minimally invasive laparoscopic surgery, and can be performed at the same time as another abdominal surgery like a C-section. Many participants in his study have a family history of severe menopause complications or conditions that can worsen with menopause, he said, giving them a reason to want to delay the transition.
For Minkin, the Yale gynecologist, the experimental treatments are intriguing if they can extend fertility. But for dealing with the physical challenges of menopause, she’s not sure they’re necessary: “There are plenty of easy ways to give people hormones.”
Some people, including survivors of certain cancers, aren’t able to take hormones, and new treatments could be helpful for them. Meanwhile, some experts see delaying menopause as most beneficial for people who experience the transition early. About one percent of women go through menopause before age 40, and five percent before age 45. Cancer treatments or autoimmune conditions can cause early menopause, but sometimes, the cause is unknown. Since early menopause is associated with elevated health risks, a new way to treat it “would probably result in a net benefit for population health,” Nanette Santoro, a professor of obstetrics and gynecology at the University of Colorado School of Medicine, told Time.
Going through menopause early can be deeply upsetting to people, especially (though not exclusively) if they’re hoping to have children. But when it happens at the average time, this stage of life can come with social and emotional benefits, despite the physical challenges.
“It’s liberating,” Applewhite told me. “No more mood swings, no more worries about getting pregnant.”
“I don’t know any woman, including yourself, who wants to be bleeding every single month,” Denise Pines, creator of the menopause summit WisePause, told me.
Indeed, research has found that women often become happier as they age, especially after midlife — potentially because they’re less consumed with caring for children and other family members. Some anthropologists believe that female humans, unlike other animals, live beyond their reproductive years to help care for grandchildren (it’s called the “grandmother hypothesis”). But Minkin offered a more expansive view of this theory: In early human settlements, pregnant people couldn’t do heavy labor like moving rocks around. The grandmother was “somebody who moves the rocks,” she said. She also described postmenopausal women as “shooting saber-toothed tigers.”
Even the troublesome symptoms of menopause, such as hot flashes and night sweats, can be a useful “disruptor” in people’s lives, Pines said. “Where women have been so giving and outwardly focused, suddenly you have to focus on yourself.”
“That gives you a chance to reset everything else around you,” from relationships to career, Pines said. “It’s such a great time to really reimagine who we are.”
Applewhite welcomes the recent surge in awareness around menopause, and says hormone therapy to treat its symptoms can be helpful — “I’m not saying, keep your body pure and avoid the temptations of Western medicine.” But when it comes to putting off menopause or eliminating it entirely, she said it’s concerning “when inevitable transitions of aging are pathologized.” That’s especially true for women’s aging, which is doubly stigmatized in American culture. “Under patriarchy, a woman’s value is linked to her reproductive value,” Applewhite said. It’s why there’s so little research into the health of older women: “because we are no longer reproductively useful.”
As appealing as the idea of extending a person’s healthy lifespan is, I can’t quite get past the ovary of it all. I, too, have heard from post-menopausal people about the liberation they feel when they exit their reproductive years. I, too, have at times been frustrated by doctors’ focus on my reproductive capacity over other aspects of my health. I want to be healthy as I get older, but I also want to accept my aging (and for the people around me to accept it), rather than feeling constant pressure to stave it off.
Applewhite wants women of all ages to see “later life as a time of enormous power and liberation and possibility,” and I’d like to see it that way too, not as something to be avoided at all costs.
When I shared some of these concerns with Williams, he asked me if I’d feel the same trepidation around treatments that focused on other areas of the body. “You want to extend normal heart function, liver function,” he said. But “for some reason, if you say, we want to slow ovarian aging, that touches on a very different note.”
It’s a fair point, especially since a lot of the health outcomes he and others are trying to promote aren’t about fertility or attractiveness or any of the attributes our culture demands that women maintain in our quest to remain forever young — they’re about things like cardiovascular and mental health. I want those!
Williams argued that understanding ovarian aging might actually remove some of the negative messages around menopause and getting older more generally. He also studies recurrent miscarriage, which "has always had a tremendous amount of stigma associated with it." What has helped reduce that stigma, he said, is "when it goes away from this realm of myth and taboo and folklore, and we start to understand the process."
It’s worth noting that research into menopause, like so much work on reproductive health and indeed health in general, is imperiled under the Trump administration. When I tried to visit the website for Jill Biden’s menopause initiative, I found that it was gone. Renee Wegrzyn, the head of the initiative, was fired in February. In a time when a lot of medical research is simply disappearing, it’s hard to look askance at treatments that could improve people’s lives.
After talking to Williams and other experts, I’m not against the idea of a medication that could help people live longer without heart attacks or cognitive decline. But as I get older, I’m also keenly aware that what happens outside our bodies can affect our health as much as what happens inside them.
When I asked Pines what she'd like to see for people in perimenopause and menopause right now, she said she wants a future in which people in this stage of life "are not dismissed," when "we can talk about menopause the same way we talk about puberty." She'd also like to see workplaces support women experiencing perimenopause symptoms, including by offering insurance plans that cover treatment for those symptoms. And she wants OB-GYNs, internists, and other doctors to be specifically trained in perimenopause and menopause, training that is often lacking.
“When we have those kinds of things in place,” she said, our society “will start looking at aging and aging women differently.”
bogorad
7 hours ago
Summary: Research explores delaying or eliminating menopause: Studies are investigating methods to slow ovarian aging, potentially delaying or preventing the hormonal changes associated with midlife.
Potential benefits of delaying menopause: Extending ovarian function could improve women's health and decrease mortality by reducing the risk of conditions like cardiovascular disease and osteoporosis.
Ethical and societal considerations: Discussions arise regarding societal pressures on women to remain young and fertile, and the potential risks and benefits of medical interventions to delay menopause.
Existing and experimental treatments: Hormone therapy is available to manage menopausal symptoms, while research is ongoing with techniques like ovarian tissue cryopreservation and medications like rapamycin.
Broader perspective: Menopause, aging, and societal views: The article considers the need for societal changes to support women's health during and after menopause, including workplace support and improved medical training.

Electric buses don’t like the cold, study finds | Cornell Chronicle

bogorad
8 hours ago
Summary: Electric buses' energy consumption: The study found that batteries used 48% more energy in cold weather (-4 to 0 degrees Celsius) and 27% more across a broader temperature range (-12 to 10 degrees Celsius); a rough range calculation based on these figures appears after this summary.
Key factors for increased consumption: Increased energy use stems from battery self-heating and cabin heating, with regenerative braking also less efficient in cold weather.
Recommendations for improvement: Short-term strategies include indoor storage, charging warm batteries, and minimizing door-opening time.
Infrastructure adjustments: The research suggests optimizing bus schedules, considering infrastructure like charging stations, and training staff for future transit planning.
Collaboration and insights: The study highlights collaboration between Cornell researchers and TCAT, revealing insights into urban vs. rural route performance in cold weather.
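As a rough back-of-envelope sketch (not from the study itself), the reported 48% and 27% increases in energy use can be translated into approximate range loss. The baseline consumption and battery capacity below are assumed values chosen only for illustration.

# Rough illustration only: converts the reported cold-weather energy increases
# into approximate range loss. Baseline consumption and battery capacity are
# assumed values for the sake of the example, not figures from the study.
BASELINE_KWH_PER_KM = 1.2   # assumed mild-weather consumption of a transit bus
BATTERY_KWH = 440           # assumed usable battery capacity

for label, increase in [("-4 to 0 C", 0.48), ("-12 to 10 C", 0.27)]:
    baseline_range = BATTERY_KWH / BASELINE_KWH_PER_KM
    cold_range = BATTERY_KWH / (BASELINE_KWH_PER_KM * (1 + increase))
    loss_pct = 100 * (1 - cold_range / baseline_range)
    print(f"{label}: roughly {baseline_range:.0f} km to {cold_range:.0f} km "
          f"(about {loss_pct:.0f}% less range)")

The point is simply that range scales with the inverse of consumption, so a 48% rise in energy use works out to roughly a third less range, and a 27% rise to roughly a fifth less.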

Should Everyone Be Taking Ozempic? How GLP-1 Drugs Could Help With a Growing List of Conditions - WSJ

bogorad
8 hours ago
Summary: Ozempic and other GLP-1 drugs: Show promise for a wide range of conditions beyond weight loss and diabetes.
Potential benefits: These drugs are being investigated for their effects on heart, kidney, and liver diseases, sleep apnea, arthritis, Alzheimer's, and alcohol addiction.
Eligibility: Millions more could benefit, with a large portion of the adult population potentially eligible.
Considerations: Doctors are cautious about prescribing to those who don't medically qualify due to potential side effects like malnourishment.
Challenges: High costs, limited insurance coverage, and manufacturing capacity issues currently restrict wider use.

What does Trump’s call for ‘gold standard science’ really mean? | Science | AAAS

From lobster fishing bans to school closings during the COVID-19 pandemic, the misuse of science by federal agencies and individual researchers has fueled the public’s growing distrust of science. So says U.S. President Donald Trump in a new executive order designed to promote “gold standard science” through transparency, replication, and taking swift action to correct errors and punish misconduct.
But some research advocates, while embracing those principles, think Trump has credibility problems of his own that lead them to question his intent. They note that the new order says nothing about preventing political interference before disseminating scientific findings—which scientists say happened many times during Trump’s first term. “That’s quite an omission,” says Kris West of COGR, a consortium of higher education institutions that track federal research policies.
Instead, the order on Restoring Gold Standard Science gives a political appointee the power to decide when those findings need to be “corrected” and to take disciplinary action against those seen as the perpetrators of misinformation. “And putting that power in the hands of a political appointee who doesn’t need to consult with scientific experts before making a decision is very troubling,” West adds.
The 23 May executive order employs a phrase, “gold standard science,” that has become widely used by science officials in the second Trump administration. The directive asserts that “over the last 5 years, confidence that scientists act in the best interests of the public has fallen significantly.” It cites “a reproducibility crisis” across many disciplines, and laments “the falsification of data by leading researchers [that] has led to high-profile retractions of federally funded research.”
The order cites examples from former President Joe Biden's administration to make its point that the government has contributed to "this loss of trust." In one case, the Centers for Disease Control and Prevention issued guidance on school closures that, the order says, "discouraged in-person learning … resulting in substantial negative effects on educational outcomes." The order also cites a federal court telling the National Marine Fisheries Service that it erred in using a "worst-case scenario" to estimate the number of right whales, an endangered species, to support a ban on lobster fishing.
To reverse that declining trust, the order prescribes a course of treatment that would be familiar to most advocates of greater openness in science. It defines good science as research that is “reproducible; transparent; communicative of error and uncertainty; collaborative and interdisciplinary; skeptical of its findings and assumptions; structured for falsifiability of hypotheses; subject to unbiased peer review; accepting of negative results as positive outcomes; and without conflicts of interest.” The order reiterates the current federal categories of research misconduct—plagiarism, fabrication, and falsification of data—and continues to exclude “honest error.”

But the seemingly uncontroversial language could be a smokescreen, warns Brian Nosek, a University of Virginia psychologist and co-founder of the nonprofit Center for Open Science. “This reminds me of earlier efforts that targeted scientific findings used by the EPA [Environmental Protection Agency] for rule-making that risked turning principles of good practice into weapons,” Nosek posted on Bluesky. “That is, scientific information could be disregarded entirely unless it meets an extremely high standard—a gold standard!”
Gretchen Goldman, president of the Union of Concerned Scientists, worries the new order will accelerate previous efforts by Republicans in Congress and Trump "to discount the underlying data from studies that present results at odds with the administration's political agenda." She told ScienceInsider that the order would undermine policies adopted by agencies under Biden to ensure that scientific data underlying regulatory decisions are not subject to the whims of politicians.
“Those policies were pretty decent,” she says. “I’m afraid the new order is a warning shot that signals the direction the administration is headed, and it’s not good.”
Stand Up for Science, a nonprofit that has organized demonstrations against Trump science policies, has written an open letter protesting the order; since Friday, the letter has attracted more than 3800 signatures. The order represents "an escalation of the ongoing assault on science," the letter states.
bogorad
8 hours ago
Summary: Executive Order: A new executive order by Donald Trump aims to promote "gold standard science" through transparency, replication, error correction, and punishment for misconduct.
Criticism: Research advocates express concerns about Trump's credibility and the potential for political interference in science, as the order doesn't address preventing it.
Key Provisions: The order grants a political appointee the power to correct scientific findings and enforce disciplinary actions, raising concerns among experts.
Examples: The order references instances where scientific guidance and data were allegedly misused, such as school closure guidance and a lobster fishing ban.
Concerns: Critics fear the order could be a pretext to dismiss data conflicting with the administration's agenda, potentially reversing progress made by the Biden administration in ensuring the integrity of scientific data in regulatory decisions.