
Texas redistricting court ruling could cost Republicans 5 House seats | Fox News

  • Gerrymandering history: Practice dates to founding fathers, named after Elbridge Gerry, upheld by Supreme Court in 2019 for partisan but not racial reasons.
  • Texas redistricting action: Legislature passed new congressional map earlier this year, signed into law by Governor Greg Abbott.
  • Court invalidation: Three-judge panel ruled against Texas map, citing minority vote dilution, ordering 2021 map for midterms, potentially costing Republicans seats.
  • Appeal and stay request: Texas officials appealed to Supreme Court, urging emergency docket stay due to extraordinary circumstances and Voting Rights Act misapplication.
  • Voting Rights Act critique: Current interpretations mandate majority-minority districts, seen as racial discrimination; justices considering limits in Louisiana v. Callais.
  • Purcell principle application: Supreme Court precedent advises against election interference near voting; similar stays granted in past cases like Merrill v. Milligan.
  • Election timeline urgency: Primaries in March, filing deadline December 8, absentee voting soon, impacting candidates and military voters.
  • Judicial and nomination concerns: Ruling by Judge Jeffrey Brown, a Ted Cruz nominee, criticized; calls for high-quality district judge selections to avoid such issues.


Redistricting battle across the nation

Texas and California take up gerrymandering fights.


Gerrymandering has been a staple of the Republic since its beginning. The practice has such a storied tradition that it is named after Elbridge Gerry, one of our founding fathers who served as vice president under President James Madison. For decades, leftists attempted to outlaw partisan gerrymandering. Justice Anthony Kennedy could not make up his mind on the issue, so it languished until he retired. Fortunately for the Constitution, President Trump replaced Justice Kennedy — the Court’s swing vote for over a dozen years — with solid constitutionalist Justice Brett Kavanaugh. In 2019, thanks to Kavanaugh’s addition, the Court upheld partisan gerrymandering in Rucho v. Common Cause. Legislatures cannot gerrymander based on race, but they can do so based on partisanship.

Texas Capitol in Austin and President Donald Trump

Following Texas Democratic lawmakers’ return on Monday, President Donald Trump urged the state legislature to move quickly to pass a highly controversial redistricting bill, saying, "Please pass this Map, ASAP." (Sergio Flores/Getty; Mark Schiefelbein/AP)

Earlier this year, Texas did just that. Yet, a three-judge district court panel invalidated Texas’s map earlier this week and ordered that the map drawn by the legislature in 2021 remain in effect for the midterm elections. This ruling could cost Republicans five seats in the U.S. House of Representatives. The ruling claims that minority votes would be diluted were the new map to go into effect. It does not matter, according to the ruling, that the legislators who voted to redistrict never advocated in favor of discrimination on the basis of race. Texas Gov. Greg Abbott and Attorney General Ken Paxton have immediately appealed to the Supreme Court. The 2-1 ruling was shockingly written by Texas U.S. District Judge Jeffrey Brown — handpicked by U.S. Senator Ted Cruz in 2019. U.S. Fifth Circuit Judge Jerry Smith, the adult on the three-judge panel, dissented.

It is imperative for the justices to stay the ruling using the emergency docket, a vehicle that permits the Court to pause rulings without full briefing and oral argument when extraordinary circumstances necessitate it. Here, the problem lies in the way courts have wrongly applied the Voting Rights Act of 1965 for decades. The current system allows DEI districts; that is, current law mandates majority-minority districts, explicitly requiring racial discrimination in redistricting. The justices are considering the proper interpretation of this statute in Louisiana v. Callais. Even if there might have been a time when such a scheme was permissible at the height of segregation when the Voting Rights Act was passed, that period has long since lapsed. Kavanaugh focused on this point during oral argument in Callais. It would, of course, be ideal for the court to hold that the scheme was never permissible, as Justices Clarence Thomas, Samuel Alito and Neil Gorsuch suggested during oral argument. Either way, the justices should release the Callais decision in short order so that legislatures can respond accordingly in time for the midterms.

Texas Gov. Greg Abbott in front of microphone

Texas Gov. Greg Abbott is seen on Nov. 14, 2025 in Midlothian, Texas. (Ron Jenkins/Getty Images)

It is almost certain that the justices have decided Callais and are crafting the majority, concurring and dissenting opinions. That decision undoubtedly will impact the way the Texas case is decided. If the justices know that they will be curtailing the Voting Rights Act to end mandatory majority-minority districts, they should not allow the ruling of the lower court to stand. In other words, Texas should not be forced to have districts in place that, if the court rules according to the Constitution, are unlawful.


There is a separate reason why the justices should stay the ruling of the lower court. In Purcell v. Gonzalez (2006), the court held that federal courts ordinarily should not interfere in elections when the elections are about to occur. Based on this principle, the justices stayed a similar ruling to the Texas one in 2022. In Merrill v. Milligan, the court dealt with a district court ruling by a three-judge panel that had enjoined the implementation of Alabama’s new congressional map. The panel had issued its ruling about two months prior to the beginning of absentee voting in the Alabama primaries. In concurring in the grant of the stay of that ruling, Kavanaugh emphasized the closeness of the election, citing the decision in Purcell.

The Texas case presents a similar time crunch. The primaries will occur in March, and absentee voting will begin weeks before that. Military personnel overseas need extra time to send in their votes, as many are stationed in remote locations thousands of miles away, and others are in the middle of the ocean on ships or submarines. The filing deadline for the primaries is Dec. 8, only three weeks away. The ruling by the lower court has wreaked havoc; candidates had been planning their runs based on the newly drawn districts. The justices must restore order to this chaotic mess that the lower court has caused.


One more important point requires emphasis. Again, one of the two judges who invalidated the map is Brown. He invalidated a map on similar grounds in 2024 concerning Galveston County. The Fifth Circuit, with all 17 judges sitting (formally called en banc), reversed his decision in Petteway v. Galveston County. Brown, again, is a Ted Cruz pick. Presidents must pick district judges based on the recommendations of home-state senators. These senators wield veto power through an outdated tradition called the blue slip. Picking quality district judges is nearly impossible in blue states, where leftist senators will veto excellent candidates. Senators in red states, particularly deep-red states like Texas, must ensure that candidates of the highest quality are recommended for nomination. Brown was a clear miss by Ted Cruz.

Supreme Court justices

Members of the Supreme Court sit for a group photo following the recent addition of Associate Justice Ketanji Brown Jackson, at the Supreme Court building on Capitol Hill on Friday, Oct 7, 2022 in Washington, D.C. Bottom row, from left, Associate Justice Sonia Sotomayor, Associate Justice Clarence Thomas, Chief Justice of the United States John Roberts, Associate Justice Samuel Alito, and Associate Justice Elena Kagan. Top row, from left, Associate Justice Amy Coney Barrett, Associate Justice Neil Gorsuch, Associate Justice Brett Kavanaugh, and Associate Justice Ketanji Brown Jackson. (Jabin Botsford/The Washington Post via Getty Images)

In light of the upcoming filing deadline, time is incredibly short. The Supreme Court must rapidly stay this incorrect order and restore the lawful map passed and signed into law earlier this year. The justices also must rule against continuing the practice of DEI districts by restoring sanity to voting rights jurisprudence. This decision also needs to occur quickly so that legislatures across the country can have time to redistrict prior to the midterms. There is no place for racial discrimination in elections, and there is no place for improper judicial interference in elections. The Supreme Court needs to put a stop to all of it now.


Mike Davis is the founder of the Article III Project.


My Gemini 3 Review — matt shumer

  • Creative Writing Excellence: Gemini 3 produces coherent, natural book chapters with surprising phrasing, avoiding typical AI slop for genuinely good writing.
  • Quality Consistency: Improvements reduce variability in output, making the model more reliable across tasks compared to previous spiky performances.
  • Practical Impact: Differences may not show in routine tasks but emerge in complex reasoning and creative edge cases where other models struggle.
  • Speed Efficiency: Delivers high intelligence quickly, outperforming GPT-5 Pro in regular mode without extended wait times.
  • Direct Personality: Responds tersely and to the point, respecting user time by avoiding unnecessary verbosity or praise.
  • Adaptable Styling: Follows specific prompt instructions for personas without reverting to default styles.
  • Antigravity IDE Functionality: Provides a real development environment with useful browser integration, but requires user oversight for verification and error handling.
  • Overall Reliability: Acts like a senior engineer delivering brilliant results when correct, positioned as a stable daily tool due to Google's infrastructure.

Everyone else is going to obsess over benchmark numbers. They're going to do that because the numbers are, frankly, insane... truly wild improvements across the board. But I'm not going to do that here. I've been living with Gemini 3 for the past few days; actually working with it, building things, writing things, seeing what it feels like in practice. The benchmarks might tell you what it can do (and it can do a lot); I want to tell you how it feels to use.

The Model

Let's start with the creative writing, because that's where Gemini 3 first floored me. GPT-5.1, which dropped last week, was already a noticeable jump from previous frontier models. But Gemini 3? It wrote book chapters I had to double-check weren't plagiarized from a real book. The voice was coherent. The pacing natural, the turns of phrase genuinely surprising. But most importantly, it didn't feel like the "AI slop" writing we all know just a little too well. It's really impressive... Gemini 3 doesn't just put out "good for AI" writing, it puts out genuinely good writing.

The improvement feels fundamental. Previous models had a certain spikiness... their quality varied wildly depending on the task. You could get brilliance on one task, followed by just-okay results on another. Gemini 3 is more consistent, less prone to those jarring spikes. My hunch is that Google has cracked something about reinforcement learning on non-verifiable tasks... creative work where you can't just check if the answer is right. The result is a model that feels more like a skilled collaborator than anything we've had before.

That said, here's an important note: for 80% of your daily work, you might not even notice the difference. Current models are already "good enough" for writing emails or making small changes to your webapp. So at first glance, Gemini 3 doesn't always feel like a massive leap. But that feeling is deceptive. The jump is there; it's just hiding in the difficult 20%... the complex reasoning, the subtle creative choices, the edge cases where other models fall apart. When you really need that extra brainpower, it's there.

Another standout feature: it's fast for how smart it is. To understand this, we can think of a metric like "intelligence per second", and Gemini 3 is fantastic in this regard.

I probably shouldn't compare it to GPT-5 Pro directly, since Gemini's Deep Think (the roughly equivalent mode) wasn't available for early testing. But, impressively, the regular version of Gemini 3 often outperformed GPT-5 Pro, and did so without a 5-10 minute wait. You get both the quality and the speed, which changes how you work.


Personality-wise, it's a shift. Out of the box, Gemini 3 is less... ingratiating than most other models. It doesn't open with flowery praise, followed by three paragraphs of preamble. It's more terse. Direct. It gives you the answer and (mostly) stops. I prefer this. I don't need an AI to give me every little detail (if I need this, I'll ask); I need it to get to the point. With GPT-5.1, for example, I find myself scrolling through verbose explanations hunting for the actual content. Gemini 3 respects your time.

Other models have default "personas" and styles (UI, writing, etc.) that are very hard to escape; Gemini 3 just... listens and does what you ask. For example, if you prompt it to "write this like a cynical 1940s detective, but make it modern", it nails the specifics without fighting you and reverting back to the slop-styles we all know and hate.

The Antigravity IDE: Great, But Keep Your Eyes on It

The Antigravity IDE is impressive for a launch product. It feels like a real development environment, not a demo. The browser integration for testing sites is genuinely useful... it'll spin up a server, check if it achieved the goal it was working on, iterate without context-switching or human input. It's great.

But here's the thing: you have to babysit it. The model will sometimes glance at a log, declare victory, and move on while your build is still throwing errors. It'll screenshot a UI, say "looks good," and miss that the site wasn't even running in the first place. You need to keep the terminal open, re-run checks, and explicitly tell it to verify its work. Custom instructions help... "Keep reading the logs as you spin things up until you know it works" is a good one to add. For developers who stay engaged, it's powerful. For those wanting a magic button, it'll frustrate. That said, these are likely temporary issues that'll be patched via prompt updates on Google's side over time.

The Tradeoff

If GPT-5.1 is a solid junior engineer, Gemini 3 is a senior engineer who says "got it, done," and you'd better check that it's actually done. I keep reaching for it, not because it's perfect, but because when it's right, it's brilliantly, almost humanly right.

This is, without a doubt, my new daily driver. And with Google's computing power and ability to serve this cheaply and stably, I'd bet this is going to be a winner.



Warner settles lawsuit and agrees licensing deal with AI music platform

  • Warner Music Deal: Warner Music has agreed to a licensing deal with AI start-up Udio to enable a new streaming platform using its songs.
  • Lawsuit Settlement: The agreement includes settling a lawsuit Warner had against Udio for using copyrighted recordings in AI training.
  • Subscription Service Launch: Udio plans to introduce a subscription service next year for fans to create songs with licensed tracks.
  • Artist Consent Required: Warner's artists must agree for their music to be featured in the service.
  • Upcoming Announcement: An official announcement of the deal could occur as early as Wednesday.
  • Previous Lawsuits: Major labels including Warner, Universal, and Sony sued Udio last year over copyright issues in AI models.
  • Universal's Involvement: Universal Music recently reached a similar deal with Udio to include its catalogue in the service.
  • Additional Agreements: Warner announced a deal with Stability AI and anticipates more licensing pacts soon, aiming to address AI disruptions like past Napster challenges.

Warner Music has struck a licensing deal with artificial intelligence start-up Udio to power a new streaming platform with its songs, according to people familiar with the matter, as major labels seek to set terms for payment in the AI era.

Warner, the world’s third-largest music company and home to acts including Charli XCX, Madonna and Ed Sheeran, has settled a lawsuit with Udio as part of the agreement, according to people familiar with the matter. 

As part of the deal, Udio plans to launch a new subscription service next year, allowing fans to create their own songs using licensed tracks. Warner’s artists would need to agree for their music to be included in the service, these people said. 

An announcement could come as early as Wednesday, said people familiar with the matter.

Warner Music, along with rivals Universal and Sony, last year sued Udio, alleging the company was illegally using copyrighted recordings to train its AI models.

Universal Music last month struck a deal with Udio to include its catalogue in the upcoming subscription service. 

Warner also announced a licensing deal with Stability AI, an AI music tools specialist, on Wednesday. The label is close to unveiling more agreements in the coming days, said people familiar with the talks. 

After the Napster crisis of the early 2000s, music companies are trying to get ahead of disruptive technology this time around. The labels have spent much of this year in negotiations with AI groups to hash out the terms for licensed products to create songs using their music copyrights — and ensure they are properly compensated.

However, many artists remain staunchly opposed to AI-generated music, fearing it could undermine the value of their work. 

Paul McCartney, Kate Bush, Annie Lennox and others have released a “silent” album to protest against the UK government’s recent changes to copyright law. The album’s track listing spells out the message: “The British government must not legalise music theft to benefit AI companies.” 

Elliot Grainge, chief executive of Warner’s Atlantic Records, told the Financial Times in September: “Labels have a responsibility to negotiate the best deals for their artists — and they’re really good at that. They learned from their mistakes in the past.” 

Warner Music declined to comment and Udio did not immediately respond to a request for comment.


Tucker Carlson Goes Full Truther

  • Conspiracy Theory Techniques: Theorists select supporting facts while overlooking contradictory evidence, shifting quickly between claims to avoid scrutiny.
  • Tucker Carlson's Series: _The 9/11 Files_ presents unsubstantiated insinuations about 9/11 events, focusing on questions without concrete proof.
  • Carlson's Ideological Shift: Previously criticized 9/11 Truthers, now promotes similar views aligning with far-right narratives.
  • Absence of Evidence: Series avoids interviewing witnesses or presenting physical evidence, relying on vague allegations of prior knowledge.
  • Popular Mechanics Investigations: 2005 report and subsequent books examined and refuted major 9/11 conspiracy claims, facing backlash from proponents.
  • Pentagon and Flight 93 Claims: Assertions of hidden footage and lack of wreckage contradicted by photos, debris, and impact details.
  • WTC Collapses and Thermite: Theories of explosives or thermite dismissed due to impracticality and lack of installation evidence; NIST report explains WTC 7 failure from fire.
  • Foreknowledge Allegations: Claims of media errors, stock trades, and Israeli involvement lack supporting investigations, promoting LIHOP without detailing logistics.

Effective conspiracy theorists need to be quick on their feet. To tell a persuasive story, they must focus our attention on the tiny number of facts that seem to support their theory, while ignoring the vast amount of evidence that contradicts it. An agile theorist therefore jumps from point to point like a hiker crossing a stream by leaping from rock to rock. The trick is to get listeners to forget about the river of facts that refute the conspiracy claims. Still, even the seemingly solid points supporting most conspiracy theories generally collapse under honest scrutiny. When that happens, the theorists rarely concede that their elaborate assumptions have been debunked. They simply jump to new, even shakier pieces of “evidence.”

Tucker Carlson uses this device, and many more, in his slickly deceptive new video series, The 9/11 Files. Carlson is late to the 9/11 conspiracy party. In fact, in the past he employed his considerable rhetorical skills arguing against the so-called 9/11 Truth Movement, once calling its adherents “parasites.” But the former Fox News anchor has made quite an ideological journey in recent years. Today, he embraces the Truther worldview that was originally a hallmark of the anti-American Left. In recent years, such dark conspiratorial fantasies—including anti-Semitic tropes—have found new life on the very-online far Right.

In The 9/11 Files, released on the Tucker Carlson Network (and on YouTube), Carlson promises to prove that "what you have been told about 9/11 is not true." Instead, the five-part series mostly rehashes familiar claims and unproven insinuations, albeit in a highly polished fashion. Reviving 9/11 conspiracy theories at this late date gives Carlson a chance to flesh out his increasingly blame-America-first outlook—while maintaining his "just-asking-questions" pose of deniability—and build bonds with the so-called Woke Right.

Producing this series two decades after the first spasm of 9/11 conspiracy mania also allows Carlson to sidestep some of the sillier assertions made by first-wave Truthers. His series focuses mostly on vague claims that he makes no attempt to substantiate. He interviews no firsthand witnesses who say they played a role in the alleged plot, nor does he uncover any tangible physical evidence. Instead, he raises leading questions (“What were they hiding?”) and makes broad allegations (“Foreign intel agencies, including those of allies, likely had detailed prior knowledge”). Then, before viewers have time to notice that he offers no real corroboration for these indictments, he leaps to the next rock, raising a new set of provocative, equally unsupported claims.

I’ve watched 9/11 theories evolve since the movement’s early days. In 2004, as editor of Popular Mechanics, I became curious about the growing popularity of such conspiracy claims and asked my team to investigate. Then, as now, most of those theories posited a shadowy alliance between Israel and the George W. Bush administration, both of which supposedly wanted a pretext to start wars in the Middle East. Beneath the geopolitical theorizing, however, all these theories rest on specific factual claims. For example, some theorists assert that a military missile, and not American Airlines Flight 77, struck the Pentagon; that the commercial jets that struck the World Trade Center were instead military tankers or drones; that the buildings themselves were prewired with demolition explosives; that the crash site near Shanksville, Pennsylvania, didn’t really contain the wreckage of United Flight 93—and so on.

Popular Mechanics put together a team of eight reporters to investigate these and similar claims and published its first in-depth report in February 2005. The report concluded that every major piece of evidence commonly cited by theorists was either incorrect, misinterpreted, or fabricated from whole cloth. The following year, we released a book-length version of our reporting, Debunking 9/11 Myths, which we updated and expanded for a second edition in 2011. September 11 Truthers often insist that they are only asking questions. For the first decade after the attacks, Popular Mechanics was the only major journalistic outlet attempting to answer those questions in good faith. The Truther community responded with predictable hyperbole: they accused us of working in the service of the Bush/Cheney administration, the CIA, Mossad, the Illuminati, or other supposed conspirators. Since we were part of the plot, the conspiracy fans concluded, our findings could be dismissed as propaganda. (I recounted this experience in a 2021 City Journal article.)

Still, the Popular Mechanics reporting had an impact. A few other journalists joined in, as did some capable amateurs. For example, the blog Screw Loose Change amusingly dismantled the many absurd claims made in the popular “Loose Change” series of 9/11-conspiracy videos. Gradually, the Truther movement became a target for parody on TV shows like South Park and It’s Always Sunny in Philadelphia. As criticisms and ridicule mounted, many 9/11 conspiracy theorists retreated from their original bold allegations and migrated toward more indirect theories, typically ones that rely less on verifiable—i.e., disprovable—claims and more on broad assertions of America’s role as all-powerful global villain. In other words, as one factual claim after another was proved false, the Truther community quietly dropped or downplayed them. But its members never wavered in their conviction that the U.S. government was responsible for the attacks.

The 9/11 Files fits this pattern. “The official story is a lie,” Carlson says. And yet he artfully avoids any concrete description of the vast conspiracy he alleges. And, while he raises several hoary Truther claims—about missing jets and World Trade Center bombs, for instance—Carlson makes little effort to support those claims with evidence. He simply invokes the penumbra of those debunked theories to lull viewers into believing that our government was—somehow—complicit in the attacks.

Several months after 9/11, far-left French writer Thierry Meyssan published a book claiming that the Pentagon was hit by a military missile, an attack that "could only be committed by United States military personnel against other U.S. military personnel." Today, fewer Truthers stand by that theory. There are simply too many photographs showing debris from an American Airlines Boeing 757 around the Pentagon. Carlson avoids claiming that a missile hit the building, but he insists that U.S. officials must be hiding something. "Why did it take the government five years to release footage of the Pentagon [attack]?" he asks. Carlson doesn't answer the question. (The video footage was temporarily withheld because it was part of the evidence in the prosecution of al-Qaida conspirator Zacarias Moussaoui.) Nor does he mention the vast amount of physical evidence—airplane parts, luggage, human DNA—collected inside the Pentagon itself.

Carlson employs a similar sleight-of-hand in discussing Flight 93. "Why wasn't there substantial wreckage of Flight 93 at its supposed crash site in Shanksville, Pennsylvania?" he asks. Again, he sidesteps the answer: the aircraft hit the ground going 560 miles per hour, creating a 30-foot-deep trench which contained most (but not all) of the wreckage. _Popular Mechanics_ interviewed the county coroner who had the grim task of identifying the bodies. "We were told the crash of that aircraft was so powerful that it vaporized the aircraft's hull," Carlson goes on. "And yet the hijackers' passports were found intact at the site. How does that work?"

Here, Carlson employs a classic conspiracy theory technique: when evidence confirming the conventional view of an event is scanty, conspiracists will claim that absence is proof of a conspiracy. But when evidence supporting the mainstream account is found, they will argue its presence also confirms the conspiracy. It’s just too convenient, they say. The evidence must have been planted. In reality, the airplane was thoroughly shattered (not “vaporized”) on impact, but some aircraft components and many of the passengers’ personal effects were found somewhat intact.

Carlson also takes an indirect approach to the longstanding claim that the World Trade Center buildings were professionally demolished. Originally, 9/11 conspiracy theorists argued that the Twin Towers must have been wired top to bottom with explosives. But that scenario is hard to defend. Rigging such a demolition job would have taken months and been visible to thousands of office workers. Instead, many Truthers moved on to a more recondite theory: that the World Trade Center buildings (including the Twin Towers and/or the nearby World Trade Center Building 7) were felled by packets of thermite powder. Thermite, a highly reactive mix of aluminum and iron oxide, can heat up enough to melt steel in certain applications, such as welding. However, demolition experts say the compound would be a wildly impractical tool to use in the demolition of large buildings.

Nonetheless, the thermite theory took off in 2009 when a physicist published a paper in a fringe science journal. It claimed to show the presence of "nano-thermite" in dust samples collected in lower Manhattan. Note that the material in question consisted of a few flakes of aluminum and a bit of ordinary rust—hardly a surprising discovery after the world's biggest building collapse. The thermite theory allows Truthers to sidestep the embarrassing lack of evidence for a conventional controlled demolition. After all, since no one really knows how this imaginary thermite demolition process would work, it's harder to prove that it didn't happen. (Of course, in a rational debate, Truthers would still have to explain how the 9/11 plotters could install thousands of pounds of thermite without anyone noticing.)

Carlson cites the 2009 paper and suggests that "thermitic material" might explain the collapse of World Trade Center Building 7, the final building to fall on 9/11. But he doesn't linger on the thermite question for long. In fact, in a rare nod to a counterargument, he mentions an engineering report that explains why thermite "would be an unlikely substance for achieving a controlled demolition." Then, just like that, he jumps to a different explanation. Maybe it was some other type of explosive. The documentary then cites a single eyewitness who believed he heard an explosion. (True to form, it doesn't mention the hundreds of other witnesses who heard nothing.) As Carlson leaps from rock to rock, it's easy to forget that he hasn't provided evidence for any type of intentional demolition. But he has nudged the viewer from wondering whether home-grown conspirators engineered the collapse of WTC 7 to asking which demolition method was employed.

Carlson devotes much of an episode to the WTC 7 collapse, a focus that reflects another Truther climbdown. After all, claims that the Twin Towers were professionally demolished face lots of reasonable pushback, even with the thermite variation. But the collapse of WTC 7, which was “never hit by a plane,” as Truthers constantly remind us, was a legitimate mystery—at least initially. Damaged by falling debris from the North Tower, the 47-story building burned ferociously for nearly seven hours. Investigators assumed those fires weakened the structure to the point of failure, but it took them years to establish the exact mechanism that led to its collapse. Conspiracy theorists happily flooded that zone of uncertainty. In their view, this less well-known building—which housed various government offices, among other tenants—would have been easier to demolish surreptitiously. The government must have wanted to destroy records housed in a CIA office in the building, they suggested. (With typical disdain for Occam’s razor, they don’t explain why rigging a huge building for demolition would have been easier than simply removing the files.) Since not much was known about the collapse, it was harder to rebut the conspiracy claims with facts.

This gap in knowledge has made WTC 7 a magnet for conspiracy buffs who want to appear level-headed: one can claim to be agnostic about the wilder 9/11 theories while insisting that something suspicious happened to Building 7. For example, Rosie O’Donnell once told her cohosts on The View that it would have been “physically impossible” for Building 7 to have collapsed from fire alone. “I do believe that it’s the first time in history that fire has ever melted steel,” she famously said. “I don’t know,” Robert F. Kennedy Jr. told journalist Peter Bergen in 2023, whether al-Qaida was responsible for 9/11. After all, “There were some strange things that happened” involving Building 7. Carlson follows this line of thinking, asking, “Why did Building 7 collapse after just seven hours of burning in a way that no steel-frame building anywhere in the world has ever collapsed?”

Today, thanks to a detailed analysis by the National Institute of Standards and Technology, we know fairly precisely how WTC 7 fell. But Carlson doesn’t buy the NIST analysis. Instead, he shows a video that he insists shows “the building coming down symmetrically all at once at free-fall acceleration.” The NIST report explains why it appears that way: WTC 7 had an unusual design that put particularly heavy loads on three vertical columns. Thermal expansion caused by the fires made one of those columns fail between floors 5 and 14; the rest of the building’s internal support structure on those lower floors soon collapsed as well. Now unsupported, the tower’s upper floors—the ones visible in the video—then fell as a unit. It was an extraordinary event, but hardly inexplicable.

But Carlson isn’t interested in engineering nuances. Having concluded that the “official” explanation of the collapse must be a coverup, he leaps again to the next rock. On the afternoon of September 11, a BBC reporter mentioned on camera that WTC 7 had fallen half an hour before it actually collapsed. “Did the BBC have advance word that the building was coming down?” Carlson asks. Truthers are obsessed with finding cases in which someone seems to have had prior knowledge of the attacks. After all, if one could prove that some group outside al-Qaida knew the attacks were coming, it would seem to be a priori confirmation of a conspiracy.

But the BBC? Does Carlson really think that an ultra-clandestine group not only secretly wired Building 7 for collapse but also made sure to notify the news media? In fact, the BBC reporter was simply repeating an incorrect report in the midst of a horrifying, chaotic day. But Carlson doesn’t give his audience a chance to reflect on the laughable idea that the BBC was part of the plot. Once again, he’s off to the next rock: “If the media did have foreknowledge of the events that day, they weren’t alone,” he says. Then Carlson trots out some repeatedly debunked Truther chestnuts: prior to the attacks, some investors bet that stocks in United and American Airlines would go down. (A massive Securities and Exchange Commission investigation found no evidence that anyone traded on the basis of prior knowledge.) And what about the five Israelis spotted watching the towers burn who seemed to be “celebrating the event,” Carlson asks. (The immigrants, who were reported to authorities by a jittery neighbor, were hapless laborers who heard about the attacks and, like thousands of others, went to a vantage point to observe the historic event.)

According to The 9/11 Files, lots of people knew the attacks were coming. The series quotes former CIA operative John Kiriakou asking, “Why didn’t Germany warn us? What about the Israelis?” Now Carlson is getting to his sweet spot: “The Israeli government stands out in particular,” he says. Several weeks before hosting professional Jew-hater Nick Fuentes on his program, Carlson was setting the stage to blame Israel for facilitating the attacks. The 9/11 Files quotes former CIA officer Michael Scheuer opining that “The Israelis are always for the Israelis first. They don’t like the United States, except, for the most part, our money.” The message is clear: Jews, money. This is more than a dog whistle. Carlson knows that anti-Semitic tropes provide an electric thrill to fringe elements on the right. In The 9/11 Files, he’s making a bid for that audience.

Carlson’s focus on Israel’s supposed foreknowledge of the attacks reflects another trend in conspiracy thinking. Today, many Truthers have given up trying to find tangible evidence of a conspiracy. They simply maintain that Israeli intelligence, the Bush administration, the CIA, and defense contractors (among many others) knew about the attack in advance and “let it happen on purpose.” This “LIHOP theory” has one great advantage: Truthers can simply list every screwup made by our security establishment—and there were many—and attribute them to an insidious master plan rather than to complacency, incompetence, and inter-agency rivalry.

Carlson devotes much of his series to this approach. The 9/11 Files documents a maddening series of fumbles and oversights on the part of the CIA and other agencies. The CIA’s failure to notify the FBI about al-Qaida agents it had been tracking was especially egregious. And Carlson is correct that many of the officials responsible for those lapses were never properly investigated or called to account. But at no point does he quote a witness who can attest that he or she knew the attack was coming and was ordered to stay silent. He produces no documents or other evidence attesting to a coverup. Nor does he explain how a broad LIHOP conspiracy would have worked. How many people in the CIA would need to have been complicit? How many in the White House? In the FBI? Were airport security agents ordered not to stop the hijackers (as Carlson implies)? By whom? The documentary notes that the Clinton administration also passed up chances to stop bin Laden. Were they in on the plot, too? In short, even a LIHOP conspiracy would have required hundreds, perhaps thousands, of conspirators.

The conspiracy grows even more implausible when Carlson implies that everyone involved in what he calls the “official story” must be part of a massive coverup. Does that include the roughly 90 people who worked on the 9/11 Commission report? The hundreds of investigators who examined the collapse of the World Trade Center buildings and the Pentagon attack for FEMA, NIST, and other agencies? The recovery workers at Shanksville and the Pentagon? How is it that none of these people have come forward to get this terrible secret off their chests? And what about the media—not just the BBC, but all the large and small outlets that spent years investigating 9/11 without finding a trace of the conspiracy Carlson claims is “hiding in plain sight”? What is stopping even one of those reporters from spilling the beans? It would be the scoop of the century.

Carlson doesn’t grapple with such commonsense objections to his vision of a global conspiracy. He’s not trying to win over the undecided. He’s preaching to an audience predisposed to see the U.S., Israel—and, really, the West in general—as the ineluctable perpetrators of injustice in the world. The first wave of 9/11 theories appealed to the far Left for the same reason. Followers of anti-American and “anti-colonialist” authors Howard Zinn and Noam Chomsky naturally loved seeing George W. Bush, Mossad, and big business cast as the real villains of 9/11. Left-wing politicians (former Georgia congresswoman Cynthia McKinney) and celebrities (Mark Ruffalo, Woody Harrelson) helped spread the conspiracy gospel. Progressive radio stations gave out “Loose Change” DVDs during fundraising drives.

Today, however, animosity toward American institutions, and sympathy for America’s enemies, burns strongly on the fringe Right. As Commentary’s Abe Greenwald points out, the so-called Woke Right has become almost indistinguishable from the Woke Left. “They share the left’s hatreds, heroes, and self-pitying worldview,” he writes. Carlson now flatters Vladimir Putin, defends Tehran, and praises Venezuelan dictator Nicolás Maduro. His recent talk show guest Fuentes calls himself an “admirer” of Soviet mass murderer Joseph Stalin. In promoting 9/11 conspiracy theories to this audience, Carlson is pushing on an open door.

Challenging left-wing orthodoxies has lost its thrill. For Carlson and his ilk, attacking the Republican Party (now including the mainstream MAGA movement) brings much greater rewards: online buzz, surging subscriptions, and the clout that comes from playing the role of disrupter. If that requires embracing a worldview Carlson used to abhor, he seems willing to make that trade.

In 2012, Carlson was surrounded by a group of 9/11 conspiracy buffs outside a political event. The video of the encounter, shot by one of the Truthers, is a time capsule from a different era, one showing a very different Tucker Carlson. As they badger him for his views on “Building 7” and other conspiracy tropes, he engages them with good humor, showing the easy charm that made him a successful broadcaster. But when the Truthers ask him what he would say to the families of 9/11 victims, he becomes withering. “Parasites like you make it much worse for them,” he says. “In order to imply that there’s a conspiracy behind 9/11, you ought to have some evidence,” he goes on. “And you have none. So you should stop.”

Things have changed.

James B. Meigs is a senior fellow at the Manhattan Institute, a contributing editor of City Journal, and the former editor of Popular Mechanics.

Photo by Chip Somodevilla/Getty Images


Student Loans: Here's What Happens As Trump Dismantles Education Department - Newsweek

  • Department Dismantling Acceleration: Trump administration advances plan to eliminate U.S. Education Department by transferring billions in federal school grants to agencies like Labor, HHS, Interior, and State.
  • Student Loan Continuity: $1.6 trillion student loan system remains under Education Department oversight, including repayment plans, forgiveness, and aid eligibility.
  • Major Program Transfers: Six new agreements shift responsibilities, with Labor taking Title I for low-income schools, teacher training, and TRIO; HHS handling student parents and foreign medical accreditation.
  • Additional Shifts: State Department oversees foreign language programs; Interior manages Native American education; funding levels preserved as set by Congress.
  • Administrative Changes: Hundreds of staff laid off or retired; moves test functionality without standalone agency, outsourcing key offices for elementary, secondary, and postsecondary education.
  • Administration's Rationale: Secretary McMahon describes actions as bold reform to reduce federal bureaucracy and devolve education control to states, amid lagging student performance.
  • Critics' Concerns: Opponents, including AFT President Weingarten, argue transfers abandon vulnerable students and lack authority, potentially disrupting programs without education expertise.
  • Long-Term Goals and History: Plan seeks congressional approval for full elimination; department established in 1979 under Carter to centralize federal education efforts, now facing political opposition.

The Trump administration is accelerating its plan to dismantle the U.S. Education Department, shifting billions of dollars in federal school grants to other agencies — but leaving the nation’s $1.6 trillion student loan system in place, at least for now.

Under a series of new agreements, major K-12 and higher education grant programs will move to the Departments of Labor, Health and Human Services, Interior and State even as the Education Department continues to oversee federal student loans and college accreditation.

The split underscores the limits of President Donald Trump’s effort to eliminate the department, a goal he first announced in a March executive action. Millions of borrowers will still use the Education Department for repayment plans, loan forgiveness and aid eligibility, even as its footprint shrinks dramatically elsewhere.

President Donald Trump holds the executive order to dismantle the Department of Education he just signed with Education Secretary Linda McMahon during...

Trump Administration Moves Forward with Dismantling Education Department: What We Know

Six newly signed agreements represent the most sweeping transfer of the department’s responsibilities in its 45-year history. Labor will inherit some of the largest federal funding streams for schools, including the $18 billion Title I program for low-income communities, as well as grants for teacher training, English language instruction and the TRIO college-access program.

HHS will take over a grant program for student parents and foreign medical school accreditation, while State will oversee foreign language programs and Interior will manage Native American education initiatives. Officials say the programs will continue to be funded at levels set by Congress.

Education Secretary Linda McMahon said the moves are part of a “bold action to break up the federal education bureaucracy and return education to the states.” Since spring, the department has shed hundreds of staff through layoffs and early retirements, and officials say the new transfers are designed to prove the system can function without a standalone federal agency. The Education Department tested the approach in June by shifting adult education programs to the Labor Department; the new agreements go much further, effectively outsourcing its Office of Elementary and Secondary Education and most of its Office of Postsecondary Education.


But critics warn that dismantling the department could disrupt programs serving vulnerable students and leave key responsibilities in the hands of agencies without education expertise. Some legal scholars also question whether the Trump administration has the authority to shift programs that federal law requires the Education Department to manage directly. McMahon has dismissed those criticisms, arguing the agency has become a “bloated bureaucracy” while student performance lags.

American Federation of Teachers President Randi Weingarten said in a statement shared with Newsweek via email, “This move is neither streamlining nor reform–it’s an abdication and abandonment of America’s future. Rather than show leadership in helping all students seize their potential, it walks away from that responsibility.

“What’s happening now isn’t about slashing red tape. If that were the goal, teachers could help them do it: and we invite Donald Trump and Linda McMahon to sit down with educators and hear from the people who actually do this work every day. Teachers know how to make the federal role more effective, efficient, and supportive of real learning–if only the administration would listen.”

The administration’s long-term goal remains persuading Congress to codify the transfers — a necessary step for fully eliminating the department. Until then, while much of the agency’s work disperses across Washington, the student loan system will remain firmly under its control, ensuring borrowers continue to rely on the department even as its broader mission recedes.

Will Student Loans Be Forgiven?

Last month, the Trump administration confirmed its agreement to cancel student loan debt for eligible borrowers under certain plans, following a legal agreement between the American Federation of Teachers (AFT) and the Department of Education.

To qualify for these plans, borrowers typically must have made 20 to 25 years of consecutive qualifying payments, depending on loan origination date and plan enrollment. As of October 2025, these plans account for more than 2 million borrowers.

More than 42 million Americans hold a collective $1.7 trillion in student loan debt.

As of September, many borrowers remained in limbo over their applications for federal student loan forgiveness and repayment programs, and more than a million applications for income-driven repayment plans remained unprocessed.

What is the Department of Education Responsible for?

The U.S. Department of Education oversees federal policy on K-12 and higher education, distributes billions of dollars in funding to states and school districts, and enforces laws that protect students’ civil rights. It administers programs for low-income students, English learners and students with disabilities, and it sets rules for colleges participating in federal financial aid programs. The agency also manages the nation’s $1.6 trillion student loan portfolio, monitors accreditation systems and collects data on school performance nationwide. While states and local districts run public schools, the department plays a key role in ensuring federal dollars are spent properly and that students receive equal access to education.

When was the Department of Education Established?

The Department of Education was established in 1979 after President Jimmy Carter signed legislation creating a Cabinet-level agency dedicated solely to education. Before then, federal education programs were scattered across multiple departments, including Health, Education and Welfare. The move to create a standalone department followed decades of debate over the federal government’s role in schools, with supporters arguing it would bring needed attention and oversight to national education issues. The agency officially began operations in May 1980 and has remained a political flashpoint ever since, with critics—most recently President Donald Trump—pushing to dismantle it.

Updates: 11/18/25, 1:43 p.m. ET: This article was updated with new information.

Updates: 11/18/25, 2:24 p.m. ET: This article was updated with new information.

Updates: 11/18/25, 3:32 p.m. ET: This article was updated with new remarks.

This article includes reporting by the Associated Press.


Cloudflare outage on November 18, 2025

  • Outage Onset: On 18 November 2025 at 11:20 UTC, Cloudflare's network failed to deliver core traffic, showing error pages to users accessing customer sites.
  • Root Cause: A database permission change in ClickHouse caused duplicate entries in a Bot Management feature file, doubling its size beyond software limits.
  • Propagation Effect: The oversized file spread across the network, triggering failures in the core proxy software handling traffic routing and Bot Management updates.
  • Initial Misdiagnosis: Fluctuating errors from partial cluster updates mimicked a DDoS attack, compounded by coincidental status page downtime.
  • Resolution Timeline: Issue identified by 14:30 UTC with file propagation stopped and a good version deployed; full recovery by 17:06 UTC after service restarts.
  • Impacted Services: Core CDN, Turnstile, Workers KV, Dashboard login, Email Security spam detection, and Access authentication experienced failures or degraded performance.
  • Technical Details: Query to system.columns returned extra metadata from underlying shards post-permission update, inflating feature rows and hitting memory preallocation limits in the proxy.
  • Future Measures: Plans include hardening file ingestion, adding kill switches, limiting error reporting resource use, and reviewing proxy module failure modes.

On 18 November 2025 at 11:20 UTC (all times in this blog are UTC), Cloudflare's network began experiencing significant failures to deliver core network traffic. This showed up to Internet users trying to access our customers' sites as an error page indicating a failure within Cloudflare's network.

HTTP error page displayed during the incident

The issue was not caused, directly or indirectly, by a cyber attack or malicious activity of any kind. Instead, it was triggered by a change to one of our database systems' permissions which caused the database to output multiple entries into a “feature file” used by our Bot Management system. That feature file, in turn, doubled in size. The larger-than-expected feature file was then propagated to all the machines that make up our network.

The software running on these machines to route traffic across our network reads this feature file to keep our Bot Management system up to date with ever changing threats. The software had a limit on the size of the feature file that was below its doubled size. That caused the software to fail.

After we initially wrongly suspected the symptoms we were seeing were caused by a hyper-scale DDoS attack, we correctly identified the core issue and were able to stop the propagation of the larger-than-expected feature file and replace it with an earlier version of the file. Core traffic was largely flowing as normal by 14:30. We worked over the next few hours to mitigate increased load on various parts of our network as traffic rushed back online. As of 17:06 all systems at Cloudflare were functioning as normal.

We are sorry for the impact to our customers and to the Internet in general. Given Cloudflare's importance in the Internet ecosystem any outage of any of our systems is unacceptable. That there was a period of time where our network was not able to route traffic is deeply painful to every member of our team. We know we let you down today.

This post is an in-depth account of exactly what happened and what systems and processes failed. It is also the beginning, though not the end, of what we plan to do to make sure an outage like this will not happen again.

The outage

The chart below shows the volume of 5xx error HTTP status codes served by the Cloudflare network. Normally this should be very low, and it was right up until the start of the outage.

Volume of HTTP 5xx requests served by the Cloudflare network

The volume prior to 11:20 is the expected baseline of 5xx errors observed across our network. The spike, and subsequent fluctuations, show our system failing due to loading the incorrect feature file. What’s notable is that our system would then recover for a period. This was very unusual behavior for an internal error.

The explanation was that the file was being generated every five minutes by a query running on a ClickHouse database cluster, which was being gradually updated to improve permissions management. Bad data was only generated if the query ran on a part of the cluster which had been updated. As a result, every five minutes there was a chance of either a good or a bad set of configuration files being generated and rapidly propagated across the network.

This fluctuation made it unclear what was happening as the entire system would recover and then fail again as sometimes good, sometimes bad configuration files were distributed to our network. Initially, this led us to believe this might be caused by an attack. Eventually, every ClickHouse node was generating the bad configuration file and the fluctuation stabilized in the failing state.
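
The good-or-bad lottery described above can be sketched with a toy simulation (node counts and serving order are invented for illustration; this is not Cloudflare's tooling):

```python
def generated_file(updated_nodes, serving_node):
    """A cycle produces a bad file only if the query ran on a node
    that has already received the permission change."""
    return "bad" if serving_node in updated_nodes else "good"

nodes = list(range(10))                          # ten ClickHouse nodes (toy numbers)
serving_order = [9, 0, 8, 1, 7, 2, 6, 3, 5, 4]   # node serving each 5-minute query

updated, history = set(), []
for cycle, serving in enumerate(serving_order):
    updated.add(nodes[cycle])   # gradual rollout: one more node updated per cycle
    history.append(generated_file(updated, serving))

# Good and bad files alternate while the rollout is partial, then the
# output settles permanently on "bad" once every node is updated.
print(history)
```

The alternation explains why the error graph recovered and re-failed repeatedly before stabilizing in the failing state.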

Errors continued until the underlying issue was identified and resolved starting at 14:30. We solved the problem by stopping the generation and propagation of the bad feature file, manually inserting a known good file into the feature file distribution queue, and then forcing a restart of our core proxy.

The remaining long tail in the chart above is our team restarting remaining services that had entered a bad state, with 5xx error code volume returning to normal at 17:06.

The following services were impacted:

  • Core CDN and security services: HTTP 5xx status codes. The screenshot at the top of this post shows a typical error page delivered to end users.
  • Turnstile: Turnstile failed to load.
  • Workers KV: Workers KV returned a significantly elevated level of HTTP 5xx errors as requests to KV’s “front end” gateway failed due to the core proxy failing.
  • Dashboard: While the dashboard was mostly operational, most users were unable to log in due to Turnstile being unavailable on the login page.
  • Email Security: While email processing and delivery were unaffected, we observed a temporary loss of access to an IP reputation source, which reduced spam-detection accuracy and prevented some new-domain-age detections from triggering, with no critical customer impact observed. We also saw failures in some Auto Move actions; all affected messages have been reviewed and remediated.
  • Access: Authentication failures were widespread for most users, beginning at the start of the incident and continuing until the rollback was initiated at 13:05. Any existing Access sessions were unaffected. All failed authentication attempts resulted in an error page, meaning none of these users ever reached the target application while authentication was failing; successful logins during this period were correctly logged. Any Access configuration updates attempted at that time would have either failed outright or propagated very slowly. All configuration updates are now recovered.

As well as returning HTTP 5xx errors, we observed significant increases in latency of responses from our CDN during the impact period. This was due to large amounts of CPU being consumed by our debugging and observability systems, which automatically enhance uncaught errors with additional debugging information.

How Cloudflare processes requests, and how this went wrong today

Every request to Cloudflare takes a well-defined path through our network. It could be from a browser loading a webpage, a mobile app calling an API, or automated traffic from another service. These requests first terminate at our HTTP and TLS layer, then flow into our core proxy system (which we call FL for “Frontline”), and finally through Pingora, which performs cache lookups or fetches data from the origin if needed.

We previously shared more detail about how the core proxy works here.

Diagram of our reverse proxy architecture

As a request transits the core proxy, we run the various security and performance products available in our network. The proxy applies each customer’s unique configuration and settings, from enforcing WAF rules and DDoS protection to routing traffic to the Developer Platform and R2. It accomplishes this through a set of domain-specific modules that apply the configuration and policy rules to traffic transiting our proxy.

One of those modules, Bot Management, was the source of today’s outage. 

Cloudflare’s Bot Management includes, among other systems, a machine learning model that we use to generate bot scores for every request traversing our network. Our customers use bot scores to control which bots are allowed to access their sites — or not.

The model takes as input a “feature” configuration file. A feature, in this context, is an individual trait used by the machine learning model to make a prediction about whether the request was automated or not. The feature configuration file is a collection of individual features.

This feature file is refreshed every few minutes and published to our entire network, allowing us to react to variations in traffic flows across the Internet, including new types of bots and new bot attacks. So it’s critical that it is rolled out frequently and rapidly, as bad actors change their tactics quickly.

A change in our underlying ClickHouse query behaviour (explained below) that generates this file caused it to have a large number of duplicate “feature” rows. This changed the size of the previously fixed-size feature configuration file, causing the bots module to trigger an error.

As a result, HTTP 5xx error codes were returned by the core proxy system that handles traffic processing for our customers, for any traffic that depended on the bots module. This also affected Workers KV and Access, which rely on the core proxy.

Unrelated to this incident, we were and are currently migrating our customer traffic to a new version of our proxy service, internally known as FL2. Both versions were affected by the issue, although the impact observed was different.

Customers deployed on the new FL2 proxy engine observed HTTP 5xx errors. Customers on our old proxy engine, known as FL, did not see errors, but bot scores were not generated correctly, resulting in all traffic receiving a bot score of zero. Customers that had rules deployed to block bots would have seen large numbers of false positives. Customers who were not using our bot score in their rules did not see any impact.
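
The difference between the two failure modes can be sketched in miniature (hypothetical function names and return values; the real FL2 failure was a Rust panic, modeled here as an exception):

```python
class FeatureFileError(Exception):
    pass

def load_features(rows, limit=200):
    """Refuse a feature file that exceeds the preallocated limit."""
    if len(rows) > limit:
        raise FeatureFileError(f"{len(rows)} features exceeds limit {limit}")
    return rows

def score_request_fl2(rows):
    """New-engine style: an unhandled config error surfaces as a 5xx."""
    try:
        load_features(rows)
        return 200
    except FeatureFileError:
        return 500          # request fails outright

def score_request_fl(rows):
    """Old-engine style: the module degrades, emitting bot score 0."""
    try:
        load_features(rows)
        return ("ok", 36)   # a normal bot score
    except FeatureFileError:
        return ("ok", 0)    # traffic flows, but every request looks like a bot

bad_rows = ["feature"] * 400   # duplicated rows pushed the count past the limit
print(score_request_fl2(bad_rows))  # 500
print(score_request_fl(bad_rows))   # ('ok', 0)
```

The sketch shows why FL customers saw false-positive bot blocks rather than errors: the old engine failed open with a zero score, while the new engine failed closed.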

Throwing us off and making us believe this might have been an attack was another apparent symptom we observed: Cloudflare’s status page went down. The status page is hosted completely off Cloudflare’s infrastructure with no dependencies on Cloudflare. While it turned out to be a coincidence, it led some of the team diagnosing the issue to believe that an attacker may be targeting both our systems as well as our status page. Visitors to the status page at that time were greeted by an error message:

Error on the Cloudflare status page

In the internal incident chat room, we were concerned that this might be the continuation of the recent spate of high volume Aisuru DDoS attacks:

Internal chat screenshot

The query behaviour change

I mentioned above that a change in the underlying query behaviour resulted in the feature file containing a large number of duplicate rows. The database system in question is ClickHouse.

For context, it’s helpful to know how ClickHouse distributed queries work. A ClickHouse cluster consists of many shards. To query data from all shards, we have so-called distributed tables (powered by the table engine Distributed) in a database called default. The Distributed engine queries underlying tables in a database r0. The underlying tables are where data is stored on each shard of a ClickHouse cluster.

Queries to the distributed tables run through a shared system account. As part of efforts to improve our distributed queries security and reliability, there’s work being done to make them run under the initial user accounts instead.

Before today, ClickHouse users would only see the tables in the default database when querying table metadata from ClickHouse system tables such as system.tables or system.columns.

Since users already have implicit access to underlying tables in r0, we made a change at 11:05 to make this access explicit, so that users can see the metadata of these tables as well. By making sure that all distributed subqueries can run under the initial user, query limits and access grants can be evaluated in a more fine-grained manner, avoiding one bad subquery from a user affecting others.

The change explained above resulted in all users accessing accurate metadata about tables they have access to. Unfortunately, there were assumptions made in the past that the list of columns returned by a query like this would only include the “default” database:

SELECT name, type FROM system.columns WHERE table = 'http_requests_features' order by name;

Note how the query does not filter for the database name. With us gradually rolling out the explicit grants to users of a given ClickHouse cluster, after the change at 11:05 the query above started returning “duplicates” of columns because those were for underlying tables stored in the r0 database.

This, unfortunately, was exactly the query performed by the Bot Management feature file generation logic to construct each input "feature" for the file mentioned at the beginning of this section.

The query above would return a table of columns like the one displayed (simplified example):

name          type
feature_a     Float64
feature_b     Float64

However, as part of the additional permissions granted to the user, the response now also contained all the metadata of the r0 schema, effectively more than doubling the rows in the response and, ultimately, the number of rows (i.e. features) in the final file output.

Memory preallocation

Each module running on our proxy service has a number of limits in place to avoid unbounded memory consumption and to preallocate memory as a performance optimization. In this specific instance, the Bot Management system has a limit on the number of machine learning features that can be used at runtime. That limit is currently set to 200, well above our current use of roughly 60 features. The limit exists because, for performance reasons, we preallocate memory for the features.
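A minimal sketch of this pattern, assuming a hypothetical load_features helper rather than Cloudflare's actual code: the capacity is preallocated once, and inputs exceeding the limit are rejected with a recoverable error.

```rust
// Sketch only: names and types are assumptions, not the FL2 implementation.
const MAX_FEATURES: usize = 200;

#[derive(Debug)]
struct TooManyFeatures {
    got: usize,
}

fn load_features(names: &[&str]) -> Result<Vec<f64>, TooManyFeatures> {
    if names.len() > MAX_FEATURES {
        // Reject oversized configuration instead of trusting the input.
        return Err(TooManyFeatures { got: names.len() });
    }
    // Preallocate the full limit up front as a performance optimization.
    let mut values = Vec::with_capacity(MAX_FEATURES);
    values.resize(names.len(), 0.0);
    Ok(values)
}

fn main() {
    // ~60 features today: well within the limit.
    assert!(load_features(&vec!["f"; 60]).is_ok());

    // A bad file with more than 200 features is rejected, not a panic.
    match load_features(&vec!["f"; 201]) {
        Ok(_) => unreachable!(),
        Err(e) => println!("rejected config with {} features", e.got),
    }
}
```

Returning a Result here gives the caller the option of falling back to the last known-good configuration instead of crashing the worker thread.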

When the bad file with more than 200 features was propagated to our servers, this limit was hit, resulting in the system panicking. The FL2 Rust code that performs the check, and that was the source of the unhandled error, is shown below:

code that generated the error

This resulted in the following panic which in turn resulted in a 5xx error:

thread 'fl2_worker_thread' panicked: called `Result::unwrap()` on an `Err` value
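As an illustration of the failure mode (this is not the actual FL2 code): calling unwrap() converts a recoverable Err into a thread-killing panic with exactly this shape of message, whereas matching on the Result keeps the worker alive. The parse_feature_count helper below is hypothetical.

```rust
// Hypothetical helper standing in for the real limit check.
fn parse_feature_count(raw: &str) -> Result<usize, std::num::ParseIntError> {
    raw.parse::<usize>()
}

fn main() {
    // Handling the Err explicitly keeps the worker alive on bad input.
    match parse_feature_count("not-a-number") {
        Ok(n) => println!("features: {n}"),
        Err(e) => eprintln!("rejecting bad input: {e}"),
    }

    // By contrast, this line would abort the thread with
    // "called `Result::unwrap()` on an `Err` value":
    // let _n = parse_feature_count("not-a-number").unwrap();
}
```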

Other impact during the incident

Other systems that rely on our core proxy were impacted during the incident. This included Workers KV and Cloudflare Access. The team was able to reduce the impact to these systems at 13:04, when a patch was made to Workers KV to bypass the core proxy. Subsequently, all downstream systems that rely on Workers KV (such as Access itself) observed a reduced error rate. 

The Cloudflare Dashboard was also impacted due to both Workers KV being used internally and Cloudflare Turnstile being deployed as part of our login flow.

Turnstile was impacted by this outage, resulting in customers who did not have an active dashboard session being unable to log in. This showed up as reduced availability during two time periods: from 11:30 to 13:10, and between 14:40 and 15:30, as seen in the graph below.

availability of Cloudflare internal APIs during the incident

The first period, from 11:30 to 13:10, was due to the impact to Workers KV, which some control plane and dashboard functions rely upon. This was restored at 13:10, when Workers KV bypassed the core proxy system. The second period of impact to the dashboard occurred after restoring the feature configuration data. A backlog of login attempts began to overwhelm the dashboard. This backlog, in combination with retry attempts, resulted in elevated latency, reducing dashboard availability. Scaling control plane concurrency restored availability at approximately 15:30.

Remediation and follow-up steps

Now that our systems are back online and functioning normally, work has already begun on how we will harden them against failures like this in the future. In particular we are:

  • Hardening ingestion of Cloudflare-generated configuration files in the same way we would for user-generated input

  • Enabling more global kill switches for features

  • Eliminating the ability for core dumps or other error reports to overwhelm system resources

  • Reviewing failure modes for error conditions across all core proxy modules

Today was Cloudflare's worst outage since 2019. We've had outages that have made our dashboard unavailable. Some that have caused newer features to not be available for a period of time. But in the last 6+ years we've not had another outage that has caused the majority of core traffic to stop flowing through our network.

An outage like today is unacceptable. We've architected our systems to be highly resilient to failure to ensure traffic will always continue to flow. When we've had outages in the past it's always led to us building new, more resilient systems.

On behalf of the entire team at Cloudflare, I would like to apologize for the pain we caused the Internet today.

| Time (UTC) | Status | Description |
| --- | --- | --- |
| 11:05 | Normal. | Database access control change deployed. |
| 11:28 | Impact starts. | Deployment reaches customer environments, first errors observed on customer HTTP traffic. |
| 11:32-13:05 | The team investigated elevated traffic levels and errors to the Workers KV service. | The initial symptom appeared to be a degraded Workers KV response rate causing downstream impact on other Cloudflare services. Mitigations such as traffic manipulation and account limiting were attempted to bring the Workers KV service back to normal operating levels. The first automated test detected the issue at 11:31 and manual investigation started at 11:32. The incident call was created at 11:35. |
| 13:05 | Workers KV and Cloudflare Access bypass implemented — impact reduced. | During investigation, we used internal system bypasses for Workers KV and Cloudflare Access so they fell back to a prior version of our core proxy. Although the issue was also present in prior versions of our proxy, the impact was smaller as described below. |
| 13:37 | Work focused on rollback of the Bot Management configuration file to a last-known-good version. | We were confident that the Bot Management configuration file was the trigger for the incident. Teams worked on ways to repair the service in multiple workstreams, with the fastest workstream a restore of a previous version of the file. |
| 14:24 | Stopped creation and propagation of new Bot Management configuration files. | We identified that the Bot Management module was the source of the 500 errors and that this was caused by a bad configuration file. We stopped automatic deployment of new Bot Management configuration files. |
| 14:24 | Test of new file complete. | We observed successful recovery using the old version of the configuration file and then focused on accelerating the fix globally. |
| 14:30 | Main impact resolved. Downstream impacted services started observing reduced errors. | A correct Bot Management configuration file was deployed globally and most services started operating correctly. |
| 17:06 | All services resolved. Impact ends. | All downstream services restarted and all operations fully restored. |
