As the Trump administration prepared to cancel contracts at the Department of Veterans Affairs this year, officials turned to a software engineer with no health care or government experience to guide them.

The engineer, working for the Department of Government Efficiency, quickly built an artificial intelligence tool to identify which services from private companies were not essential. He labeled those contracts “MUNCHABLE.”

The code, using outdated and inexpensive AI models, produced results with glaring mistakes. For instance, it hallucinated the size of contracts, frequently misreading them and inflating their value. It concluded more than a thousand were each worth $34 million, when in fact some were for as little as $35,000.

The DOGE AI tool flagged more than 2,000 contracts for “munching.” It’s unclear how many have been or are on track to be canceled — the Trump administration’s decisions on VA contracts have largely been a black box. The VA uses contractors for many reasons, including to support hospitals, research and other services aimed at caring for ailing veterans.

VA officials have said they’ve killed nearly 600 contracts overall. Congressional Democrats have been pressing VA leaders, so far without success, for specific details of what’s been canceled.

We identified at least two dozen contracts on the DOGE list that have been canceled so far. Among the canceled contracts was one to maintain a gene sequencing device used to develop better cancer treatments. Another was for blood sample analysis in support of a VA research project. Another was to provide additional tools to measure and improve the care nurses provide.

ProPublica obtained the code and the contracts it flagged from a source and shared them with a half dozen AI and procurement experts. All said the script was flawed. Many criticized the concept of using AI to guide budgetary cuts at the VA, with one calling it “deeply problematic.”

Cary Coglianese, a professor of law and political science at the University of Pennsylvania who studies the governmental use and regulation of artificial intelligence, said he was troubled by the use of these general-purpose large language models, or LLMs. “I don’t think off-the-shelf LLMs have a great deal of reliability for something as complex and involved as this,” he said.

Sahil Lavingia, the programmer enlisted by DOGE, which was then run by Elon Musk, acknowledged flaws in the code.

“I think that mistakes were made,” said Lavingia, who worked at DOGE for nearly two months. “I’m sure mistakes were made. Mistakes are always made. I would never recommend someone run my code and do what it says. It’s like that ‘Office’ episode where Steve Carell drives into the lake because Google Maps says drive into the lake. Do not drive into the lake.”

Though Lavingia has talked about his time at DOGE previously, this is the first time his work has been examined in detail and the first time he’s publicly explained his process, down to specific lines of code.

Lavingia has nearly 15 years of experience as a software engineer and entrepreneur but no formal training in AI. He briefly worked at Pinterest before starting Gumroad, a small e-commerce company that nearly collapsed in 2015. “I laid off 75% of my company — including many of my best friends. It really sucked,” he said. Lavingia kept the company afloat by “replacing every manual process with an automated one,” according to a post on his personal blog.

Lavingia had little time to immerse himself in how the VA handles veterans’ care: he started on March 17 and wrote the tool the following day. Yet his experience with his own company aligned with the direction of the Trump administration, which has embraced the use of AI across government to streamline operations and save money.

Lavingia said the quick timeline of Trump’s February executive order, which gave agencies 30 days to complete a review of contracts and grants, was too short to do the job manually. “That’s not possible — you have 90,000 contracts,” he said. “Unless you write some code. But even then it’s not really possible.”

Under a time crunch, Lavingia said he finished the first version of his contract-munching tool on his second day on the job — using AI to help write the code for him. He told ProPublica he then spent his first week downloading VA contracts to his laptop and analyzing them.

VA press secretary Pete Kasperowicz lauded DOGE’s work on vetting contracts in a statement to ProPublica. “As far as we know, this sort of review has never been done before, but we are happy to set this commonsense precedent,” he said.

The VA is reviewing all of its 76,000 contracts to ensure each of them benefits veterans and is a good use of taxpayer money, he said. Decisions to cancel or reduce the size of contracts are made after multiple reviews by VA employees, including agency contracting experts and senior staff, he wrote.

Kasperowicz said that the VA will not cancel contracts for work that provides services to veterans or that the agency cannot do itself without a contingency plan in place. He added that contracts that are “wasteful, duplicative or involve services VA has the ability to perform itself” will typically be terminated.

Trump officials have said they are working toward a “goal” of cutting around 80,000 people from the VA’s workforce of nearly 500,000. Most employees work in one of the VA’s 170 hospitals and nearly 1,200 clinics.

The VA has said it would avoid cutting contracts that directly impact care out of fear that it would cause harm to veterans. ProPublica recently reported that relatively small cuts at the agency have already been jeopardizing veterans’ care.

The VA has not explained how it plans to simultaneously move services in-house, as Lavingia’s code suggested was the plan, while also slashing staff.

Many inside the VA told ProPublica the process for reviewing contracts was so opaque they couldn’t even see who made the ultimate decisions to kill specific contracts. Once the “munching” script had selected a list of contracts, Lavingia said he would pass it off to others who would decide what to cancel and what to keep. No contracts, he said, were terminated “without human review.”

“I just delivered the [list of contracts] to the VA employees,” he said. “I basically put munchable at the top and then the others below.”

VA staffers told ProPublica that when DOGE identified contracts to be canceled early this year — before Lavingia was brought on — employees sometimes were given little time to justify retaining the service. One recalled being given just a few hours. The staffers asked not to be named because they feared losing their jobs for talking to reporters.

According to one internal email that predated Lavingia’s AI analysis, staff members had to respond in 255 characters or fewer — just shy of the 280-character limit on Musk’s X social media platform.

Once he started on DOGE’s contract analysis, Lavingia said he was confronted with technological limitations. At least some of the errors produced by his code can be traced to using older versions of OpenAI models available through the VA — models not capable of solving complex tasks, according to the experts consulted by ProPublica.

Moreover, the tool’s underlying instructions were deeply flawed. Records show Lavingia programmed the AI system to make intricate judgments based on the first few pages of each contract — about the first 2,500 words — which contain only sparse summary information.
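A minimal sketch of that truncate-and-prompt approach, in Python, might look like the following. It is illustrative only, not Lavingia’s actual script; the model name, prompt wording and function name are assumptions made for this example.

```python
# Illustrative sketch only, not Lavingia's script: it shows the general
# truncate-and-prompt pattern described above, in which roughly the first
# 2,500 words of a contract are sent to a general-purpose OpenAI chat model
# for a one-word verdict. Model name and prompt wording are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def classify_contract(contract_text: str, word_limit: int = 2500) -> str:
    """Ask the model for a verdict based only on the start of the contract."""
    excerpt = " ".join(contract_text.split()[:word_limit])  # first ~2,500 words
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # stand-in; the VA reportedly exposed older models
        messages=[
            {
                "role": "system",
                "content": (
                    "You review federal contracts. Reply MUNCHABLE if the work is "
                    "not essential, or NOT MUNCHABLE if it supports direct patient "
                    "care or is required by law."
                ),
            },
            {"role": "user", "content": excerpt},
        ],
    )
    return response.choices[0].message.content.strip()
```

Because the model sees nothing beyond the excerpt, essential work described deeper in a contract is invisible to it.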

“AI is absolutely the wrong tool for this,” said Waldo Jaquith, a former Obama appointee who oversaw IT contracting at the Treasury Department. “AI gives convincing looking answers that are frequently wrong. There needs to be humans whose job it is to do this work.”

Lavingia’s prompts did not include context about how the VA operates, what contracts are essential or which ones are required by federal law. This led the AI to determine that a core piece of the agency’s own contract procurement system was “munchable.”

At the core of Lavingia’s prompt is the direction to spare contracts involved in “direct patient care.”

Such an approach, experts said, doesn’t grapple with the reality that the work done by doctors and nurses to care for veterans in hospitals is only possible with significant support around them.
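That blind spot can be made concrete with a toy example. The rule below is a deliberately simplified stand-in, not the actual prompt, but any filter that spares only contracts whose summary pages mention patient care will flag support work that never uses those words, even when clinical care depends on it.

```python
# Toy illustration, not the real prompt: a filter that keeps only contracts
# whose summaries mention patient care will mark supporting services as
# "munchable," even when hospitals and researchers depend on them.
def is_munchable(summary: str) -> bool:
    keep_phrases = ("direct patient care", "patient care")
    return not any(phrase in summary.lower() for phrase in keep_phrases)

examples = [
    "Maintenance of a gene sequencing device used to develop cancer treatments",
    "Inpatient nurse staffing providing direct patient care at a VA hospital",
]
for summary in examples:
    print("MUNCHABLE" if is_munchable(summary) else "KEEP", "->", summary)
# The maintenance contract is flagged even though the research it supports
# is aimed at treating veterans with cancer.
```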

Lavingia’s system also used AI to extract details like the contract number and “total contract value.” This led to avoidable errors: the AI returned the wrong dollar value when a contract contained several figures. Experts said the correct information was readily available from public databases.
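The fragility of that extraction step is easy to illustrate. A contract’s opening pages often list several dollar figures (ceilings, option-year amounts, individual line items), and a model asked for the “total contract value” can latch onto the wrong one, whereas the recorded award value can be looked up by contract number in public data. In the sketch below, the CSV file and column names are assumptions for illustration, not a reference to any specific database schema.

```python
# Illustrative sketch: free-text excerpts are ambiguous about "total value,"
# while the recorded figure can be looked up by contract number in public
# procurement data. File and column names here are assumptions.
import csv
import re

def dollar_figures(excerpt: str) -> list[str]:
    """Every dollar amount in the excerpt; a model might return any of them."""
    return re.findall(r"\$[\d,]+(?:\.\d{2})?", excerpt)

def lookup_award_value(contract_number: str, path: str = "awards_export.csv") -> str | None:
    """Look up the recorded award value by contract number instead of guessing."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("contract_number") == contract_number:
                return row.get("total_contract_value")
    return None

excerpt = "Ceiling: $34,000,000 across all option years. Base year funding: $35,000."
print(dollar_figures(excerpt))  # ['$34,000,000', '$35,000'], ambiguous on its own
```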

Lavingia acknowledged that errors resulted from this approach but said those errors were later corrected by VA staff.

In late March, Lavingia published a version of the “munchable” script on his GitHub account to invite others to use and improve it, he told ProPublica. “It would have been cool if the entire federal government used this script and anyone in the public could see that this is how the VA is thinking about cutting contracts.”

According to a post on his blog, this was done with Musk’s approval before Musk left DOGE. “When he asked the room about improving DOGE’s public perception, I asked if I could open-source the code I’d been writing,” Lavingia said. “He said yes — it aligned with DOGE’s goal of maximum transparency.”

That openness may have eventually led to Lavingia’s dismissal. Lavingia confirmed he was terminated from DOGE after giving an interview to Fast Company magazine about his work with the department. A VA spokesperson declined to comment on Lavingia’s dismissal.

VA officials have declined to say whether they will continue to use the “munchable” tool moving forward. But the administration may deploy AI to help the agency replace employees. Documents previously obtained by ProPublica show DOGE officials proposed in March consolidating the benefits claims department by relying more on AI.

And the government’s contractors are paying attention. After Lavingia posted his code, he said he heard from people trying to understand how to keep the money flowing.

“I got a couple DMs from VA contractors who had questions when they saw this code,” he said. “They were trying to make sure that their contracts don’t get cut. Or learn why they got cut.

“At the end of the day, humans are the ones terminating the contracts, but it is helpful for them to see how DOGE or Trump or the agency heads are thinking about what contracts they are going to munch. Transparency is a good thing.”

Brandon Roberts is an investigative journalist on the news applications team with a wealth of experience using and dissecting artificial intelligence. If you have any information about the misuse or abuse of AI within government agencies, he can be reached on Signal at @brandonrobertz.01 or by email at brandon.roberts@propublica.org.

If you have information about the VA that we should know about, contact reporter Vernal Coleman on Signal, vcoleman91.99, or via email, vernal.coleman@propublica.org, and Eric Umansky on Signal, Ericumansky.04, or via email, eric.umansky@propublica.org.