Prior to WWII the U.S. was a distant second in science and engineering. By the time the war was over, U.S. science and engineering had blown past the British and led the world for 85 years. With the cutbacks in U.S. government support, and with China having invested heavily for the last three decades to surpass the U.S., the long run of U.S. dominance in science is likely over.
The Government-University Research Ecosystem – A Primer
The federal government is the largest source of academic research and development (R&D) funding in the United States, providing funds through more than two dozen federal agencies; the National Institutes of Health (NIH) and the National Science Foundation (NSF) provide the largest portions of federal R&D funding to U.S. colleges and universities.
This 85-year-old relationship between government research agencies and universities is what helped make the U.S. tech ecosystem the envy of the world.
Universities spend $109 billion a year on research, with $60 billion coming from the federal government. That $60 billion comes from the $200 billion/year R&D budget of federal agencies – NIH, DOD, NSF, DOE, NASA, USDA – which want principal investigators (professors) at universities to do research for them. The NIH gives universities $33 billion for research, the DOD $9 billion, NSF $6.7 billion, DOE $2.6 billion, NASA $2.3 billion and the USDA $1.7 billion. In exchange these agencies get basic research that moves science forward and/or applied research that creates prototypes of potential products.
That $60 billion from the government is just part of the $109 billion that universities spend on research. The rest comes from several sources: $28 billion from charitable donors and industrial collaborations, $7 billion from non-profits, $6 billion from companies, and $5.5 billion from state and local governments. (Donations and collaborations are largely driven by government policy as well, due to the tax treatment of charitable contributions. You can see the role of donors on any campus by reading the names of the buildings. Industrial collaborations fund or contribute to the creation of buildings and research labs.) These investments in U.S. university research are why foreign professors and students fight to come to U.S. universities.
The Bayh-Dole Act of 1980 had a major impact on innovation and technology transfer in the United States. It allowed universities to retain ownership of inventions developed using federally funded research. Before this law, any patents resulting from government-funded research were owned by the government, which meant they often went unused.
What makes this ecosystem unique is what happens to the university research: it’s the engine of U.S. startup and job creation. Each year universities license 3,000 patents, 3,200 copyrights and 1,600 other licenses to technology startups and existing companies. These tech startups also get $4 billion a year in SBIR/STTR seed-funding grants from the same government research agencies, while venture capital adds $170 billion to scale those investments. In return the ecosystem has created technological leadership and enormous wealth for corporations, venture capitalists and individuals who hold investments in the stock market. Collectively, U.S. universities spin out over 1,100 new science-based startups every year, which lead to countless products that save or improve the lives of Americans.
Until now, no other nation has come close to this ecosystem.
University Indirect (Facilities and Administrative/Overhead) Costs
During World War II, as the government funded university research, it was also breaking new ground in thinking about how to pay for it. Unlike a traditional fixed-price contract, the government was not giving university researchers a set of requirements to meet or specifications to design to. It asked them to do research and, if the research looked like it might solve a military problem, to build a prototype that could be tested. None of this fit the formal definition of how contracting worked.
So OSR&D and universities agreed that they would pay the schools for both their direct research costs (e.g., professor and student salaries, materials) and the indirect costs also known as Facilities and Administrative(F+A) costs (e.g., administrative staff, finance, legal, security, procurement, compliance) supporting the researchers, utilities, the cost of constructing new labs, and maintaining those facilities. At first, the government reimbursed universities for indirect costs using a flat 25% of direct costs. Unlike businesses, universities had no profit margin, so indirect cost recovery was their only way to pay for and maintain their research infrastructure. By the end of the war some universities under OSR&D contract had a 50% indirect reimbursement rate.
While this model is great for a university, keep in mind that a 50% reimbursement rate means a university principal investigator who gets a $1 million grant will get $666,667 to spend on the direct costs of research, while one-third of the total, i.e., $333,333, will go to the university for indirect costs. (A common misconception is that indirect-cost rates are a percentage of the total grant, so a rate of 50% would mean that half of the award goes to overhead. Not true. Instead, the math works like this: the reimbursement rate is calculated from the direct costs of research. For each $1 million of direct costs, an indirect rate of 50% would add $500,000. The total size of the grant (direct + indirect costs) would be $1.5 million. So a 50% indirect rate translates to 33% of the total grant.)
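For readers who want to check the arithmetic, here is a minimal sketch in Python (the function names and the 50% rate are illustrative, not drawn from any agency's rules) that converts between the two framings above:

```python
def split_total_award(total, indirect_rate):
    """Split a total award into (direct, indirect) dollars, given an
    indirect (F&A) rate that is applied to direct costs.
    Since total = direct * (1 + rate), direct = total / (1 + rate)."""
    direct = total / (1 + indirect_rate)
    return direct, total - direct

def total_from_direct(direct, indirect_rate):
    """Given budgeted direct costs, return the total award size."""
    return direct * (1 + indirect_rate)

# A $1 million total award at a 50% rate:
direct, indirect = split_total_award(1_000_000, 0.50)
print(f"direct: ${direct:,.0f}, indirect: ${indirect:,.0f}")
# -> direct: $666,667, indirect: $333,333 (overhead is 1/3 of the total)

# A grant budgeted at $1 million of direct costs at the same 50% rate:
print(f"total award: ${total_from_direct(1_000_000, 0.50):,.0f}")
# -> total award: $1,500,000 (overhead is again 1/3 of the total)
```

Either way, the indirect share of the total award is rate / (1 + rate), which is why a 50% rate works out to one-third of the award rather than half.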
Post-World War II, the Office of Naval Research (ONR) began negotiating indirect cost rates with universities based on actual institutional expenses. Universities had to justify their overhead costs (administration, facilities, utilities, etc.) to receive full reimbursement. ONR formalized financial auditing processes to ensure institutions accurately reported indirect costs. This led to the modern practice of negotiated indirect cost rates (NICR), which is still used today. Over the last 75 years the reimbursement process has been tweaked to prevent gaming but until February 2025 remained essentially the same.
In the 1950s the Bureau of the Budget (now the Office of Management and Budget, OMB) introduced official guidelines that formalized indirect cost recovery. Federal agencies, including the National Science Foundation (NSF) and National Institutes of Health (NIH), adopted these guidelines.
In the 1970s the OMB issued Circular A-21, which established standardized principles for indirect cost reimbursement. It set rules for cost allocation, rate negotiation, and what expenses were allowable.
In the 1990s reports of universities inflating overhead rates (e.g., Stanford University’s luxury yacht controversy in 1991 and its 78% indirect reimbursement rate) led to reforms, a major one being a 26% cap on the administrative portion of Facilities and Administrative costs. The OMB revised Circular A-21 to cap certain costs and increase transparency.
In 2014 OMB Circular A-21 was replaced by OMB Uniform Guidance (2 CFR 200), streamlining federal grant policies across agencies. The new rules emphasized consistency, cost controls, and transparency in indirect cost recovery.
In 2017 the administration attempted to impose a 10% Facilities and Administrative cap on NIH research. Congress rejected this attempt and added language to the annual funding bill that essentially freezes (with some exceptions) university Facilities and Administrative rates at their 2017 levels. This provision has been continued every year and is now embodied in section 224 of the Consolidated Appropriations Act of 2024, which has been extended twice by Continuing Resolutions and is still in effect as of March 3, 2025.
Through February 2025, universities negotiated their indirect cost/Facilities and Administrative (F&A) rates with either the Department of Health and Human Services (HHS) or the Office of Naval Research (ONR). Most research-intensive universities received indirect cost rates of 50%–60% for on-campus research. Private foundations often pay a lower rate (10–20%), but they tend to have much more expansive criteria for what can be counted as a direct cost.
In February 2025, the National Institutes of Health (NIH) slashed its indirect reimbursement rate to an arbitrary 15%. That policy is currently being challenged in court. If it is ultimately allowed to proceed, universities will have to cut their research budgets, stop building advanced labs, and/or stop funding new students.
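To make the stakes concrete, here is a back-of-the-envelope sketch (the $1 million direct-cost figure and the 55% baseline rate are illustrative assumptions drawn from the middle of the 50%–60% range above, not NIH data):

```python
# Illustrative only: indirect recovery on $1M of direct costs at a
# representative negotiated rate (~55%) vs. the 15% cap.
direct_costs = 1_000_000
for rate in (0.55, 0.15):
    print(f"rate {rate:.0%}: indirect recovery ${direct_costs * rate:,.0f}")
# -> rate 55%: indirect recovery $550,000
# -> rate 15%: indirect recovery $150,000
```

On those assumptions, every $1 million of direct research costs loses roughly $400,000 of facilities and administrative support – money that previously kept labs built, powered, and compliant.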
So as the U.S.’s science-based economic engine starts to sputter, many of those PhD students, postdocs, and faculty will take roles overseas, including in economic adversaries like China. We are on the brink of handing over our GDP growth to our competitors, at the worst possible time.
While indirect reimbursement was never perfect, the overhead funding allowed universities to build the best science labs in the world, which attracted the best and brightest researchers and students. In those labs, U.S. universities created breakthrough after breakthrough.
It’s likely those days are over and the action moves to China.
Sidebar
How the U.S. Became the Leader in Science and Technology
Prior to WWII the U.S. was a distant second in science and engineering. By the time the war was over, U.S. science and engineering had blown past the British and led the world for 85 years.
What was behind this surprising result? It happened because two very different people served as science advisors to their nations’ leaders, with radically different views on how to use their country’s resources to build advanced weapons systems. Post-war, it meant Britain’s early lead was ephemeral while the U.S. built the foundation for an innovation ecosystem that led the world – until now.
China’s leadership has spent the last three decades investing heavily to surpass the U.S. in science. With the cutbacks in U.S. government support, the long run of U.S. dominance in science is likely over.
—
The British – Military Weapons Labs
When Winston Churchill became the British prime minister in 1940, he had at his side his science advisor, Professor Frederick Lindemann, his friend of 20 years. Lindemann headed the physics department at Oxford and was the director of the Oxford Clarendon Laboratory. Already at war with Germany, Britain focused its wartime priorities on defense and intelligence technology projects – weapons that used electronics, radar, physics, etc.: a radar-based air defense network called Chain Home, airborne radar on night fighters, and plans for a nuclear weapons program via the MAUD Committee, which started the British nuclear weapons effort code-named Tube Alloys. And their codebreaking organization at Bletchley Park was starting to read secret German messages – encrypted by the Enigma machine – using some of the earliest computing machinery ever built.
As early as the mid-1930s, the British, fearing Nazi Germany, developed prototypes of these weapons in their existing military and government research labs. The Telecommunications Research Establishment built radar, critical to Britain’s survival during the Battle of Britain, and the electronic warfare systems that protected British bombers over Germany. The Admiralty Research Lab built sonar and anti-submarine warfare systems. The Royal Aircraft Establishment was developing jet fighters. The labs then contracted with British companies to manufacture the weapons in volume. British government labs viewed the universities as a source of talent, but the universities had no role in weapons development.
Under Churchill, Professor Lindemann influenced which projects received funding and which were sidelined. Lindemann’s WWI experience as a researcher and test pilot on the staff of the Royal Aircraft Factory at Farnborough gave him confidence in the competence of British military research and development labs. His top-down, centralized approach, with weapons development primarily in government research labs, shaped British innovation during WWII – and led to its demise post-war.
The Americans – University Weapons Labs
Unlike Britain, the U.S. lacked a science advisor. It wasn’t until June 1940 that Vannevar Bush, ex-MIT dean of engineering, told President Franklin Roosevelt that World War II would be the first war won or lost on the basis of advanced technology – electronics, radar, physics, etc.
Unlike Lindemann, Bush had a 20-year-long contentious history with the U.S. Navy and a dim view of government-led R&D. Bush contended that the government research labs were slow and second-rate. He convinced the President that while the Army and Navy ought to be in charge of making conventional weapons – planes, ships, tanks, etc. – scientists from academia could develop better advanced technology weapons and deliver them faster than the Army and Navy research labs. And he argued the only way the scientists could be productive was if they worked in a university setting, in civilian weapons labs run by university professors. To the surprise of the Army and Navy service chiefs, Roosevelt agreed to let Bush build exactly that organization to coordinate and fund all advanced weapons research.
(While Bush had no prior relationship with the President, Roosevelt had been the Assistant Secretary of the Navy during World War I and, like Bush, had seen its dysfunction first-hand. Over the next four years they worked well together. Unlike Churchill, Roosevelt had little interest in science and accepted Bush’s opinions on the direction of U.S. technology programs, giving Bush sweeping authority.)
In 1941, Bush upped the game by convincing the President that, in addition to research, the development, acquisition and deployment of these weapons also ought to be done by professors in universities. There they would be tasked with developing military weapons systems and solving military problems to defeat Germany and Japan. (The weapons were then manufactured in volume by U.S. corporations – Western Electric, GE, RCA, Dupont, Monsanto, Kodak, Zenith, Westinghouse, Remington Rand and Sylvania.) To do this Bush created the Office of Scientific Research and Development (OSR&D).
OSR&D headquarters divided the wartime work into 19 “divisions,” 5 “committees,” and 2 “panels,” each solving a unique part of the military war effort. There were no formal requirements.
Staff at OSR&D worked with their military liaisons to understand the most important military problems, and each OSR&D division came up with solutions. These efforts spanned an enormous range of tasks – the development of advanced electronics, radar, rockets, sonar, new weapons like the proximity fuse, napalm and the bazooka, new drugs such as penicillin and cures for malaria, chemical warfare, and nuclear weapons.
Each division was run by a professor hand-picked by Bush. And they were located in universities – MIT, Harvard, Johns Hopkins, Caltech, Columbia and the University of Chicago all ran major weapons systems programs. Nearly 10,000 scientists and engineers, professors and their grad students received draft deferments to work in these university labs.
The Americans – Unlimited Dollars
What changed U.S. universities, and the world forever, was government money. Lots of it. Prior to WWII most advanced technology research in the U.S. was done in corporate innovation labs (GE, AT&T, Dupont, RCA, Westinghouse, NCR, Monsanto, Kodak, IBM, et al.). Universities had no government funding for research (except for agriculture). Academic research had been funded by non-profits – mostly the Rockefeller and Carnegie foundations – and by industry. Now, for the first time, U.S. universities were getting more money than they had ever seen. Between 1941 and 1945, OSR&D gave $9 billion (in 2025 dollars) to the top U.S. research universities. This made universities full partners in wartime research, not just talent pools for government projects as was the case in Britain.
The British – Wartime Constraints
Wartime Britain had very different constraints. First, England was under daily attack – bombed from the air and blockaded by submarines – so it was logical that it focused on a smaller set of high-priority projects to counter these threats. Second, the country was teetering on bankruptcy. It couldn’t afford the broad and deep investments the U.S. made. (This was illustrated by Britain’s abandonment of its nuclear weapons program when it realized how much it would cost to turn the research into industrial-scale engineering.) As a result, many other areas of innovation – such as early computing and nuclear research – were underfunded compared to their American counterparts.
Post War – Britain
Churchill was voted out of office in 1945. With him went Professor Lindemann and the coordination of British science and engineering. Britain would be without a science advisor until Churchill returned for a second term (1951–55) and brought Lindemann back with him.
The end of the war led to extreme downsizing of the British military, including severe cuts to all the government labs that had developed radar, electronics, computing, etc.
Post-war Britain was financially exhausted, and austerity limited its ability to invest in large-scale innovation. There were no post-war plans for government follow-on investments. The differing economic realities of the U.S. and Britain also played a key role in shaping their innovation systems. The United States had an enormous industrial base, abundant capital, and a large domestic market, which enabled large-scale investment in research and development. In Britain, key industries were nationalized, which reduced competition and slowed technological progress. While British research institutions like Cambridge and Oxford remained leaders in theoretical science, they struggled to scale and commercialize their breakthroughs. For instance, Alan Turing’s and Tommy Flowers’s pioneering work on computing at Bletchley Park didn’t turn into a thriving British computing industry – unlike in the U.S., where companies like ERA, Univac, NCR and IBM built on their wartime work.
Without the same level of government support for dual-use technologies or commercialization, Britain’s post-war innovation ecosystem never took off.
Post War – The U.S.
Meanwhile, in the U.S., universities and companies realized that wartime government funding for research had been an amazing accelerator for science, engineering, and medicine. Everyone, including Congress, agreed that the U.S. government should continue to play a large role in funding research. In 1945, Vannevar Bush published a report, “Science, The Endless Frontier,” advocating government funding of basic research in universities, colleges, and research institutes. Congress then argued for five years over how best to organize federal support of science.
By the end of the war, OSR&D funding had taken technologies that had been just research papers or considered impossible to build at scale and made them commercially viable – computers, rockets, radar, Teflon, synthetic fibers, nuclear power, etc. Innovation clusters formed around universities like MIT and Harvard which had received large amounts of OSR&D (MIT Radiation Lab or “Rad Lab” employed 3,500 civilians during WWII and developed and built 100 radar systems deployed in theater,) or around professors who ran one of the OSR&D divisions – like Fred Terman at Stanford.
In 1950 Congress set up the National Science Foundation to fund all basic science in the U.S. (except for the life sciences, a role the new National Institutes of Health would assume). In the meantime the Atomic Energy Commission spun out of the Manhattan Project, and the military services took back advanced weapons development. Eight years later DARPA and NASA would also form as federal research agencies.
Ironically, Vannevar Bush’s influence would decline even faster than Professor Lindemann’s. When President Roosevelt died in April 1945 and Secretary of War Stimson retired in September 1945, all the knives came out from the military leadership Bush had bypassed in the war. His arguments on how to reorganize OSR&D made more enemies in Congress. By 1948 Bush had retired from government service.
Divergent Legacies
Britain’s focused, centralized model using government research labs was created in a struggle for short-term survival. It achieved brilliant breakthroughs but lacked the scale, integration and capital needed to dominate in the post-war world.
The U.S. built a decentralized, collaborative ecosystem, one that tightly integrated massive government funding of universities for research and prototypes while private industry built the solutions in volume. This U.S. university/government research model would become the blueprint for modern innovation ecosystems around the world.
Both systems were influenced by the experience and personalities of their nation’s science advisor.
Summary
By the end of the war, the U.S. and British innovation systems had produced radically different outcomes.
Britain remained a leader in theoretical science and defense technology, but its failure to commercialize wartime innovations meant it lost ground in key industries like computing and consumer electronics.
The U.S. emerged as the global leader in science and technology, with innovations like radar, computing, and nuclear power driving its post-war economic boom. The university-industry-government partnership became the foundation of Silicon Valley, the aerospace sector, and the biotechnology industry.
This is the model that China has emulated.