Longer Commutes, Shorter Lives: The Costs of Not Investing in America

For decades, spending on the future put the nation ahead of all others. What would it take to revive that spirit?

An illustration of various historical photographs depicting technologies in a collage.
Photo illustration by Chantal Jahchan

Every morning in 21st-century America, thousands of people wake up and prepare to take a cross-country trip. Some are traveling for business. Others are visiting family or going on vacations. Whether they are leaving from New York or Los Angeles, Atlanta or Seattle, their trips have a lot in common.

They leave their homes several hours before their plane is scheduled to depart. Many sit in traffic on their way to the airport. Once they arrive, they park their cars and make their way through the terminal, waiting in a security line, taking off their shoes, removing laptops and liquids from their bags. When they finally get to the gate, they often wait again because their flight is delayed. The flight itself typically lasts about six hours heading west, and the travelers then need to find ground transportation to their destination. Door to door, cross-country journeys often last 10 or even 12 hours.
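
To make the door-to-door arithmetic concrete, here is a minimal sketch in Python that totals the legs of a hypothetical westbound trip. Every segment time is an illustrative assumption, not a figure from this article, apart from the roughly six-hour flight mentioned above.

```python
# Illustrative door-to-door trip arithmetic. All segment times are
# hypothetical assumptions, except the roughly six-hour westbound flight.
trip_segments_hours = {
    "drive to airport (with traffic)": 1.0,
    "parking and terminal walk": 0.5,
    "security line": 0.75,
    "gate wait (including delay)": 1.25,
    "westbound flight": 6.0,
    "ground transport to destination": 1.0,
}

total = sum(trip_segments_hours.values())
for segment, hours in trip_segments_hours.items():
    print(f"{segment:>35}: {hours:.2f} h")
print(f"{'door-to-door total':>35}: {total:.2f} h")  # about 10.5 hours
```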

In the sweep of human history, these trips remain a marvel of ingenuity. For centuries, long-distance travel required weeks or months and could be dangerous. Today, somebody can eat breakfast on one end of the continental United States and dinner on the other. If you narrow the focus to recent decades, however, you will notice another striking fact about these trips: Almost none of the progress has occurred in the past half-century. A cross-country trip today typically takes more time than it did in the 1970s. The same is true of many trips within a region or a metropolitan area.

Compare this stagnation with the progress of the previous century. The first transcontinental railroad was completed in 1869, and passenger trains ran on its route days later, revolutionizing a journey that had taken months. People could suddenly cross the country in a week. Next came commercial flight. In the 1930s, an airplane could beat a train across the country by hopscotching from city to city. Finally, the jet age arrived: The first regularly scheduled nonstop transcontinental flight occurred on Jan. 25, 1959, from Los Angeles to New York, on a new long-range Boeing jet, the 707.

The poet Carl Sandburg was among the passengers on the return American Airlines flight that first day. “You look out of the window at the waves of dark and light clouds looking like ocean shorelines,” he wrote about the trip, “and you feel as if you are floating away in this pleasantly moving room, like the basket hanging from the balloon you saw with a visiting circus when you were a boy.” Sandburg was born in 1878, when crossing the country took almost a week. His cross-country flight took five and a half hours.

In the more than 60 years since then, there has been no progress. Instead, the scheduled flight time between Los Angeles and New York has become about 30 minutes longer. Aviation technology has not advanced in ways that speed the trip, and the skies have become so crowded that pilots reroute planes to avoid traffic. Nearly every other part of a cross-country trip, in airports and on local roads, also lasts longer. All told, a trip across the United States can take a few more hours today than in the 1970s.

The speed at which people can get from one place to another is one of the most basic measures of a society’s sophistication. It affects economic productivity and human happiness; academic research has found that commuting makes people more unhappy than almost any other daily activity. Yet in one area of U.S. travel after another, progress has largely stopped over the past half-century.

A black-and-white photograph inside an airplane.
The lounge on a Pan Am Boeing 707, the first jetliner model to make regular nonstop flights across the United States. Pictorial Parade/Archive Photos/Getty Images
A black-and-white photograph inside of a train.
Passengers on Amtrak’s Metroliner around 1974. Hum Images/Alamy

In 1969, Metroliner trains made two-and-a-half-hour nonstop trips between Washington and New York. Today, there are no nonstop trains on that route, and the fastest trip, on Acela trains, takes about 20 minutes longer than the Metroliner once did. Commuter railroads and subway lines in many places have also failed to become faster. When I ride the New York City subway, I don’t go from Point A to Point B much faster than my grandparents did in the 1940s. For drivers — a majority of American travelers — trip times have increased, because traffic has worsened. In the California metropolitan area that includes Silicon Valley, a typical rush-hour drive that would have taken 45 minutes in the early 1980s took nearly 60 minutes by 2019.

The lack of recent progress is not a result of any physical or technological limits. In other parts of the world, travel has continued to accelerate. Japan, China, South Korea and countries in Europe have built high-speed train lines that have tangibly improved daily life. Because the United States is less densely populated, high-speed trains would not work in much of this country. But they could transform travel in California, the Northeast and a few other regions — and it is not as if this country has been improving its highways and airline network instead of its rail system. All have languished.

Why has this happened? A central reason is that the United States, for all that we spend as a nation on transportation, has stopped meaningfully investing in it. Investment, in simple terms, involves using today’s resources to make life better in the long term. For a family, investment can involve saving money over many years to afford a home purchase or a child’s college education. For a society, it can mean raising taxes or cutting other forms of spending to build roads, train lines, science laboratories or schools that might take decades to prove their usefulness. Historically, the most successful economic growth strategy has revolved around investment. It was true in ancient Rome, with its roads and aqueducts, and in 19th-century Britain, with its railroads. During the 20th century, it was true in the United States as well as Japan and Europe.

Investments are certainly not guaranteed to pay off: Just as families sometimes buy houses that decline in value, governments sometimes waste taxpayer dollars on programs that accomplish little. Still, successful people and societies have always understood that these risks are unavoidable. Failing to invest enough resources in the future tends to be the bigger mistake.

A black-and-white photograph of two workers on the Golden Gate Bridge.
Construction underway on the Golden Gate Bridge, a four-year, $34 million project, in 1936. Peter Stackpole/The LIFE Picture Collection/Shutterstock
A black-and-white photograph of two roads running over hills.
Interstate 90 in 1972. A cross-country trip today typically takes more time than it did in the 1970s. Associated Press

Investment is not simply a synonym for a bigger government. For decades, liberals and conservatives have been arguing over the size of government. Liberals prefer that the public sector play a larger role, and conservatives prefer a smaller one, in one realm after another: health care, retirement, environmental protection, business regulation and more. Investment has often been swept up in that debate. Some of the steepest declines in government spending on research and development — a crucial form of investment — occurred after Ronald Reagan won the presidency in 1980 with a message that less government was the solution to the country’s economic troubles. Government investment has never recovered. In recent years, federal spending on research and development has been less than half as large, relative to the size of the economy, as it was in the mid-1960s.

In truth, investment is consistent with both a conservative and a liberal economic philosophy, as American leaders dating to Alexander Hamilton and Thomas Jefferson have recognized. Conservatives believe that the government should do the minimum necessary to create a flourishing society, and investment passes both tests: minimum and necessary. It passes the “minimum” test because many investments are surprisingly inexpensive compared with social insurance programs or the military. Last year, Social Security cost six times as much money as federal R.&D., and spending on the military and veterans was five times as large as R.&D. spending. Investment also passes the “necessary” test — because the private sector tends to do less of it than a healthy economy needs.

Investments are expensive for a private company, and only a fraction of the returns typically flows to the original investors and inventors. Despite patents, other people find ways to mimic the invention. Often, these imitators build on the original in ways that are perfectly legal but would not have been possible without the initial breakthrough. Johannes Gutenberg did not get rich from inventing the printing press, and neither did Tim Berners-Lee from creating the World Wide Web in 1989.

The earliest stages of scientific research are difficult for the private sector to support. In these stages, the commercial possibilities are often unclear. An automobile company, for example, will struggle to justify spending money on basic engineering research that may end up being useful only to an aerospace company. Yet such basic scientific research can bring enormous benefits for a society. It can allow people to live longer and better lives and can lay the groundwork for unforeseen commercial applications that are indeed profitable. A well-functioning capitalist economy depends on large investments in research that the free market, on its own, usually will not make. The most obvious recent example was the crash program to create a Covid-19 vaccine.

During the laissez-faire years leading up to the Great Depression, the United States invested relatively little money in scientific research, and the country fell behind. Europe dominated the Nobel Prizes during this period, and European countries, including Nazi Germany, began World War II with a technological advantage over the United States. The scariest evidence could be seen in the North Atlantic Ocean and Gulf of Mexico, where German U-boats sank more than 200 ships — sometimes visible from U.S. soil — early in the war and killed 5,000 Americans. Recognizing the threat from this technological gap, a small group of American scientists and government officials began an urgent effort to persuade Franklin Roosevelt to support an investment program larger than anything before. The result, called the National Defense Research Committee, funded research into radar, sonar, planes, ships, vehicles and guns. It included a race to develop an atomic bomb before the Nazis did.

That effort arguably won World War II. American factories learned how to build a ship in less than three weeks, down from eight months at the war’s start. “We were never able to build a tank as good as the German tank,” Lucius Clay, an American general, said. “But we made so many of them that it didn’t really matter.”

Clay’s boss, the American military officer overseeing this productive effort, was Gen. Dwight D. Eisenhower. He and the people around him absorbed the lesson about the awesome power of American investment. After becoming president in 1953, Eisenhower recognized that if government did not make vital investments, nobody would. “The principal contradiction in the whole system comes about because of the inability of men to forgo immediate gain for a longtime good,” he once wrote. “We do not yet have a sufficient number of people who are ready to make the immediate sacrifice in favor of a long-term investment.”

A political poster for Dwight D. Eisenhower.
Dwight D. Eisenhower ran for president as a conservative promising to rein in the excesses of the Democratic Party. His administration demonstrated the relative affordability of government investment. Circa Images/Universal History Archive/Universal Images Group, via Getty Images

The Eisenhower investment boom has no peer in U.S. history, at least not outside a major war. Its best-known achievement, the Interstate System, allowed people and goods to move around the country much more quickly than before. That highway system was one example among many. The Cold War — especially after the 1957 launch of Sputnik raised fears that the Soviet Union had become scientifically dominant — offered a rationale. As a share of the country’s total economic output, federal spending on research and development roughly tripled between the early 1950s and early 1960s. This measure did not even capture highway construction and some other programs.

Eisenhower’s agenda demonstrated the relative affordability of government investment. He had run for president as a conservative promising to rein in the excesses of the Democratic Party’s 20-year hold on the White House. And he did restrain some forms of federal spending, balancing the budget for parts of his presidency. Still, the federal government was easily able to afford a much larger investment budget. Eisenhower was able to be both a fiscal conservative and the president who nearly tripled R.&D. spending.

It is worth pausing to reflect on how many global industries were dominated by American companies by the late 20th century. It happened in aviation (Boeing, American, United and Delta) and automobiles (General Motors, Ford and, later, Tesla), as well as energy (Exxon and Chevron), telecommunications (AT&T and Verizon) and pharmaceuticals (Pfizer, Johnson & Johnson, Eli Lilly and Merck). The United States built the world’s best system of higher education, with its universities occupying more than half of the top spots in various rankings of research institutions. American citizens dominated the scientific Nobel Prizes too.

None of this was inevitable. While there were multiple causes — including the country’s large consumer market and a vibrant private sector shaped by a national ethos that celebrates risk-taking — the postwar investment boom was vital. That boom fit the historical pattern of successful government investments. First, the government paid for basic scientific research that the private sector was not conducting. Then the government helped create an early market for new products by buying them. Boeing, for example, got its start during World War I selling planes to the Navy. Later, the government paid for research that facilitated jet-airline technology and, by extension, the Boeing 707, the plane that launched transcontinental jet travel.

One of the clearest case studies is the computer industry, the same industry that would become known for a cadre of libertarian-leaning executives who dismissed the importance of government. In reality, American dominance in the digital economy would not exist without decades of generous government investment, partly because the private sector failed to see its potential in the industry’s early days.

In the 1930s, a Harvard physics graduate student named Howard Aiken designed one of the world’s first computers, nicknamed the Mark, with help from IBM engineers. But IBM’s top executives, then focused on a mundane punch-card system that helped other companies keep track of their operations, were so unimpressed that they allowed Aiken to take the computer to a laboratory at Harvard University. During World War II, the U.S. Navy took over the lab. The Mark — 51 feet long and eight feet tall, weighing nearly five tons and with 750,000 parts, including visible gears, chains and an electric motor — helped the military perform complex calculations to make weapons more efficient. The New York Times described the computer as the Algebra Machine, and a Boston newspaper called it the Robot Brain. The military came to rely on it so heavily that the lab operated 24 hours a day. The lab had a phone connected directly to the Navy’s Bureau of Ordnance in Washington so the officers could demand immediate solutions to their most pressing problems.

“We used to shake every time that darn thing rang,” recalled Grace Hopper, a former math professor at Vassar College who worked in the lab as a Navy officer and would become a pioneering computer scientist. “The pressure was terrific.” At one point, the mathematician John von Neumann arrived at the lab bearing a long set of complex problems. He did not say why he needed them solved. After he and the team there solved them, von Neumann left to continue secretly working on the atomic-bomb project.

A black-and-white photograph of operators in front of panels.
Operators working for the Manhattan Project in Tennessee during World War II. Prisma Bildagentur/Universal Images Group, via Getty Images
A black-and-white photograph of a row of B-24s.
B-24 bombers on the final assembly line at Ford’s Willow Run plant in Michigan in 1943. Bettmann/Getty Images
A black-and-white photograph of The Mark.
The Mark, an early computer relied upon by the U.S. Navy during World War II, in its laboratory at Harvard University in 1944. PhotoQuest/Getty Images

Despite the role that computers played in winning the war, most of corporate America still did not recognize their importance afterward. Into the 1950s, IBM executives — focused on their lucrative punch-card business — remained wary of investing in the development of any large new computer. “It didn’t move me at all,” Thomas Watson Jr., IBM’s chairman, wrote in 1990. “I couldn’t see this gigantic, costly, unreliable device as a piece of business equipment.”

Watson and other executives were not ignorant or uncreative. They were among the most successful businesspeople in the country. Their failure was structural, stemming from the resources at their disposal and the financial incentives that constrained them. Only one organization had enough money and a sufficient long-term horizon to bankroll the creation of the computer industry: the federal government.

It could afford the inevitable setbacks that basic research involved. The federal government could insist that researchers build on one another’s ideas, rather than working in separate laboratories, unaware of related breakthroughs. The government could patiently finance research that was making progress but not yet ready for commercial applications. As soon as an invention was shown to be useful to the largest organization in the country — the military — the federal government could guarantee huge amounts of revenue through military contracts. As late as 1959, federal agencies financed about 85 percent of the country’s electronics research.

The military even made possible IBM’s belated entrance into computing: After its top executives realized their company would otherwise fade, they made it their top priority to win a bidding competition to create the computers needed for a network of radar stations across Alaska and Canada that would watch for a Soviet attack. Watson would later say that the contract was a watershed for the company.

Without a doubt, government officials make plenty of mistakes when choosing which projects to fund. They misjudge an idea’s potential or allow political considerations to influence decisions. Some of these mistakes turn into symbols of government’s supposed fecklessness, like Solyndra, a doomed clean-energy company that the Obama administration funded. Yet these failures tend to be cheap relative to the size of the federal budget, at least in the United States. (The risks of overinvestment are more serious in an authoritarian system like the old Soviet Union or contemporary China.) Even more important, a few big investment successes can produce returns, in economic growth and the resulting tax revenue, that cover the costs for dozens of failures. IBM and Google can pay for a lot of Solyndras.
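
The portfolio logic can be made concrete with a back-of-the-envelope calculation. The figures below are invented purely for illustration; they do not come from any actual federal budget.

```python
# Hypothetical portfolio of government research bets: many cheap
# failures, a few outsized successes. All figures are invented.
failures = 30                # projects that return nothing
successes = 2                # projects that pay off
cost_per_project = 0.5       # $0.5 billion each, success or failure
return_per_success = 50.0    # growth plus tax revenue, in billions

total_cost = (failures + successes) * cost_per_project
total_return = successes * return_per_success

print(f"total invested: ${total_cost:.1f} billion")   # $16.0 billion
print(f"total returned: ${total_return:.1f} billion") # $100.0 billion
# Two successes more than cover thirty failures.
```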

Just as important, government can reduce its involvement as an industry matures and allow the market system to take over. After the government creates the initial demand for a new product, the sprawling private sector — with its reliance on market feedback and the wisdom of crowds — often does a better job allocating resources than any bureaucratic agency. When a company makes a better version of a product, it gains market share. The incentives for selling goods that improve people’s lives can be enormous. Market capitalism may not do an adequate job of subsidizing basic scientific research, but it is very efficient at spreading the eventual fruits of that research. Government-funded research led to the development of penicillin, cortisone, chemotherapy, vaccines and cardiac treatments, which the private sector then produced and distributed. In transportation, the government built the air-traffic and Interstate-highway systems, which private companies used. Government funding helped develop the satellite, the jet engine, the snowmobile and the microwave oven.

By the end of the 1960s, the United States had become the most broadly prosperous country the world had ever known. Incomes had risen markedly for rich, middle class and poor alike — and more for the poor and middle class than the rich. The Black-white wage gap fell markedly during these decades, even in the presence of virulent racism. Americans had faith during these years that the future could be better than the past, and they forged that future.

Not every form of investment is as tangible as a highway or computer. Education also fits the definition of a program that requires spending money today mostly to improve the quality of life tomorrow. In the middle of the 20th century, education was the investment that turbocharged many other investments.

Even before the Eisenhower investment boom, the United States took a relatively inclusive approach to education. Several of the country’s founders believed that the success of their new democracy depended on an educated citizenry. The Massachusetts Constitution, which John Adams drafted, called for “spreading the opportunities and advantages of education.” The country obviously did not come close to achieving these ideals. It generally denied formal education to Black Americans, and many schools excluded girls. White boys from modest backgrounds often began working at young ages. But the early United States was nonetheless ahead of many other countries in the breadth of its grade schools. By the middle of the 19th century, the American population had surpassed Germany’s as the world’s most widely educated.

When parts of Europe began to catch up, the United States raced ahead again, opening public high schools in the early 1900s. Britain did not enact a law making it possible for many low-income students to attend high school until 1944. That same year, the United States Congress passed the G.I. Bill of Rights, and the postwar investment boom helped make good on the bill’s promise by increasing spending on both K-12 schools and universities.

A black-and-white photograph inside an elementary school in 1950.
First graders in a new elementary school in Munster, Ind., in 1950. When the United States produced more and better-educated graduates than its rivals, American industries reaped the benefits. Bettmann/Getty Images

Education has always had its skeptics. Europe was slow to educate its masses because its leaders believed that doing so was a waste of resources: They didn’t see why the working class needed to read literature, study history and learn mathematics. In the United States today, many people still believe that only a narrow subset of the population benefits from college — that its benefits are overrated and that most Americans would be better off pursuing immediate employment. And different people are indeed best served by different kinds of education. Education is also not a cure-all for the American economy. Tax rates, antitrust policy, workers’ bargaining power and many other areas matter enormously.

But downplaying the importance of education is a mistake, one that the United States avoided during much of its rise to global pre-eminence. Relative to its economic rivals, the country could call on more college graduates to fill its professional ranks and more high-school graduates to fill its blue-collar ranks. IBM, Boeing, Pfizer, General Motors and other leading companies benefited from government investments in both basic science and mass schooling. As Claudia Goldin (the latest Nobel laureate in economics) and Lawrence Katz have argued, the 20th century was the American century in large part because it was the human-capital century. Education — knowledge — can help people live better by allowing them to learn from past errors and make new discoveries. It can help companies and workers accomplish tasks more effectively and produce goods that other people want to buy.

The evidence is everywhere. Today, high school graduates earn more and are less likely to be out of work than people without a high school diploma, as has been the case for more than a century. College graduates earn even more. Not only does mass education increase the size of the economic pie; it also evens out the distribution. The spread of American high schools and then colleges meant that graduates were no longer an elite group. The wage premium that they earned was spread among a larger group of workers.

The benefits extend far beyond economic measures. Life expectancy for Americans without a college degree has fallen to its lowest level since at least the early 1990s, the scholars Anne Case and Angus Deaton have shown, while it is only slightly below its pre-Covid peak for college graduates. In 2021, the average American with a bachelor’s degree could expect to live eight years longer than somebody without one. More-educated Americans also report being more satisfied with their lives. They are less likely to suffer from chronic pain or to abuse alcohol and drugs. They are more likely to be married and to live with their young children.

Yes, the relationship between education and well-being is partly correlation rather than causation. Talented, hardworking people are more likely to finish school partly because of those characteristics, and they might have thrived even if they dropped out. But academic research has found that much of the relationship is causal. A clever study in Florida compared students whose grades and scores barely earned them admission to a public four-year college with students who just missed the cutoff; those students who were admitted fared significantly better in later life.

Although the rest of the world was slow to do so, it eventually came to recognize the benefits of the American approach to mass education and to copy it. In the 1970s, educational attainment began to surge in Europe and Asia. Political leaders effectively acknowledged that their elitist approach to education had been wrong. They understood that the amount of education that people need to thrive tends to rise over the course of history. The economy becomes more complex, thanks to technological change, and citizens need new knowledge and skills to take advantage of that technology, or else their labor will be replaced by it. When you think about education in these terms, you start to realize that the appropriate amount of schooling for a typical citizen changes over time. If 13 years — kindergarten through 12th grade — made sense a century ago, it surely is not enough today.

The chaos of the 1960s and 1970s helped end the era of great American investment. Crime rose rapidly during those decades. The country fought a losing war in Vietnam. Political leaders were murdered. A president resigned in scandal. And the economy seemed to break down, with both unemployment and inflation soaring. The causes were complex — including wars in the Middle East that upended global energy markets — but Americans understandably came to question their own government’s competence.

In their frustration, many embraced a diagnosis that a group of conservative intellectuals had been offering for decades, mostly without winning converts. It held that the post-New Deal United States had put too much faith in government regulation and not enough in the power of the market to allocate resources efficiently. These intellectuals included Milton Friedman and Robert Bork, while the politician who successfully sold their vision was Reagan. The new consensus has become known as neoliberalism, a word that in recent years has turned into a catchall epithet to describe the views of moderate Democrats and conservatives. But the word is nonetheless meaningful. The neoliberal revolution in economic policy changed the country’s trajectory. After 1980, regulators allowed companies to grow much larger, often through mergers. The government became hostile to labor unions. Tax rates on the affluent plummeted. And Washington pulled back from the major investments it had been making.

Federal spending on research and development, which had already come down from its post-Eisenhower high, declined in the 1980s and 1990s. In recent years, it has accounted for less than half as large a share of G.D.P. as it did 60 years ago. The country’s roads, bridges, rail networks and air-traffic system have all atrophied — hence the lengthening of travel times. The share of national income devoted to government spending on education stopped rising in the 1970s and has remained stagnant since. Less selective colleges, which tend to educate working-class students, tend to be especially lacking in resources. Other countries, meanwhile, have passed by the United States. Every American generation born between the late 1800s and mid-1900s was the most educated in the world. Americans under age 50 no longer hold this distinction. The lack of progress among American men has been especially stark. Men’s wages, not coincidentally, have risen extremely slowly in recent decades.

The stagnation of investment does not stem only from the size of government. It also reflects the priorities of modern government, as set by both Republicans and Democrats. The federal government has grown — but not the parts oriented toward the future and economic growth. Spending has surged on health care, Social Security, antipoverty programs, police and prisons. (Military spending has declined as a share of G.D.P. in recent decades.) All these programs are important. A decent society needs to care for its vulnerable and prevent disorder. But the United States has effectively starved programs focused on the future in favor of those focused on the present. The country spent about twice as much per capita on the elderly as on children in recent years, according to the Urban Institute. Even the affluent elderly can receive more government help than impoverished children.

These choices help explain why the United States has fallen behind other countries in educational attainment, why our child-poverty rate is so high, why it takes longer to cross the country than it once did. As Eugene Steuerle, an economist with a long career in Washington, has said, “We have a budget for a declining nation.”

Americans have come to believe that the country is, in fact, declining. Less than 25 percent of Americans say that the economy is in good or excellent condition today. Whether the economy has been growing or shrinking during the 21st century, whether a Democrat or Republican has been in the White House, most Americans have usually rated the economy as weak.

Pundits and politicians — who tend to be affluent — sometimes express befuddlement about this pessimism, but it accurately reflects reality for most Americans. For decades, incomes and wealth have grown more slowly than the economy for every group other than the very rich. Net worth for the typical family shrank during the first two decades of the 21st century, after adjusting for inflation. The trends in many noneconomic measures of well-being are even worse: In 1980, life expectancy in the United States was typical for an industrialized country. American life expectancy now is lower than in any other high-income country — including Canada, Japan, South Korea, Australia, Britain, France, Germany, Italy and even less-wealthy European countries like Slovenia and Greece.
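
For readers who want the mechanics, “adjusting for inflation” simply means deflating nominal dollars by a price index. Here is a minimal sketch; the index values are hypothetical stand-ins, not actual Consumer Price Index readings.

```python
# Restate a nominal dollar amount in the dollars of another period by
# scaling with a price index. The index values below are hypothetical.
def to_real_dollars(nominal, index_then, index_now):
    """Express `nominal` (from the period when the index stood at
    `index_then`) in dollars of the period when it stands at `index_now`."""
    return nominal * index_now / index_then

# $100,000 of net worth when the index stood at 170, restated in
# dollars of a period when the index stands at 255:
print(to_real_dollars(100_000, 170, 255))  # 150000.0
```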

This great American stagnation has many causes, but the withering of investment is a major one. The economists and other experts who advise politicians have increasingly come to this conclusion, which explains why President Biden has made investment the centerpiece of his economic strategy — even if that isn’t always obvious to outsiders. He has signed legislation authorizing hundreds of billions of dollars to rebuild the transportation system, subsidize semiconductor manufacturing and expand clean energy. These are precisely the kinds of investments the private sector tends not to make on its own. All told, Biden has overseen the largest increase in federal investment since the Eisenhower era. Notably, the infrastructure and semiconductor bills both passed with bipartisan support, a sign that parts of the Republican Party are coming to question the neoliberal consensus. As was the case during the 1950s, the threat from a foreign rival — China, this time — is focusing some policymakers on the value of government investment.

There is plenty of reason to doubt that the country has reached a true turning point. Biden’s investment program remains much smaller in scale than Eisenhower’s, relative to the size of the economy. Many Republicans continue to oppose government investment, as the recent chaos in the House of Representatives indicates. It is possible that we are now living through a short exception to the country’s long investment slump.

Whatever happens, the stakes should be clear by now. A government that does not devote sufficient resources to the future will produce a society that is ultimately less prosperous, less innovative, less healthy and less mobile than it could be. The citizens of such a society will grow frustrated, and with good reason.


This article is adapted from the book “Ours Was the Shining Future,” which will be published on Oct. 24 by Penguin Random House.

Opening illustration: Source photographs from Underwood Archives/Getty Images; Bettmann/Getty Images; iStock/Getty Images; Gamma-Keystone, via Getty Images; Encyclopedia Britannica/Getty Images; Pictorial Parade/Getty Images.

David Leonhardt writes The Morning, The Times’s flagship daily newsletter. He has previously been an Op-Ed columnist, Washington bureau chief, co-host of “The Argument” podcast, founding editor of The Upshot section and a staff writer for The Times Magazine. In 2011, he received the Pulitzer Prize for commentary.
