March 15, 2021
The 21st-century economy has been a two-decade series of punches in the gut.
The century began in economic triumphalism in the United States, with a sense that business cycles had been vanquished and prosperity secured for a blindingly bright future. Instead, a mild recession was followed by a weak recovery followed by a financial crisis followed by another weak recovery followed by a pandemic-induced collapse. A couple of good years right before the pandemic aside, it has been two decades of overwhelming inequality and underwhelming growth — an economy in which a persistently weak job market has left vast human potential untapped, helping fuel social and political dysfunction.
Those two decades coincide almost precisely with my career as an economics writer. It is the reason, among my colleagues, I have a reputation for writing stories that run the gamut from ominous to gloomy to terrifying.
But strange as it may seem in this time of pandemic, I’m starting to get optimistic. It’s an odd feeling, because so many people are suffering — and because for so much of my career, a gloomy outlook has been the correct one.
Predictions are a hard business, of course, and much could go wrong that makes the decades ahead as bad as, or worse than, the recent past. But this optimism is not just about the details of the new pandemic relief legislation or the politics of the moment. Rather, it stems from a diagnosis of three problematic megatrends, all related.
There has been a dearth of economy-altering innovation, the kind that fuels rapid growth in the economy’s productive potential. There has been a global glut of labor because of a period of rapid globalization and technological change that reduced workers’ bargaining power in rich countries. And there has been persistently inadequate demand for goods and services that government policy has been unable to fix.
There is not just one reason to think that these negative trends have run their course. There are 17.
The ketchup might be ready to flow
In 1987, economist Robert Solow said, “You can see the computer age everywhere but in the productivity statistics.” Companies were making great use of rapid improvements in computing power, but the overall economy wasn’t really becoming more productive.
This analysis was right until it was wrong. Starting around the mid-1990s, technological innovations in supply chain management and factory production enabled companies to squeeze more economic output out of every hour of work and dollar of capital spending. This was an important reason for the economic boom of the late 1990s.
The Solow paradox, as the idea underlying his quote would later be called, reflected an insight: An innovation, no matter how revolutionary, will often have little effect on the larger economy immediately after it is invented. It often takes many years before businesses figure out exactly what they have and how it can be used, and years more to work out kinks and bring costs down.
In the beginning, it may even lower productivity! In the 1980s, companies that tried out new computing technology often needed to employ new armies of programmers as well as others to maintain old, redundant systems.
But once such hurdles are cleared, the innovation can spread with dizzying speed.
It’s like the old ditty: “Shake and shake the ketchup bottle. First none will come and then a lot’ll.”
Or, in a more formal sense, economists Erik Brynjolfsson, Daniel Rock and Chad Syverson call this the “productivity J-curve,” in which an important new general-purpose technology — they use artificial intelligence as a contemporary example — initially depresses apparent productivity, but over time unleashes much stronger growth in economic potential. It looks as if companies have been putting in a lot of work for no return, but once those returns start to flow, they come faster than once seemed imaginable.
There are several areas where innovation seems to be at just such a point, and not just artificial intelligence.
2020s battery technology looks kind of like 1990s microprocessors
Remember Moore’s Law? It was the idea that the number of transistors that could be put on an integrated circuit would double every two years as manufacturing technology improved. That is the reason you may well be wearing a watch with more computer processing power than the devices that sent people into outer space in the 1960s.
Battery technology isn’t improving at quite that pace, but it’s not far behind it. The price of lithium-ion battery packs has fallen 89% in inflation-adjusted terms since 2010, according to BloombergNEF, and is poised for further declines. There have been similar advances in solar cells, raising the prospect of more widespread inexpensive clean energy.
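To see how close the comparison is, the two headline figures can be put on the same footing as annualized rates. This is a rough back-of-the-envelope sketch; the only inputs are the two-year doubling period and the 89% decline over the 2010–2020 window cited above:

```python
# Moore's Law: transistor counts double every two years.
# Annualized, that is a 2**(1/2) - 1, or roughly 41%, gain per year.
moore_annual_gain = 2 ** (1 / 2) - 1

# Battery packs: an 89% real price decline over 2010-2020 means
# prices fell to 11% of their starting level in about 10 years.
years = 10
battery_annual_decline = 1 - 0.11 ** (1 / years)

print(f"Moore's Law, annualized gain:  {moore_annual_gain:.0%}")       # ~41%
print(f"Battery price, annual decline: {battery_annual_decline:.0%}")  # ~20%
```

A roughly 20% annual price decline is slower than Moore's Law, but it compounds the same way: sustained over a decade, it changes what is economically possible.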
Another similarity: Microprocessors and batteries are not ends unto themselves, but rather technologies that enable lots of other innovation. Fast, cheap computer chips led to software that revolutionized the modern economy; cheap batteries and solar cells could lead to a wave of innovation around how energy is generated and used. We’re only at the early stages of that process.
Emerging innovations can combine in unexpected ways
In the early part of the 20th century, indoor plumbing was sweeping the nation. So was home electricity. But the people installing those pipes and those power lines presumably had no idea that by the 1920s, the widespread availability of electricity and free-flowing water in homes would enable the adoption of the home washing machine, a device that saved Americans vast amounts of time and backbreaking labor.
It required not just electricity and running water, but also revolutions in manufacturing techniques, production and distribution. All those innovations combined to make domestic life much easier.
Could a combination of technologies now maturing create more improvement in living standards than any of them could in isolation?
Consider driverless cars and trucks. They will rely on long-building research in artificial-intelligence software, sensors and batteries. After years of hype, billions of dollars in investment and millions of miles of test drives, the possibilities are starting to come into view.
Waymo, a sister company of Google, has opened a driverless taxi service to the public in the Phoenix suburbs. Major companies including General Motors, Tesla and Apple are in the hunt as well, along with many smaller competitors.
Apply the same logic to health care, warehousing, heavy industry and countless other fields. Inventions maturing now could be combined in new ways we can’t yet imagine.
The pandemic has taught us how to work remotely
Being cooped up at home may pay some surprising economic dividends. As companies and workers have learned how to operate remotely, it could allow more people in places that are less expensive and that have fewer high-paying jobs to be more productive. It could enable companies to operate with less office space per employee, which in economic terms means less capital needed to generate the same output. And it could mean a reduction in commuting time.
Even after the pandemic recedes, if only 10% of office workers took advantage of more remote work, that would have big implications for the United States’ economic future — bad news if you are a landlord in an expensive downtown perhaps, but good news for overall growth prospects.
Even Robert Gordon is (a little) more optimistic!
Gordon wrote the book on America’s shortfall in innovation and productivity in recent decades — a 784-page book in 2016, to be precise. Now Gordon, a Northwestern University economist, is kind of, sort of, moderately optimistic. “I would fully expect growth in the decade of the 2020s to be higher than it was in the 2010s, but not as fast as it was between 1995 and 2005,” he said recently.
Crises spur innovation
The mobilization to fight World War II was a remarkable feat. Business and government worked together to drastically increase the productive capacity of the economy, put millions to work, and advance countless innovations such as synthetic rubber and the mass production of aircraft.
Similarly, the Cold War generated a wave of public investment and innovation, such as satellites (a byproduct of the space race) and the internet (originally intended to provide decentralized communication in the event of a nuclear attack).
Could our current crises spur similar ambition? Already the COVID-19 pandemic has accelerated the usage of mRNA technology for creating new vaccines, which could have far-reaching consequences for preventing disease.
And as the 2020s progress, the deepening sense of urgency to reduce carbon emissions and cope with the fallout of climate change is the sort of all-encompassing challenge that could prove as galvanizing as those experiences — with similar implications for investment and innovation.
Tight labor markets spur innovation, too
Why did the Industrial Revolution begin in Britain instead of somewhere else? One theory is that relatively high wages there (a result of international trade) created an urgency for firms to substitute machinery for human labor. Over time, finding ways to do more with fewer workers generated higher incomes and living standards.
But why might the labor market of the 2020s be a tight one? It boils down to two big ideas: shifts in the global economy and demographics that make workers scarcer in the coming decade than in recent ones; and a newfound and bipartisan determination on the part of policymakers in Washington to achieve full employment.
There’s only one China
Imagine an isolated farm town with 100 people.
Five of the 100 own the farms. An additional 10 act as managers on behalf of the owners. And there are five intellectuals who sit around thinking big thoughts. The other 80 people are laborers.
What would happen if suddenly another 80 laborers showed up, people who were used to lower living standards?
The intellectuals might tell a complex story about how the influx of labor would eventually make everyone better off, as more land was cultivated and workers could specialize more. The owners and their managers would be happy because they would be instantly richer (they could pay people less to plow the fields).
But the existing 80 laborers — competing for their jobs with an influx of lower-paid people — would see only immediate pain. The long-term argument that everybody gets richer in the end wouldn’t carry much weight.
That’s essentially what has happened in the past few decades as China has gone from being isolated to being deeply integrated in the world economy. When the country joined the World Trade Organization in 2001, its population of 1.28 billion was bigger than that of the combined 34 advanced countries that make up the Organization for Economic Cooperation and Development (1.16 billion).
But that was a one-time adjustment, and wages are rising rapidly in China as it moves beyond low-end manufacturing and toward more sophisticated goods. India, the only other country with comparable population, is already well integrated into the world economy. To the degree globalization continues, it should be a more gradual process.
There’s only one Mexico
For years, American workers were also coming into competition with lower-earning Mexicans after enactment of the North American Free Trade Agreement in 1994. As with China, the new dynamic improved the long-term economic prospects for the United States, but in the short run it was bad for many American factory workers.
But it too was a one-time adjustment. Even before President Donald Trump, trade agreements under negotiation were for the most part no longer focused on making it easier to import from low-labor-cost countries. The main aim was to improve trade rules for American companies doing business in other rich countries.
The offshoring revolution is mostly played out
Once upon a time, if you were an American company that needed to operate a customer-service call center or carry out some labor-intensive information-technology work, you had no real choice but to hire a bunch of Americans to do it. The emergence of inexpensive, instant global telecommunication changed that, allowing you to put work wherever costs were the lowest.
In the first decade of the 2000s, American companies did just that on a massive scale, locating work in countries such as India and the Philippines. It’s a slightly different version of the earlier analogy involving the farm; a customer-service operator in Kansas was suddenly in competition with millions of lower-earning Indians for a job.
But it’s not as if the internet can be invented a second time.
Sensing a theme here? In the early years of the 21st century, a combination of globalization and technological advancements put American workers in competition with billions of workers around the world.
It created a dynamic in which workers had less bargaining power, and companies could achieve cost savings not by creating more innovative ways of doing things but by exploiting a form of labor-cost arbitrage. That may not be the case in the 2020s.
Baby boomers can’t work forever
The surge of births that took place in the two decades after World War II created a huge generation with far-reaching consequences for the economy. Now, their ages ranging from 57 to 75, the baby boomers are retiring, and that means opportunity for the generations that came behind them.
As the boomers seek to continue consuming — spending their amassed savings, pensions and Social Security benefits — there will be relatively stable demand for goods and services and a relatively smaller pool of workers to produce them.
According to the Social Security Administration’s projections of the so-called “dependency ratio,” for every 100 people in their prime working years of 20 to 64 in 2030, there will be 81 people outside that age range. In 2020 that number was 73.
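Those projections translate directly into workers per dependent. A minimal sketch, using only the two figures cited above:

```python
# Social Security Administration projections cited above:
# people outside the 20-64 age range per 100 people inside it.
dependents_per_100_workers = {2020: 73, 2030: 81}

for year, dependents in dependents_per_100_workers.items():
    workers_per_dependent = 100 / dependents
    print(f"{year}: {workers_per_dependent:.2f} workers per dependent")
# 2020: 1.37 workers per dependent; 2030: 1.23 -- roughly a 10% drop
# in one decade, which is what tightens the labor market.
```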
That is bad news for public finances and for the headline rate of growth in gross domestic product, but good news for those in the workforce. It should give workers more leverage to demand raises and give employers incentives to invest in productivity-enhancing software or machinery.
The millennials are entering their prime
Spending has a life cycle. Young adults don’t make much money. As they age, they start to earn more. Many start families and begin spending a lot more, buying houses, cars and everything else needed to raise children. Then they tend to cut back on spending as the kids move out of the house.
That, anyway, is what the data says takes place on average. The rate of consumption spending soars for Americans in their 20s and 30s, and peaks sometime in their late 40s. It’s probably not a coincidence that some of the best years for the American economy in recent generations were from 1983 to 2000, when the ultralarge baby-boom generation was in that crucial high-spending period.
Guess what generation is in that life phase in the 2020s? The millennials, an even larger generation than the boomers.
They’ve had a rough young adulthood, starting their careers in the shadow of the Great Recession. But all that adult-ing they’re starting to do could have big, positive economic consequences for the decade ahead.
Everybody likes it hot
Twelve years ago, a Democratic president took office at a time of economic crisis. He succeeded at ending the crisis, but the expansion that followed was a disappointment, with years of slow growth at a time when millions were either unemployed or out of the workforce.
The overwhelming tone of the economic-policy discussion during those years, however, was different. President Barack Obama spoke of his plans to reduce the budget deficit. Republicans in Congress demanded even more fiscal restraint. Top Federal Reserve officials fretted about inflation risks, even when unemployment was high and inflation persistently low.
The Trump presidency changed that discussion. Even as tax cuts widened the budget deficit, interest rates stayed low. Even as the jobless rate fell to levels not seen in nearly five decades, inflation stayed low. It was evident, based on how the economy performed in 2018 and 2019, and up until the pandemic began, that the U.S. economy could run hotter than the Obama-era consensus seemed to allow. That insight has powerful implications for the 2020s.
Biden wants to let it rip
President Joe Biden and congressional Democrats were determined to learn the lessons of the Obama era. Biden was deeply involved in the 2009 stimulus plan, which proved inadequate to the task of creating and sustaining a robust recovery.
The lesson that Biden and the Democratic Party took from 2009 was straightforward: Do whatever it takes to get the economy humming, and the politics will work in your favor. That thinking helped lead to the $1.9 trillion relief bill he signed Thursday.
Powell wants to let it rip
“To call something hot, you need to see heat,” Federal Reserve Chair Jerome Powell said in 2019. That’s as good a summary of the Fed’s approach to the economy as any.
In more formal terms, the Fed has a new framework for policy called “Flexible Average Inflation Targeting.” It is in effect a repudiation of past Fed strategies of preemptively slowing the economy to prevent an outbreak of inflation predicted by economic models.
Now, the Fed says it will raise interest rates in response to actual inflation in the economy, not just forecasts, and will not act simply because the unemployment rate is lower than models say it can sustainably get.
Nearly every time he speaks, Powell sounds like a true believer in the church of full employment.
Republicans are getting away from austerity politics
Consider an event that took place less than three months ago (that may feel like three years ago): Overwhelming bipartisan majorities in Congress passed a $900 billion pandemic-relief bill. Then a Republican president threatened to veto it, not because it was too generous, but because it was too stingy.
Trump didn’t get his way on increasing the $600 payments to most Americans to $2,000, and he signed the legislation anyway, grudgingly. But the episode reflects a shift away from the focus on fiscal austerity that prevailed in the Obama era.
With the current stimulus bill, opposition in conservative talk radio was relatively muted. Republicans voted against it, but there hasn’t been quite the fire-and-brimstone sense of opposition evident toward the Obama stimulus a dozen years ago.
As the party becomes more focused on the kinds of culture-war battles that Trump made his signature, and its base shifts away from business elites, it wouldn’t be surprising if we saw the end of an era in which cutting government spending was its animating idea. This would imply a U.S. government that aims to keep flooding the economy with cash no matter who wins the next few elections.
The post-pandemic era could start with a bang
The past year has been terrible on nearly every level. But it’s easy to see the potential for the economy to burst out of the starting gate like an Olympic sprinter.
That could have consequences beyond 2021. A rapid start to the post-pandemic economy could create a virtuous cycle in which consumers spend; companies hire and invest to fulfill that demand; and workers wind up having more money in their pockets to consume even more.
Americans have saved an extra $1.8 trillion during the pandemic, reflecting government help and lower spending. That is money that people can spend in the months ahead, or it could give them a comfort level that they have adequate savings and can spend more of their earnings.
Things are also primed for a boom time in the executive suite. CEO confidence is at a 17-year high, and near-record stock market valuations imply that companies have access to very cheap capital. There is no reason corporate America can’t hire, invest and expand to take advantage of the post-pandemic surge in activity.
And on a psychological level, doesn’t everybody desperately want to return to feeling a sense of joy, of exuberance? That is an emotion that could prove the most powerful economic force of them all.
Economics may be a dismal science, and those of us who write about it are consigned to see what is broken in the world. But sometimes, things align in surprising ways, and the result is a period in which things really do get better. This is starting to look like one of those times.
c.2022 The New York Times Company