Where Are America’s Jobs?
What happened to the 15 million jobs that were supposed to be created in the past 10 years but weren't?
It’s a big question, and a fascinating one, and the piece that explores it is well written and well researched. Alas, the answer is rather unsatisfying: Damned if anyone knows.
The years between the brief 2001 recession and the 2008 financial collapse gave us solid growth in our gross national product, soaring corporate profits, and a low unemployment rate—but job creation lagged stubbornly behind, more so than in any economic expansion since World War II.
The Great Recession wiped out what amounts to every U.S. job created in the 21st century. But even if the recession had never happened, if the economy had simply treaded water, the United States would have entered 2010 with 15 million fewer jobs than economists say it should have.
We know what should have transpired over the past 10 years: the completion of a circle of losses and gains from globalization. Emerging technology helped firms send jobs abroad or replace workers with machines; it should have also spawned domestic investment in innovative industries, companies, and jobs. That investment never happened—not nearly enough of it, in any case.
If we can’t figure out why, we may be doomed to a future that feels like a long jobless recovery, no matter how fast our economy grows. “It’s the trillion-dollar question,” says David E. Altig, senior vice president and research director for the Federal Reserve Bank of Atlanta, where economists are beginning to explore the shifts that have clubbed American workers like a blackjack. “Something big has happened. I really don’t think we have a complete story yet.”
We certainly didn’t see it coming. At the turn of the millennium, the Bureau of Labor Statistics predicted that the U.S. economy would create nearly 22 million net jobs in the 2000s, only slightly fewer than the boom 1990s yielded. The economists predicted “good opportunities for jobs” and “an optimistic vision for the U.S. economy” through 2010.
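The 15-million-job shortfall follows from simple subtraction: the BLS projected nearly 22 million net new jobs for the decade, and the figure in the text implies that only about 7 million materialized before the recession erased them. A rough sketch of that arithmetic, using round illustrative numbers derived from the figures in this piece rather than actual BLS series data:

```python
# Back-of-the-envelope arithmetic behind the "15 million missing jobs" figure.
# Both numbers below are round approximations taken from this article,
# not official BLS series values.
projected_net_jobs = 22_000_000        # BLS forecast of net new jobs for the 2000s
jobs_created_pre_recession = 7_000_000 # implied net gain before 2008 (illustrative)

# Even if the recession had never wiped out those gains, the economy
# would still have fallen short of the projection by:
shortfall = projected_net_jobs - jobs_created_pre_recession
print(shortfall)  # 15000000 -- the 15 million jobs the article describes
```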
Businesses would reap the gains of new trading markets, the projection said, and continue to invest in technologies to boost the productivity of their operations. High-tech jobs would abound, both for systems analysts with four years of college and for computer-support analysts with associate’s degrees. The manufacturing sector would stop a decades-long jobs slide, and technology would lead the turnaround. Hundreds of thousands of newly hired factory workers would make cutting-edge electrical and communications products, including semiconductors, satellites, cable-television equipment, and “cellular phones, modems, and facsimile and answering machines.”
A few researchers caught early warning signs of the trend. In 2003, economists Erica L. Groshen and Simon Potter at the Federal Reserve Bank of New York warned in a paper that “structural changes” in the economy appeared to be hindering job creation. Groshen and Potter noted that after the past two recessions, in 1990-91 and 2001, economic growth had picked up long before jobs began to reappear, bucking a long historical trend of growth and jobs returning in tandem. The explanation, Groshen and Potter said, was a shift away from the time-honored American tradition of laying off workers in bad times and recalling them when the clouds parted.
“Most of the jobs added during the recovery have been new positions in different firms and industries, not rehires,” they wrote. “In our view, this shift to new jobs largely explains why the payroll numbers have been so slow to rise: Creating jobs takes longer than recalling workers to their old positions and is riskier” when recovery still appears fragile.
In other words, American companies had adopted a more cold-blooded attitude toward recessions, one that fit the new model of globalization and automation. Technology made it easier to lay off your 100 least-effective workers and ship their jobs to India, or to replace them with a software program that made your remaining workforce dramatically more productive.
Guesses abound: Americans aren’t sufficiently educated or properly trained for the jobs that are being created. Companies are hoarding cash, for whatever reason, rather than investing in equipment and people. Profits are going to shareholders rather than being reinvested. Mostly, though, it appears that Americans just don’t make anything anymore.
A recent paper by researchers at the Asian Development Bank Institute concluded that the iPhone, one of the United States’ top innovations of the past decade, actually contributes nearly $2 billion to our trade deficit because it is almost entirely produced and assembled in Asia. The paper also raises a conundrum for lawmakers and business leaders alike: If Apple moved its assembly line to the United States and created domestic jobs but didn’t raise the price of the iPhone, the company would still turn a 50 percent profit on every one it sold.
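The paper’s margin claim is easy to check with back-of-the-envelope arithmetic. The figures below are approximations of the ones reported in the ADBI study (a roughly $500 price, about $179 in total manufacturing cost, and about $6.50 of that for assembly in China), used purely for illustration:

```python
# Rough check of the ADBI paper's claim that U.S. assembly would still
# leave Apple about a 50 percent margin. All figures are approximations
# of numbers reported in the study, not exact values.
price = 500.00            # approximate per-unit iPhone price
total_cost_asia = 179.00  # approximate total manufacturing cost, Asian assembly
assembly_china = 6.50     # approximate Chinese assembly cost per unit

# Suppose U.S. assembly labor cost ten times the Chinese figure:
assembly_us = assembly_china * 10
total_cost_us = total_cost_asia - assembly_china + assembly_us

margin_us = (price - total_cost_us) / price
print(f"{margin_us:.1%}")  # 52.5% -- still roughly the 50 percent the paper cites
```

Even with assembly wages assumed at ten times the Chinese level, the cost per phone rises only about $60, which is why the margin barely moves.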
Maybe Apple’s greed is at fault. Maybe the government is to blame for not making the industrial climate more hospitable to Apple and other job producers. The harsh reality is that workers, companies, and lawmakers all need to readjust if we ever hope to rev up the job-creation machine again.
Regardless, there’s no consensus on the cause, much less on what, if anything, we should do about it. But it’s not good. The iPhone case is scary in that we now skip the few years when something is “high tech” and therefore manufactured by American workers, which means the money is in inventing and marketing such things, not in building them. That’s not a recipe for a robust middle class.