February 23, 2015
The combination of decades of deficit spending and more recent experiments in radical monetary policy has contributed to a slow but steady increase in the cost of living for all Americans.
WHEN PIMCO founder William Gross coined the term the “new normal,” he both stated the obvious and offered a fresh insight. Most people understand in a visceral way that things have changed dramatically when it comes to jobs and economic opportunities since the financial crisis of 2008. But more than something new, the current state of the U.S. economy represents a reversion to the old normal—the price deflation and slack job market that existed in the 1920s and 1930s—which was interrupted by World War II and the subsequent decades of the Cold War and massive government spending.
It is safe to say that everyone wishes for a return to business as usual, at least insofar as “normal” is understood by most Americans. Plentiful jobs, along with rising home and stock prices, worked for most of us. The only problem is that the old normal economy of the 2000s saw prices for homes, stocks and other asset classes growing at levels that were clearly unsustainable. When home prices in the United States rose at double-digit annual rates in the mid-2000s, the one thing you could be sure of was that the pace of price change was unsound and probably a function of external factors such as low interest rates and easy credit.
Since the 2008 financial bust, the U.S. economy has been anything but normal. The housing market, for example, rebounded at double-digit rates in 2011–2013, but now seems to be losing momentum rapidly. Near-zero interest rates maintained by the Federal Open Market Committee (FOMC) prevented an immediate apocalypse in the form of a 1930s-style price deflation, but this is both good and bad news. The lack of a true debt deflation commensurate with the degree of excess prior to 2008 has left the U.S. economy hanging in a form of economic stasis. Without price deflation and debt restructuring, there is no economic “bounce” and thus no recovery in demand or jobs.
TODAY, THE U.S. economy is like a cardiac patient on artificial life support. Flat employment, flat credit growth (at least for productive purposes) and falling inflation-adjusted incomes are the attributes of the new normal. Nobel laureate Robert Shiller draws an explicit parallel between today’s “new normal” of no or slow wage and job growth and the late 1930s, when the U.S. economy began to sink under the weight of FDR’s New Deal experiment:
The depression that followed the stock-market crash of 1929 took a turn for the worse eight years later, and recovery came only with the enormous economic stimulus provided by World War II, a conflict that cost more than 60 million lives. By the time recovery finally arrived, much of Europe and Asia lay in ruins.
Shiller’s point about how World War II rescued America from the deflation of the late 1930s is often overlooked, sometimes deliberately, by economists. FDR’s antibusiness rhetoric during the New Deal actually made the deflation of the 1930s worse by chasing private capital out of the U.S. economy. In their classic book A Monetary History of the United States, 1867–1960, Milton Friedman and Anna Jacobson Schwartz documented how private capital formation in the United States essentially went to zero by the late 1930s, leaving the public sector as the only engine of growth into the 1950s and 1960s. Large corporations and banks aligned with the federal government were the most significant source of credit and economic prosperity in that period. It took until the 1970s for private risk taking to truly reemerge in the U.S. economy, driving growth for decades thereafter. After these dramatic swings in growth and demand, however, we still have a muddled view of what constitutes long-term economic expansion.
While politicians and central banks can artificially increase the nominal growth rate for relatively short periods of time—we know this as a “bubble”—such machinations create no real wealth. We feel wealthier for a time, as in the Roaring Twenties and the 2000s. Yet when any significant proportion of the population tries to take its chips off the gaming table, the good times end. Given that an economy only truly grows wealth at the rate of real GDP growth, as Alex Pollock of the American Enterprise Institute observes, why do so many economists and the members of the FOMC call for policies to push higher and unsustainable rates of economic growth? The answer comes down to a basic difference between conservatives and liberals when it comes to inflation, a conflict of visions that has its roots in the dark days of the Great Depression.
Some on the left, like author William Greider, believe that a little inflation is good for working people and debtors, even if it erodes the purchasing power of wages. But just as a steady 2 percent increase in real wealth provides enormous benefits to a society, a steady 2 percent annual inflation rate compounds relentlessly, robbing workers and families of the ability to meet basic needs. For example, an item that cost $20 in 1930 would cost $283 as of this writing, reflecting a cumulative rate of inflation of 1,315 percent, according to the Consumer Price Index (CPI) maintained by the Bureau of Labor Statistics (BLS).
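The CPI figures quoted above can be checked with simple arithmetic. The sketch below uses only the two prices from the text and assumes an 85-year span (1930 to the article's 2015 dateline) to back out the implied average annual rate:

```python
# Back-of-the-envelope check of the CPI figures quoted in the text.
# The two prices are from the article; the 85-year span (1930-2015) is assumed.
price_1930 = 20.0
price_now = 283.0

# Cumulative inflation: the price rose by (283 - 20) / 20 = 13.15x the base,
# i.e. 1,315 percent, matching the figure in the text.
cumulative_pct = (price_now - price_1930) / price_1930 * 100
print(round(cumulative_pct))  # 1315

# Spread over 85 years, that works out to a modest-looking annual rate.
years = 85
annual_pct = ((price_now / price_1930) ** (1 / years) - 1) * 100
print(round(annual_pct, 1))  # 3.2 percent per year
```

The point of the second figure is the one the article keeps returning to: an annual rate that sounds harmless compounds into a fourteenfold price increase over a lifetime.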
Remember that because of various adjustments and omissions from the underlying data, the CPI greatly understates the actual rate of inflation experienced by individual consumers. Inflation, after all, is a monetary phenomenon that occurs when the value of money declines relative to the goods and services it can purchase. Small wonder that Americans have seen a steady decrease in real income over the past several decades. And yet the Federal Reserve and other central banks explicitly target inflation levels that are ultimately destroying consumer purchasing power.
WHEN POLITICIANS or members of the FOMC promise growth above that 2 percent long-term average, they are being more than a little disingenuous. Not only will using government policy to stimulate demand and keeping interest rates low create financial bubbles and other problems in the short term, but such expedients will also actually hurt all of us by eroding the purchasing power of wages and income. The housing boom of the 2000s, for example, was supported with public-policy initiatives from Washington and low interest rates from the Fed, but the result was a massive financial collapse and the destruction of trillions of dollars in notional wealth. Meanwhile, the cost of housing has continued to climb, even as real incomes have fallen.
Nobel laureate Paul Krugman is one of the leading exponents of the inflationist view. Week in and week out in his New York Times column, Krugman derides those who spend too much time worrying about inflation and advocates an increase in government spending, fueled by higher taxes or additional debt, as a means of stimulating demand for goods and services, and thus jobs. Krugman ridicules “the wealthy” for advocating low inflation and insists that the road to salvation is to continue the policies of the past half century, which includes using government spending and easy money to increase nominal private demand.
But Krugman is wrong. The problem we all face is not runaway wage and price inflation of the type seen in the 1970s, but a more pernicious and deadly form of slow erosion in purchasing power for all people, combined with slow or no real economic growth. Krugman and his fellow travelers on the left correctly point out that there is little or no wage inflation in the U.S. economy, but that does not mean that inflation, broadly defined, is not a serious problem. The combination of decades of deficit spending and more recent experiments in radical monetary policy has contributed to a slow but steady increase in the cost of living for all Americans, an increase that’s caused real incomes and the value of savings to fall.
Krugman and other advocates of secular inflation point to the period after World War II as proof that deficit spending and a large national debt help economic growth. But such views are myopic. The massive government spending during and after World War II helped to pull the United States out of the debt and deflation of the 1930s, much of which was worsened by the excesses of FDR’s New Deal. But you cannot treat the period immediately following the Second World War as “normal” in any dimension.
In fact, the key driver of prosperity following World War II was not government spending but demographics. Johnny came marching home, got married and had lots of babies. Between 1950 and 2000, the civilian labor force grew by an average of 1.6 percent per year, according to the BLS. This may not sound like a big number, but over fifty years that meant that the cohort of working-age Americans more than doubled in number from 62 million to 141 million by the start of the twenty-first century. The BLS estimates that, between 2000 and 2050, the working-age population will grow just 36 percent, or about 0.6 percent annually.
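The demographic arithmetic above is easy to verify. A rough sketch, using the growth rates quoted from the BLS (the small gaps versus the article's 141 million and 36 percent figures reflect rounding in the annual rates):

```python
# Compounding check of the labor-force figures quoted from the BLS.
# Rates and the 62 million base are from the text; everything else is arithmetic.
labor_force_1950 = 62e6

# 1.6 percent a year compounded over 50 years slightly more than doubles the base.
factor_1950_2000 = 1.016 ** 50
print(round(factor_1950_2000, 2))  # 2.21 -> about 137 million, near the 141M cited

# 0.6 percent a year compounded over 50 years adds only about a third.
factor_2000_2050 = 1.006 ** 50
print(round((factor_2000_2050 - 1) * 100))  # 35 percent, near the BLS's 36 percent
```

The contrast is the article's point: halving the annual growth rate does not halve the fifty-year result, it cuts it by far more, because compounding works in both directions.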
With a smaller demand “pull” from shrinking demographic growth, a slower economy is hardly surprising. If we recall that the real, long-term growth of wealth is a function of increases in population and production, then the fact of slower U.S. population growth in the twenty-first century suggests that we will also see more modest growth in GDP. Under such circumstances, what is normal? More important, with nominal GDP growth in the 2–3 percent range absent shocks from external factors, and the Fed targeting similar levels of price inflation, will Americans see any improvement in their real inflation-adjusted income or wealth?
Sadly, you will never hear Federal Reserve chair Janet Yellen and the members of the FOMC admit that the real, long-term growth rate for wealth or GDP is just 2 percent. Because of the dual mandate given to the U.S. central bank of encouraging employment and ensuring price stability, the FOMC has tended to focus policy on trying to encourage job growth while pretending that inflation is not a problem. Indeed, over the past two decades, as real growth prospects have waned, the FOMC has used progressively lower interest rates to both stimulate growth and rescue the economy from the aftereffects of the latest Fed-inspired boom. Whatever concerns the Fed still harbors regarding long-term price stability have been overwhelmed by the political imperative to achieve short-term job growth.
Economists in both private and public life make a living by talking about levels of potential growth that are far above the long-term average increase in real wealth. One reason for this is that suggesting that the long-term average growth rate will not exceed 2 percent implies a future of limited job opportunities, something that’s hardly popular with voters or elected officials. The remarkable growth rates claimed by China’s authoritarian regime illustrate the political imperative behind such efforts. As a result of the one-child policy, China’s population is growing at just 0.5 percent annually, according to the World Bank. When you see official Chinese GDP growth rates of more than ten times the rate of population increase, the one thing you can be sure about is that the claimed rate of “growth” is unsustainable and driven by politically motivated government spending.
OVER THE past several years, members of the FOMC have maintained ultralow interest rates, ostensibly to boost economic activity in such areas as housing and job growth. But despite low interest rates and massive purchases of government debt and mortgage securities by the FOMC, volumes of residential mortgage lending have plummeted to decade-low levels, and job growth remains anemic and of poor quality. Instead of stimulating a recovery in the real economy, the policies followed by the FOMC under first Ben Bernanke and now Janet Yellen have only created new asset bubbles in sectors like real estate, public equities and the corporate bond market. With interest rates and commodity prices now falling around the world and the dollar soaring against other currencies, the FOMC seems to have created a “deflation trap” whereby investors are unwilling to put capital at risk as they await higher interest rates. Meanwhile, job creation and spending suffer due to a lack of investment.
Some Fed officials are increasingly uncomfortable with the Fed’s policies. Richard Fisher, president of the Federal Reserve Bank of Dallas, dissented from his colleagues on the FOMC, saying he’d like to see rates begin to go up in 2015. Philadelphia Fed chief Charles Plosser dissented on similar grounds. But neither Fisher nor Plosser any longer votes on the FOMC. A decidedly left-wing majority on the Fed’s policy-making body continues to support the extraordinary low-rate policies in an effort to boost job growth. In the European Union, the European Central Bank is pursuing a similar policy.
But the sad fact remains that the use of interest rates or fiscal policy to stimulate nominal growth is of limited utility today. In the 1970s and 1980s, when the children of the post–World War II baby boom were starting families of their own, a little bit of push in the form of low interest rates or increased government spending resulted in a substantial increase in job creation and economic activity—along with higher inflation. Today, with lower population growth rates and relatively high levels of public debt in most industrial nations, the utility of fiscal or monetary policy in boosting growth rates is very limited—but inflation remains a problem that affects all consumers, rich and poor.
In the Fed’s most recent report to Congress, Yellen repeated the Fed’s explicit embrace of a 2 percent inflation rate, in order to help the employment picture, all the while paying lip service to the Fed’s responsibility to ensure stable prices. She stated:
The inflation rate over the longer run is primarily determined by monetary policy, and hence the Committee has the ability to specify a longer-run goal for inflation. The Committee reaffirms its judgment that inflation at the rate of 2 percent, as measured by the annual change in the price index for personal consumption expenditures, is most consistent over the longer run with the Federal Reserve’s statutory mandate.
What Yellen is saying explicitly is that it is not possible for the FOMC to achieve the legal mandate of “maximum employment” without tolerating a 2 percent inflation rate. But a 2 percent rate of inflation erodes roughly a third of purchasing power over twenty years and, compounded over about thirty-five years, will rob American consumers of half of the purchasing power of their wages and savings. Not only do the Fed’s publicly stated policies doom many Americans to poverty in the future, but they are also an explicit admission that the Fed’s dual legal mandate set by Congress in 1978 is unworkable. The Fed cannot both pursue “maximum employment” and safeguard against inflation. Indeed, there is a growing doubt that the Fed can truly change the employment picture. But the political attraction of promising people higher wage and job growth, it seems, is so powerful that members of the FOMC and central bankers around the world cannot help themselves. Ultimately, using low interest rates in an attempt to boost demand and job creation will fail.
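The erosion of purchasing power under a steady 2 percent inflation target is a straightforward compounding exercise. A minimal sketch (the 2 percent rate is the Fed's stated target; the horizons are illustrative):

```python
# How steady 2 percent inflation compounds against purchasing power.
rate = 0.02  # the Fed's stated inflation target

def purchasing_power(years, rate=rate):
    """Fraction of original purchasing power remaining after `years` of inflation."""
    return 1 / (1 + rate) ** years

# Over twenty years, roughly a third of purchasing power is gone...
print(round(1 - purchasing_power(20), 2))  # 0.33

# ...and it takes about thirty-five years for half of it to disappear,
# consistent with the rule of 72 (72 / 2 = 36 years to halve).
print(round(purchasing_power(35), 2))  # 0.5
```

For a worker saving over a forty-year career, in other words, the Fed's own target quietly confiscates half of every dollar set aside at the start.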
Christopher Whalen is senior managing director and head of research at Kroll Bond Rating Agency. He is the author of Inflated: How Money and Debt Built the American Dream (Wiley, 2010) and the coauthor, with Frederick Feldkamp, of Financial Stability: Fraud, Confidence, and the Wealth of Nations (Wiley, 2014).