Today must be the day for comparing apples and oranges. In his latest Washington Post column, Robert Samuelson wonders why wage growth is so weak despite the number of people getting jobs, something that must surely be obvious to anyone who isn’t writing from an office in Washington, DC, or a house in Alexandria, VA:
After correcting for inflation, wage gains remain sluggish. In April, average weekly earnings for nonsupervisory workers were up 3 percent from a year earlier, to $785.55. Meanwhile, prices as measured by the consumer price index were up 2 percent. Considering that the economy has been expanding for nearly a full decade — a record if it continues through June — this is perplexing, even allowing that wages are growing faster at the top than in the middle.
Theories abound to explain wage behavior. Average workers (it’s said) still recall the ferocity of the 2007-09 recession and are more reluctant to chase higher wages by leaving their present jobs. For similar reasons, employers resist large wage gains. They want to remain competitive in another recession. Both are willing to trade stronger job security for slightly lower pay.
Other theories blame sluggish wage growth on changes in the labor market. The decline of unions — a phenomenon that stretches back to the 1960s — has weakened workers’ bargaining power. Globalization has had the same effect, because in many industries production can be moved abroad where wages are lower. China is an obvious example.
Weak productivity gains amplify the effect. In the long run, strong productivity improvements are the source of higher wages and salaries. From 2010 to 2017, annual productivity increases averaged only 0.5 percent, according to the Bureau of Labor Statistics. This compared with a post-World War II average of 2 percent. Slower productivity advances mean smaller increases in labor compensation for most workers.
We now have a new theory from the McKinsey Global Institute, the research arm of the McKinsey consulting company. It has long been known that the labor share of national income (GDP, for gross domestic product) has been shrinking. In 1947, the labor share was 65.4 percent of GDP; in 2016, it was 56.7 percent of GDP. These figures combined all forms of labor compensation: wages, salaries, fringe benefits.
Meanwhile, the capital share of income — income accruing to shareholders, business owners and other investors — rose roughly from 34.6 percent to 43.3 percent. Worryingly, three quarters of this shift has occurred since 2000. Again, these trends had been known. But McKinsey went a step further. It estimated how much of the slowdown in wages could be attributed to the rise in capital income’s share.
The answer is: about a quarter. That’s the impact of the shift from labor to capital income. The rest of the wage slowdown reflects poor productivity growth (general efficiency) and the tendency of high-income wages and salaries to grow faster than middle-income wages. If the distribution between labor and capital income had remained unchanged since 1998, the average American worker would have a whopping $4,000 in extra annual pay, according to McKinsey’s calculations.
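As a quick aside, the share arithmetic in the quoted passage is easy to verify. A minimal Python sketch, using only the figures quoted above (the 0.75 factor is the column’s “three quarters of this shift has occurred since 2000” claim; the $4,000 figure is McKinsey’s own estimate and isn’t reproducible from these numbers alone):

```python
# Figures as quoted in the column (labor share of GDP, percent).
labor_1947 = 65.4
labor_2016 = 56.7

# The capital share is the remainder of national income.
capital_1947 = 100.0 - labor_1947   # 34.6, matching the column
capital_2016 = 100.0 - labor_2016   # 43.3, matching the column

# Total shift from labor to capital, in percentage points.
shift = labor_1947 - labor_2016

# "Three quarters of this shift has occurred since 2000."
since_2000 = 0.75 * shift

print(f"total shift: {shift:.1f} points")
print(f"of which since 2000: {since_2000:.1f} points")
```

So roughly 8.7 percentage points of national income moved from labor to capital over seven decades, about 6.5 of them since 2000, which is what makes the post-2000 acceleration the worrying part.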
Let me propose some explanations from the ground rather than from 50,000 feet. The first is fear: people don’t ask for raises for fear of being replaced by someone from a temp firm or placement company who’ll work for lower pay and no benefits. Second, people find it hard to leave their present jobs for better-paying ones, and there are multiple reasons for that; multiple-job households are one. I could also go into a diatribe on how resume-screening software gives an advantage to people who look good on paper but in practice are incapable of doing the job.
In a slight digression, I heard recently from an excellent source that the IT operations of a major financial services company were shut down for a week by malware, resulting in the loss of at least a week’s work. That would never have happened, say, twenty years ago. Their present operations are being run by temps and placements. There’s a different ethos at work.
Another possible explanation is that the jobs reports aren’t telling the whole story: the jobs being created don’t pay as well as the ones being lost. The jobs being created in health care aren’t jobs for physicians or the highest-paid technicians; they’re jobs for bedpan emptiers that pay minimum wage. People earning $25 an hour are still getting fired, and the best jobs they can find pay $15 an hour. That depresses the wage figures.