Washington Post editorial writer Charles Lane admits he has no earthly idea what a $15 minimum wage will do to California’s economy. Still, that doesn’t stop him from squeezing an 800-word column out of his utter absence of knowledge. Nice work if you can get it.
There’s a total lack of evidence that the potential benefits would outweigh potential costs — and ample reason to worry they would not.
Yes, true in the sense that it is literally impossible to gather evidence on the actual consequences of something that has yet to happen. But it’s not like cities, states, and the federal government haven’t raised the minimum wage hundreds of times over the past 80 years while producing little compelling evidence that the actual benefits (you know, higher wages) have ever been outweighed by the actual costs.
And no, contrary to Lane’s assertion, California’s proposed increase from $10 an hour in 2016 to $15 in 2022 — 50 percent over six years — is not particularly “Yuuuge.” From a historical perspective, it’s kinda the norm. In fact, if anything, it’s on the low side.
From 1939 to 1945 the federal minimum wage climbed 60 percent. From 1961 to 1968 the federal minimum wage climbed 60 percent. From 1974 to 1980 the federal minimum wage climbed 94 percent. From 1990 to 1997 the federal minimum wage climbed 54 percent. Hell, from 2007 to 2009, over just three years, the federal minimum wage climbed 41 percent.
And what of the devastating job-killing consequences of these unprecedented wage hikes? If the evidence shows some equally obvious pattern of correlated job losses, I’ve never seen it. In fact, in the years following the largest one-year minimum wage hike in our nation’s history — a whopping 87.5 percent increase from $0.40 in 1949 to $0.75 in 1950 — the unemployment rate plummeted from 5.9 percent in 1949 to 2.9 percent by 1953. Likewise, here in Washington State, after tipped workers enjoyed an 85 percent wage hike from 1989 to 1990, restaurant industry employment growth outpaced the rest of the economy over the following decade.
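The arithmetic behind these figures is easy to verify. Here is a quick Python sketch, working in cents to keep the math exact; the two wage pairs below are the ones quoted above, and the other historical runs can be checked the same way against the Department of Labor’s published wage history:

```python
def pct_increase(old_cents: int, new_cents: int) -> float:
    """Percentage increase from old_cents to new_cents."""
    return (new_cents - old_cents) / old_cents * 100

# The 1949 -> 1950 federal hike: $0.40 to $0.75
print(pct_increase(40, 75))      # 87.5

# California's proposed 2016 -> 2022 increase: $10 to $15
print(pct_increase(1000, 1500))  # 50.0
```
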
In short, California has no idea what it’s getting into, because it can’t; there is simply no experience from which to learn.
In short, that is simply a load of bull. We have plenty of experience with minimum wage hikes at this scale, and the evidence displays no discernible correlation between rising wages and rising unemployment. (If it did, minimum wage opponents would be busy citing that correlation, instead of just imagining it.)
As for Lane’s assertion that $15 as a percent of the median hourly wage would be “unprecedented”: yeah, maybe, but really, not by all that much. And should we even care?
In 1968 the federal minimum-to-median ratio stood at 55 percent. By 2022 California’s ratio would rise to about 69 percent, a 14-point departure from the 1968 norm; today’s 38 percent federal ratio sits a full 17 points below it. Yet I don’t hear Lane warning about the risk of falling so far behind the norm. Moreover, the 50 percent “benchmark” that Lane and others cite is an arbitrary ideal grounded more in economic tradition than in rigorous intellectual analysis:
Other industrial democracies with statutory minimum wages typically set theirs at half the national median wage, too.
And if other industrial democracies were to jump off a bridge, should we follow suit? Sorry, but “that’s the way we’ve always done it” isn’t a compelling economic argument.
But even if the 50 percent benchmark had a rational justification once upon a time, given the enormous changes to our economy over the past half-century, it’s no longer clear that minimum-to-median remains a relevant index even for purposes of comparison. The “median hourly wage” figure is for full-time non-supervisory work, a metric that ignores the rise of part-time employment, particularly among low-wage workers: Only 13.5 percent of US workers were part-timers in 1968. Today, that number stands at 18.5 percent. Yet about 64 percent of at-or-below minimum wage employees work part-time. And over the same period, the median wage has all but flat-lined, generating a 50 percent minimum-to-median benchmark that might never justify giving minimum-wage workers another raise.
Whatever the relevance of the minimum-to-median ratio a half-century ago, that ratio is simply not an apples-to-apples comparison to today’s.
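For concreteness, the ratio comparison above comes down to simple arithmetic on three numbers. A quick Python sketch (remember that the 69 percent figure for California is a projection, not a measurement):

```python
# Minimum-to-median wage ratios cited above, in percent.
ratio_1968 = 55       # federal ratio in 1968
ratio_ca_2022 = 69    # California's projected ratio at $15 in 2022
ratio_fed_now = 38    # current federal ratio

# How far does each sit from the 1968 norm?
points_above = ratio_ca_2022 - ratio_1968   # 14 points above the norm
points_below = ratio_1968 - ratio_fed_now   # 17 points below the norm

print(points_above, points_below)  # 14 17
assert points_above < points_below  # California's departure is the smaller one
```
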
No doubt Lane is correct that “economic theory strongly suggests” that the consequences in California could be dire. But standard economic theory always suggests such dire consequences from raising the minimum wage — predictions that never turn out to be true!
The basic trade-off, per Economics 101, is that the increased earnings that a higher minimum wage gives workers at the low end of the income scale might be offset by pricing those workers out of jobs they could have had at less than the new, higher minimum wage.
So says the theory. But the evidence from 80 years of minimum wage hikes suggests that Economics 101 is wrong.
The biggest flaw in the standard economic models is that they never account for the increased consumer demand generated by a higher minimum wage. They correctly consider the reduction in wages due to capital-labor substitution and productivity gains. They correctly consider the reduction in consumer spending due to higher prices. And they correctly consider the reduction in jobs and GDP due to these cumulative effects.
But what these standard models have consistently failed to consider is the “income effect”: the countervailing increase in consumer spending due to higher wages. Instead, these old models — the ones on which most economists still rely — seem to assume that the money paid out in higher wages is simply pissed into a black hole or something.
Economist Michael Reich of UC Berkeley addresses this glaring oversight by creating a new model that adds the income effect to the cumulative impact of substitution and scale. And in a policy brief on a proposed $15 minimum wage for New York State, Reich and his co-authors project a small cumulative net gain in employment, concluding that “the costs of the minimum wage will be borne by turnover reductions, productivity increases and modest price increases.”
Whatever your predisposition to the minimum wage, it’s hard to argue that Reich’s approach doesn’t make sense: The wages earned by low-wage workers are not sucked out of the economy; they’re plowed right back in, and at higher rates than those of more affluent workers who do not need to spend every penny they earn. I may lack the expertise to speak to Reich’s execution, but I have enough common sense to understand that the income effect is real.
But more convincingly, unlike the standard models touted by credulous status quoists like Lane, Reich’s model actually explains the actual data collected from actual minimum wage increases over the past 80 years. It predicts what happened rather than what did not.
And if this model proves true, it demands a paradigm shift in our entire approach to this debate: Rather than reflexively asking whether the benefits are worth the jobs we might lose if we raise the minimum wage, we need to start asking how many jobs we might lose if we don’t. For if raising the minimum wage would result in a cumulative net gain in employment, however small, doesn’t that necessarily mean that we have fewer jobs today than we otherwise would have had we not kept the minimum wage so low?
How many jobs have we lost — how much GDP have we sacrificed — by allowing the minimum wage to fall so far behind growth in productivity, median wage, and even inflation? “How low is too low?” minimum wage critics should be forced to answer.
Or to borrow a phrase from their own smug rhetoric: “If $7.25 an hour, why not $5.00? If $5.00 an hour, why not zero?”
The failure of opponents and supporters alike to even consider these questions just demonstrates how stuck we all are in the muddy intellectual morass of the old equilibrium economics. So convinced are we that there must be a cost to raising wages — a “basic trade-off, per Economics 101,” as Lane might say — that it’s never occurred to us to model the cumulative cost of keeping wages too low.
Well, no need.
Lane warns that with $15, California is running a dangerous “experiment” (or a “gamble” as Timothy B. Lee pejoratively chimes in on Vox), but we have already been running the largest economic experiment in our nation’s history — a low-wage trickle-down experiment — for the past forty-some years.
Sometime in the 1970s, after three decades of unparalleled growth and shared prosperity (an era in which we raised the minimum wage in step with productivity), we chose to conduct a massive experiment on the American economy: We chose to cut taxes on billionaires and to deregulate the financial industry. We chose to starve our schools and to saddle our children with more than $1.2 trillion worth of student debt. We chose to erode the minimum wage and the overtime threshold and the bargaining power of labor. We chose to believe in the promise that a rising tide of capital accumulation would lift all boats.
So now, after decades of stagnant wages, growing inequality, and the staggering rise of Trump, it is reasonable to conclude that America’s trickle-down experiment has failed.
(Alas, it is a failure that most journalists have yet to fully comprehend. For in reflexively repeating the discredited meme that $15 is a risky “experiment,” they repeat the most pernicious lie in the trickle-down repertoire: Not that if the rich get richer it’s good for the economy, but that if the poor get richer it’s bad.)
That is the lesson to learn from California and New York and Seattle and everywhere else that $15 has taken hold: the trickle-down experiment has failed. And so reasonable people, looking at this failed experiment’s results, have reasonably chosen to move on to something else.
Of course $15 is an experiment too. Everything in life is. That is how we and all our social institutions evolve. But $15 is an experiment based on a wealth of experience, a ton of supporting evidence, and an economic model that — unlike trickle-down — appears to work in practice as well as it works in theory.