Income inequality

The Real Lesson from $15? America’s Trickle-Down Experiment Has Failed


Washington Post editorial writer Charles Lane admits he has no earthly idea what a $15 minimum wage will do to California’s economy. Still, that doesn’t stop him from squeezing an 800-word column out of his utter absence of knowledge. Nice work if you can get it.

There’s a total lack of evidence that the potential benefits would outweigh potential costs — and ample reason to worry they would not.

Yes, true in the sense that it is literally impossible to gather evidence on the actual consequences of something that has yet to happen. But it’s not like cities, states, and the federal government haven’t raised the minimum wage hundreds of times over the past 80 years while producing little compelling evidence that the actual benefits (you know, higher wages) have ever been outweighed by the actual costs.

And no, contrary to Lane’s assertion, California’s proposed increase from $10 an hour in 2016 to $15 in 2022 — 50 percent over six years — is not particularly “Yuuuge.” From a historical perspective, it’s kinda the norm. In fact, if anything, it’s on the low side.


Hanna Brooks Olsen does spreadsheets old school.

From 1939 to 1945 the federal minimum wage climbed 60 percent. From 1961 to 1968 the federal minimum wage climbed 60 percent. From 1974 to 1980 the federal minimum wage climbed 94 percent. From 1990 to 1997 the federal minimum wage climbed 54 percent. Hell, from 2007 to 2009, over just three years, the federal minimum wage climbed 41 percent.
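For the skeptical, those percentages are easy to check. Here's a quick sanity check in Python, using what I understand to be the statutory federal rates in effect just before and after each run of increases (treat the dollar figures as illustrative; the percentages are the point):

```python
# Federal minimum wage before and after each multi-year run of increases.
# Dollar amounts are the statutory rates bracketing each period; the
# computed percentages match the figures cited above.
hikes = {
    "1939-1945": (0.25, 0.40),
    "1961-1968": (1.00, 1.60),
    "1974-1980": (1.60, 3.10),
    "1990-1997": (3.35, 5.15),
    "2007-2009": (5.15, 7.25),
    "1949-1950": (0.40, 0.75),     # largest single-year hike
    "CA 2016-2022": (10.00, 15.00),  # California's proposed increase
}

for period, (start, end) in hikes.items():
    pct = (end / start - 1) * 100
    print(f"{period}: ${start:.2f} -> ${end:.2f} = +{pct:.1f}%")
```

Run it and California's 50 percent lands comfortably below the 60, 94, and 87.5 percent hikes of earlier decades.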

And what of the devastating job-killing consequences of these unprecedented wage hikes? If the evidence shows some equally obvious pattern of correlated job losses, I’ve never seen it. In fact, in the years following the largest one-year minimum wage hike in our nation’s history — a whopping 87.5 percent increase from $0.40 in 1949 to $0.75 in 1950 — the unemployment rate plummeted from 5.9 percent in 1949 to 2.9 percent by 1953. Likewise, here in Washington State, after tipped workers enjoyed an 85 percent wage hike from 1989 to 1990, restaurant industry employment growth outpaced the rest of the economy over the following decade.

In short, California has no idea what it’s getting into, because it can’t; there is simply no experience from which to learn.

In short, that is simply a load of bull. We have plenty of experience with minimum wage hikes at this scale, and the evidence displays no discernible correlation between rising wages and rising unemployment. (If it did, minimum wage opponents would be busy citing that correlation, instead of just imagining it.)

As for Lane’s assertion that $15 as a percent of the median hourly wage would be “unprecedented”; yeah, maybe, but really, not by all that much. And should we even care?

In 1968 the federal minimum-to-median ratio stood at 55 percent. By 2022 California’s ratio would rise to about 69 percent — 14 points above the 1968 benchmark, a smaller gap than the 17 points by which today’s 38 percent federal ratio falls below it. Yet I don’t hear Lane warning about the risk of falling so far behind the norm. But moreover, the 50 percent “benchmark” that Lane and others cite is an arbitrary ideal grounded more in economic tradition than in rigorous intellectual analysis:

Other industrial democracies with statutory minimum wages typically set theirs at half the national median wage, too.

And if other industrial democracies were to jump off a bridge, should we follow suit? Sorry, but “that’s the way we’ve always done it” isn’t a compelling economic argument.
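As for the ratios themselves, the arithmetic is trivial. A minimal sketch (the 69 percent California figure is a projection for 2022, so treat it as an assumption):

```python
# Minimum-to-median ratios, as percentages of the median hourly wage.
ratio_1968_federal = 55       # federal minimum vs. median wage in 1968
ratio_2022_california = 69    # projected California ratio at $15
ratio_today_federal = 38      # current federal ratio at $7.25

# Distance of each ratio from the 1968 benchmark, in percentage points.
gap_california = abs(ratio_2022_california - ratio_1968_federal)  # above the norm
gap_federal = abs(ratio_today_federal - ratio_1968_federal)       # below the norm

print(f"California 2022 gap: {gap_california} points")
print(f"Federal today gap:   {gap_federal} points")
```

By this measure, a $15 California minimum would sit closer to the 1968 norm than today's federal minimum does.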

But even if the 50 percent benchmark had a rational justification once upon a time, given the enormous changes to our economy over the past half-century, it’s no longer clear that minimum-to-median remains a relevant index even for purposes of comparison. The “median hourly wage” figure is for full-time non-supervisory work, a metric that ignores the rise of part-time employment, particularly among low-wage workers: Only 13.5 percent of US workers were part-timers in 1968. Today, that number stands at 18.5 percent. Yet about 64 percent of at-or-below minimum wage employees work part-time. And over the same period of time, the median wage has all but flat-lined, generating a 50 percent minimum-to-median benchmark that might never justify giving minimum-wage workers another raise.

Stagnant wages

Whatever the relevance of the minimum-to-median ratio a half-century ago, that ratio is simply not an apples-to-apples comparison to today’s.

No doubt Lane is correct that “economic theory strongly suggests” that the consequences in California could be dire. But standard economic theory always suggests such dire consequences from raising the minimum wage — predictions that never turn out to be true!

The basic trade-off, per Economics 101, is that the increased earnings that a higher minimum wage gives workers at the low end of the income scale might be offset by pricing those workers out of jobs they could have had at less than the new, higher minimum wage.

So says the theory. But the evidence from 80 years of minimum wage hikes suggests that Economics 101 is wrong.

The biggest flaw in the standard economic models is that they never account for the increased consumer demand generated by a higher minimum wage. They correctly consider the reduction in wages due to capital-labor substitution and productivity gains. They correctly consider the reduction in consumer spending due to higher prices. And they correctly consider the reduction in jobs and GDP due to these cumulative effects.

But what these standard models have consistently failed to consider is the “income effect”: the countervailing increase in consumer spending due to higher wages. Instead, these old models — the ones on which most economists still rely — seem to assume that the money paid out in higher wages is simply pissed into a black hole or something.

Economist Michael Reich of UC Berkeley addresses this glaring oversight by creating a new model that adds the income effect to the cumulative impact of substitution and scale. And in a policy brief on a proposed $15 minimum wage for New York State, Reich and his co-authors project a small cumulative net gain in employment, concluding that “the costs of the minimum wage will be borne by turnover reductions, productivity increases and modest price increases.”

Whatever your predisposition to the minimum wage, it’s hard to argue that Reich’s approach doesn’t make sense: The wages earned by low-wage workers are not sucked out of the economy; they’re plowed right back in, and at higher rates than those of more affluent workers who do not need to spend every penny they earn. I may lack the expertise to speak to Reich’s execution, but I have enough common sense to understand that the income effect is real.

But more convincingly, unlike the standard models touted by credulous status quoists like Lane, Reich’s model actually explains the actual data collected from actual minimum wage increases over the past 80 years. It predicts what happened rather than what did not.

And if this model proves true, it demands a paradigm shift in our entire approach to this debate: For rather than reflexively asking the question of whether the benefits are worth the jobs we might lose if we raise the minimum wage, we need to start asking how many jobs might we lose if we don’t? For if raising the minimum wage would result in a cumulative net gain in employment, however small, doesn’t that necessarily mean that we have fewer jobs today than we otherwise would have had we not kept the minimum wage so low?

How many jobs have we lost — how much GDP have we sacrificed — by allowing the minimum wage to fall so far behind growth in productivity, median wage, and even inflation? “How low is too low?” minimum wage critics should be forced to answer.

Or to borrow a phrase from their own smug rhetoric: “If $7.25 an hour, why not $5.00? If $5.00 an hour, why not zero?”

The failure of opponents and supporters alike to even consider these questions just demonstrates how stuck we all are in the muddy intellectual morass of the old equilibrium economics. So convinced are we that there must be a cost to raising wages — a “basic trade-off, per Economics 101,” as Lane might say — that it’s never occurred to us to model the cumulative cost of keeping wages too low.

Well, no need.

Lane warns that with $15, California is running a dangerous “experiment” (or a “gamble” as Timothy B. Lee pejoratively chimes in on Vox), but we have already been running the largest economic experiment in our nation’s history — a low-wage trickle-down experiment — for the past forty-some years.

Sometime in the 1970s, after three decades of unparalleled growth and shared prosperity (an era in which we raised the minimum wage in step with productivity), we chose to conduct a massive experiment on the American economy: We chose to cut taxes on billionaires and to deregulate the financial industry. We chose to starve our schools and to saddle our children with more than $1.2 trillion worth of student debt. We chose to erode the minimum wage and the overtime threshold and the bargaining power of labor.  We chose to believe in the promise that a rising tide of capital accumulation would lift all boats.

It didn’t.

So now, after decades of stagnant wages, growing inequality, and the staggering rise of Trump, it is reasonable to conclude that America’s trickle-down experiment has failed.

(Alas, it is a failure that most journalists have yet to fully comprehend. For in reflexively repeating the discredited meme that $15 is a risky “experiment,” they repeat the most pernicious lie in the trickle-down repertoire:  Not that if the rich get richer it’s good for the economy, but that if the poor get richer it’s bad.)

That is the lesson to learn from California and New York and Seattle and everywhere else that $15 has taken hold: the trickle-down experiment has failed. And so reasonable people, looking at this failed experiment’s results, have reasonably chosen to move on to something else.

Of course $15 is an experiment too. Everything in life is. That is how we and all our social institutions evolve. But $15 is an experiment based on a wealth of experience, a ton of supporting evidence, and an economic model that — unlike trickle-down — appears to work in practice as well as it works in theory.

Why Paid Sick Leave is Feminist AF


What is or is not feminist when it comes to politics will likely never be fully resolved, but boy howdy, we sure do love to talk about it. There have been acres of pixels dedicated to the debate over whether a vote for Hillary Clinton or Bernie Sanders is more or less feminist than a vote for the other candidate, not to mention plenty about the perceived feminism of candidates like Carly Fiorina and yes, even Sarah Palin. And of course, there’s John Kasich and his line about female voters “leaving the kitchen” to elect him—a statement which I would call arguably one of the least feminist comments uttered at a stump speech in the last decade if it weren’t for almost every single thing out of the mouths of Donald Trump and Mike Huckabee and any number of other white men who have run or are currently running for office.

So I completely understand the skepticism and even exhaustion around this line of political criticism and critical thinking. But I feel quite confident in saying that regardless of which presidential bubble you color in on your November ballot, a vote on behalf of paid sick leave is one of the more inclusive, intersectional votes you can cast. Basically, paid sick leave is feminist AF.

Allow me to explain. First, let’s get straight that intersectional feminism means raising up all kinds of people; it’s not about limiting the access or rights of men or white people or whatever else teenage trolls on Twitter seem to think. By “feminist,” I mean paid sick leave is extremely good for furthering the cause of equity, generally. Now let’s move on.

Both an increase to the minimum wage and requiring employers to provide paid sick leave would directly benefit women and families (of note: for the rest of this piece, I’ll be using male and female pronouns for clarity, but please note that plenty of people don’t fall into one of these categories and basically all of the data available is based on heterosexual couples; I would love it if there were data on non-binary couples and same-sex couples, but I have been hard-pressed to find much, though according to some of the existing research, lesbian households are pretty dang egalitarian); the majority of minimum wage workers identify as female, and minimum wage workers are the most likely to work in industries which don’t offer paid sick leave. But it’s much deeper than that.

A 2009 report entitled The New Breadwinners found that record numbers of women are either solely supporting families, or are co-supporting their families with a partner. In total, close to 64% of households relied on income from the mother; just over 41% were entirely reliant on her income. Which means that if she can’t take time off when she’s sick (or when someone else in the family is sick), or is forced to take a pay cut, the income of the family will certainly suffer.


This has huge implications on the gender wage gap and gender equity; though we often like to cite the 77 cents to the dollar figure, the truth is that the earning gap between genders isn’t just down to women being paid less per hour (also, it leaves out women of color, who make way less than that and every time a white woman cites this figure she effectively erases that truth).

Instead, it’s important to look holistically at how women end up bearing the brunt of a lack of paid time off in their actual lives, at work, and generally.

As Melinda Gates pointed out in the Gates Foundation’s annual letter this year, “time poverty” is a real and significant issue; unpaid work is still very much a barrier for women across the board (and across the world) because almost universally, it’s expected to be performed by women for no money, effectively limiting the amount of time they can spend on their paid work or on themselves. Gates defines it as such:

Unpaid work is what it says it is: It’s work, not play, and you don’t get any money for doing it. But every society needs it to function. You can think of unpaid work as falling into three main categories: cooking, cleaning, and caring for children and the elderly. Who packs your lunch? Who fishes the sweaty socks out of your gym bag? Who hassles the nursing home to make sure your grandparents are getting what they need?

A Pew study from 2013 found an interesting time breakdown, which ended with women accounting for one more hour per week than men; fathers reported spending 42 hours per week on paid work, nine hours per week on housework, and seven hours per week on childcare, adding up to a total of 58 hours per week. Mothers, meanwhile, reported spending 31 hours per week on average doing paid work, 16 hours on housework, and 12 hours on childcare, totaling 59 hours. However, in most dual-parent households, both the father and mother are working full time (and the mothers tend to disagree about how much housework the fathers actually do).
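The Pew totals add up; summing the reported averages:

```python
# Average weekly hours reported in the 2013 Pew study, by category.
fathers = {"paid work": 42, "housework": 9, "childcare": 7}
mothers = {"paid work": 31, "housework": 16, "childcare": 12}

total_fathers = sum(fathers.values())  # fathers' total weekly hours
total_mothers = sum(mothers.values())  # mothers' total weekly hours

print(f"Fathers: {total_fathers} hrs/week, mothers: {total_mothers} hrs/week")
print(f"Difference: {total_mothers - total_fathers} hour")
```

One extra hour a week may sound small, but note where it comes from: mothers do 12 more hours of unpaid housework and childcare, offset by 11 fewer hours of paid work.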

Paid sick leave, of course, can’t solely overcome centuries of conditioning, nor can it solve the problem of work created by one partner who does less of it (seriously, husbands literally make more unpaid work)—but it can help alleviate some of the career stress and pressure that women feel as part of this work, particularly when it comes to cold and flu season. Because of course, it’s not just the worker who needs to take time off sometimes.

A 2014 study found that in families with children, women are not only 10 times more likely to stay home with kids when they’re sick, they’re also five times more likely than husbands to make and attend doctor’s appointments. Even if the mother is working, she’s much more likely to be the one who takes time out of her day. From the Atlantic:

For working moms, 39 percent report missing work to care for their sick children, 33 percent report sharing the responsibility with their spouse, 16 percent report calling someone else to help, and 6 percent report their partner taking time off. Of the 39 percent of women who report taking time off to care for their sick children 60 percent report not getting paid. That’s up significantly from 2004, when 45 percent reported not being paid for missing work.

Without paid time off, women are then forced to make a choice: Stay home and get docked pay (and risk losing their job), or try to find someone to stay home with the kid, thus still losing money and also feeling terrible for not being there, not to mention potentially spreading whatever disease your child may have given you.

The other alternative is sending kids to school sick; though most daycares require that children who aren’t feeling well stay home, schools offer no such requirement. Again, from the National Partnership for Women and Families, “parents without paid sick days are more than twice as likely to send a sick child to school or daycare as parents with paid sick days…[and] are five times as likely to report taking their child or a family member to the emergency room because they were unable to take time off work during normal work hours.” Unnecessary ER visits, the Partnership cites, “cause additional burdens on our health care system totaling more than $1.1 billion per year.”

Even in families without children, the bulk of caretaking falls on women; caring for ailing family members, particularly aging parents, pushes women into early retirement, often because they can’t get the time off that they need.

Taking unpaid time off disproportionately impacts women because, often, they work in industries where tips are essential to their income; three-quarters of tipped employees identify as female, according to the National Women’s Law Center. That means that their ability to, say, pay rent or buy groceries is directly dependent on their ability to perform their job both capably and with a smile (and possibly while being sexually harassed). That’s incredibly difficult to do when you’re sick—and it poses a huge risk to your customers; 63% of restaurant workers admit to cooking and serving food while sick.

Paid leave is a policy that will save taxpayers money, will cut down on the spread of disease, will improve worker productivity, and may even save lives—but it will also undeniably help close the gender wage gap. As women continue to balance the bulk of unpaid work, increasing the number of hours they do get paid for can help create a more egalitarian workforce, as well as ensure that workers at all levels are able to lead their most fruitful lives.

There Are Kind of a Lot of Reasons Why Poor Kids’ Degrees are Worth Less


Hooray! Debt!

New numbers from the Brookings Institution demonstrate something that a lot of first generation college students already know: Your degree, despite being printed on the same paper and costing every bit as much (actually, if you took out loans, it could be much more expensive by the time you’re done paying it off) as that of all the students in your class, seems to be worth less than you’d thought it would be—and certainly less than your guidance counselor promised you.

Exactly how much less, though, is pretty startling. From Brookings:

College graduates from families with an income below 185 percent of the federal poverty level (the eligibility threshold for the federal assisted lunch program) earn 91 percent more over their careers than high school graduates from the same income group. By comparison, college graduates from families with incomes above 185 percent of the FPL earned 162 percent more over their careers (between the ages of 25 and 62) than those with just a high school diploma.

So while a bachelor’s degree will help you earn more than if you had no bachelor’s degree (at least until you’re in your 60s), if you grew up in an economically distressed household, you can expect it to be a much smaller bump than if your parents had means.
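To see just how different those bumps are, here's a back-of-the-envelope illustration. The $1 million career baseline is purely hypothetical, chosen only to make the comparison concrete; the 91 and 162 percent premiums are the Brookings figures, and note that Brookings measures each premium against high-school graduates from the same income group, so a shared baseline is a simplification:

```python
# Hypothetical career earnings for a high-school graduate (NOT a
# Brookings figure; used only to scale the premiums).
hs_baseline = 1_000_000

# Brookings-reported college premiums over same-group high-school grads.
premium_low_income = 0.91    # families below 185% of the FPL
premium_high_income = 1.62   # families above 185% of the FPL

college_low = hs_baseline * (1 + premium_low_income)
college_high = hs_baseline * (1 + premium_high_income)

print(f"Low-income grad:  ${college_low:,.0f} over a career")
print(f"High-income grad: ${college_high:,.0f} over a career")
```

On identical baselines, the richer kid's degree is worth roughly $710,000 more over a career, before any of the other disadvantages discussed below even kick in.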

That result flies in the face of the popular notion that going to college is a one-way ticket out of poverty and into a better life…which again, is something that most students who grew up poor and went to college have already discovered. And it’s not for one single reason; instead, the modesty of the “bachelor bump” for students who come from poorer households can be explained in any number of ways.

In the Brookings blog post on the data, nonresident fellow Brad Hershbein posits several explanations, including “family resources during childhood and the place where one grew up, to the colleges that low-income students attend”—all of which are plausible and in fact likely to contribute to the earnings gap. However, those are just a handful of the many, many, many, many forces and systems and assumptions and realities which make digging out of poverty difficult, despite a degree.

This is why, though reducing the economic burden of debt would be a large step in reducing income inequality, free college (which we recently wrote in favor of!) will not—in fact, it cannot—by itself break the cycle of poverty. Because though the cost of tuition is certainly a barrier to entry, it’s not the only one—and reducing it doesn’t curb the obstacles upon entrance or exit. Until myriad other systemic and academic issues are addressed, poor kids may earn a little more if they get a degree, but they certainly won’t be delivered from poverty entirely simply by virtue of obtaining it. 

It literally begins at childhood and continues through the maze of secondary school, in and out of college, past the burden of student debt, and then onward into the job market.

Let’s start at the beginning, when poor kids experience more stressful, traumatic situations than affluent children, which can lead to behavioral issues and learning delays. Let us also consider the well-documented impact of poverty on the education students receive in elementary, middle, and high schools, and the scholastic advantages that wealthier kids get from an early age. Before a child is even close to studying for the SATs—where they’re likely to under-perform—they’re set up for failure.

Now, assume a child from a poor household graduates from high school—which is statistically less likely than if they were wealthy—and goes to college. They are more likely than wealthier kids to have parents who did not go to college, which presents a host of social and economic barriers which their peers don’t have to deal with. First-generation college and poor college students are more likely to self-select mediocre colleges or simply not get accepted to better schools due to their background, often have to work during school, may still be supporting family back home, and typically have to take out more loans, which means they graduate with more debt (though even if that weren’t the case, investigations have found that colleges just saddle poor kids with more debt anyway). That is, if they graduate at all, which again, they are less likely to do even if they’re smarter than the rich kid sitting next to them.

But still, lots of poor students do graduate from college, obtaining that coveted degree and also a hefty loan tab which they’ll have to pay back. The specter of that debt often drives middle-income kids back in with their parents (you’ve heard of boomerang kids, I assume), but for poorer students, moving back home often isn’t an option, or it means moving back to an economically-depressed area where job prospects are limited. This is one of many reasons that just under half of low-wage workers also happen to have college degrees.

If the new college graduate, though, does manage to finagle their way into a situation where they can both make their loan payments and their rent in a city where they may be able to find a job, they’ll still miss out on key advantages that their wealthier peers may enjoy. Those include but aren’t limited to: Social and networking connections through parents or alumni organizations that can lead to more prestigious internships or jobs, the ability to take unpaid internships which may pay off down the road, the ability to take slightly lower-paying positions which may ladder up to something more lucrative, and simply knowing what options are available to them.

In its initial report, Brookings doesn’t break down the “bachelor bump” by race, but it would be a mistake not to factor in the role of racial income disparities when talking about it; though class and income are, of course, important indicators, race is still a major division when it comes to test scores, graduation rates in both high school and college (though those gaps are closing slowly), unemployment even with a degree, and overall earnings. Today’s most lauded jobs—the ones with the highest earning power—are often those in tech, an industry with a statistically terrible record on hiring a diverse workforce. The truth is that people of color are still simply doing worse in the United States, even when they go to school.

Though college is still one of the better tools for breaking the cycle of poverty, it’s untrue to plainly state, without qualification, that going to school will unequivocally help poor kids earn more than their parents. If we are serious about closing the gap between the wealthy and the not-wealthy, the solutions we choose must start early, acutely and directly address systemic racial discrimination, and prioritize more than just college attendance and graduation.

Republicans’ Counterproductive Fixation On A Worst-Case Scenario That Never Happens


Bread lines: Not since before the minimum wage was created, and probably unlikely to be seen after it goes up. (Image: Library of Congress)

“Conservatives have argued for years that no matter how well-meaning, efforts to increase the minimum wage end up hurting the most vulnerable, those looking to grasp the first rung on the employment ladder,” writes right-wing blogger Jennifer Rubin in the Washington Post today. And while I, personally, would have just stopped the article right there with a quick note that, of course, those warnings have never actually come to fruition, she, of course, does not.

Rubin goes on to cite the problematic, incorrect, and generally underwhelming conservative, anti-minimum wage economist Mark Perry as a source (at least she’s consistent), and then proceeds to chide Democrats about how “the impact on employment and on the poor, specifically, may be profound” should we keep hammering on about raising wages. Her thesis: That we’re not thinking this through, and that, just as the right has been warning for literally decades, our efforts to increase wages could possibly end in disaster.

And yet, what Rubin never quite manages to get around to is the fact that this predicted apocalypse has yet to arrive, and likely never will.

Since its birth, the minimum wage has been drawing concern and outright ire from the right, who have warned of the economic fire and brimstone it will rain down upon us, ensuring that no teen will have a job and that businesses will be forced to shutter more quickly than you can say “trickle-down.”

Truly, the quotes go way back. In case you think I’m exaggerating, here are some:

  • 2006: “If a simple legislative act increasing the minimum wage to $7.75 is all that is needed to improve the lot of the working poor by just a little, then why not raise it to $10 an hour and get them to the poverty level? For that matter, why not raise it to $50 an hour, assuring every working Californian a comfortable living?” — California State Senator Tom McClintock in the Los Angeles Times
  • 1999: “Minimum wage increases that even approach an average livable wage would result in significantly fewer jobs for low-wage workers. A substantial increase in the relative cost of labor will result in a reduction in the amount of labor used.” — Thomas Kavet, Deborah Brighton, Douglas Hoffer, and Elaine McCrate in a report delivered to the State Legislature of Vermont
  • 1970: “It is unrealistic to assume that somehow the increase will be squeezed out of profits….In plain fact, the burden of an increased minimum wage will fall heavily on those least able to bear it. The fringe employers, the unskilled worker, the young and the handicapped are those who will be priced out of the job market.” — The Chamber of Commerce
  • 1938: “[The Fair Labor Standards Act] will destroy small industry….[these ideas are] the product of those whose thinking is rooted in an alien philosophy and who are bent upon the destruction of our whole constitutional system and the setting up of a red-labor communistic [sic] despotism upon the ruins of our Christian civilization.” — Representative Edward Cox (D-GA)

Yes, the scary stories of the right have been spookily whispered in the ears of the working population for decades, allowing plenty of time for these ideas—that a minimum wage that’s actually livable will bankrupt businesses and lead to an explosion of unemployment—to fully marinate and become baked into the collective understanding of the economy.

The unfortunate thing, though, for the ultra-wealthy who benefit from these ideas (because truly, no one else does) is that they are simply not true, and they don’t hold up.

Take, for example, Washington State—the state with the highest minimum wage in the nation—which was recently ranked by Business Insider as having the most robust economy in the country.

“Its Q2 2015 annualized GDP growth rate was a stunning 8.0%, by far the highest among the states and DC. The November 2015 average weekly wage of $1,073 was the second highest in the country, and was 5.6% higher than the weekly wage in November 2014, the third highest wage growth rate,” they explained.

The next highest-ranked was the District of Columbia, which boasts a $9.50 minimum wage. After that was Colorado—also with a minimum wage higher than the Federal level—and then, down at #5 was Nebraska, a state whose “unemployment rate of 2.9% was the second lowest in the country,” and whose minimum wage is also above $7.25.

Does correlation necessarily indicate causality? Of course not. But it’s worth pointing out that economic growth in states with minimum wages above the Federal floor isn’t exactly stagnant. In fact, higher minimum wages actually seem to be spurring job creation and growth.

Which means that the only thing Rubin and I can agree on is her first statement—that Conservatives have been beating this drum for a long, long time.

Marco Rubio Says You Can’t Live on $10/Hr, So…What? People Just Die?


Marco Rubio: Thirsty for jobs that pay more

At a campaign event back in October, Marco Rubio said something that is factually accurate: That poverty wages simply are not enough to support a family.

“I have full confidence that the American private sector…won’t just create millions of jobs. They’ll create millions of jobs that pay more,” he said, standing in a backyard in (according to the clip) Portsmouth. “Because even the jobs that are being created now don’t pay enough. You can’t live on $10 an hour! You can’t live on $11 an hour! We need to create jobs that pay much more than that. But we have to have an economy and economic policy that make America the best place in the world to create jobs that pay more.”

There’s a lot to unpack here, so let’s go point by point:

  • Marco Rubio believes that the private sector, not the government, should be creating jobs and spreading wealth, even though he’s often said that a tax credit is the best way to put more money in the pockets of Americans.
  • Marco Rubio doesn’t like the jobs that are being created now, even though he’s very much a believer that the economy is a game of straight supply-and-demand and thus, theoretically should believe that the jobs being created are the ones that are most in demand.
  • Marco Rubio admits that the minimum wage—well below $10 or $11 in all states—is not enough to live on, and yet, does not suggest what to do about that.
  • Marco Rubio wants people to be paid more than $11 per hour, but somehow refuses to admit that a quick way to do that is to raise the minimum wage.
  • Marco Rubio says that to pay people more, we need to have “an economy and economic policy” that would favor job creators, though he never quite puts together that a good way to do that is, again, to give more purchasing power to the people who spend their money with job creators.

In this address, he admits that the minimum wage is not enough to live on, which raises the question: Who does he think works for the minimum wage?

People who…are not alive? People who don’t need to live on those wages?

Perhaps, like a lot of misinformed people, Rubio believes that the minimum wage is not for people who need to survive on it, i.e., it is just for teenagers. If so, that could be a major problem; BLS numbers show that more than 3.3 million Americans earn at or below the Federal minimum wage, and Census data demonstrate that the US has the lowest percentage of teenagers it has ever had. If minimum wage jobs really are just for those who don’t have a family to support, we may not actually have enough teenagers to fill them.

Or, perhaps Rubio is just perfectly okay with a caste system, wherein some people get to live and others do not. After all, he’s previously stated that the best way to raise wages is to “make America the best place in the world to start a business”—but of course, in suggesting this, he’s stating that business owners deserve a living wage, but their employees, who would ostensibly be pulling down wages of $10 or $11 per hour, do not.

Rubio’s statements echo those of fellow Republican Paul Ryan, who delivered his first major policy address as Speaker of the House earlier this month and focused almost entirely on poverty and income inequality. And, much like Rubio, he got oh-so-close to admitting that increasing the minimum wage could actually be the best possible thing for the economy…but he, too, couldn’t quite get there.

Writing for The Nation, Rebecca Vallas and Melissa Boteach explained it pretty concisely:

As Speaker Ryan so eloquently points out, our minimum wage is a poverty wage and not nearly enough for working parents to support their families, leaving many with no choice but to turn to public assistance to make ends meet.

“So say you’re a single mom with one kid. You’re making minimum wage. You’re on food stamps, Medicaid, housing assistance, and other assistance.”

So, by raising the minimum wage to $12 by 2020 as the Murray-Scott bill would do, not only would 35 million Americans get a raise, but we would also save nearly $53 billion over the next 10 years in the Supplemental Nutrition Assistance Program alone.

Unfortunately, Ryan has voted against raising the minimum wage at least 10 times since he’s been in office.

Because here is what is becoming extremely clear about the Republican party: They can see the issue (which is that people simply are not earning enough money), but for some reason they can’t seem to just say the words:

We could solve these problems by ensuring that all workers receive a wage that is economically feasible.

The GOP loves to tout the private sector and decry the use of social services by people living in poverty, but it’s the private sector’s unwillingness to pay its workers enough to purchase basic necessities—like a one-bedroom apartment, which the minimum wage can’t cover in any state—that results in the reliance on social services. Republican presidential hopefuls talk a big game about “creating jobs” and “pushing up wages,” but can’t quite follow the end of that thought to “if people had more money, demand for goods and services would increase.”

Some cognitive dissonance seems to stand right in the way of drawing these conclusions. What is it?

marco rubio minimum wage

I tried to do some scratch math but was unable.

Could it be that their major donors are the exact members of the private sector who are posting record-high CEO pay while paying their workers a wage that all but requires them to rely on food assistance and other social services? That would certainly make sense statistically; in an article for Salon, Sean McElwee points to the fact that while plenty of GOP voters actually do support raising the minimum wage, it’s the donors who do not.

“A whopping 63 percent of Republican non-donors support a higher minimum wage, compared to only 32 percent of donors who gave more than $1,000,” McElwee notes—indicating that conservatives who are struggling, who don’t have $1,000 or more to give, really do believe in higher wages, while those with means are hoping to keep their money by railing against redistribution.

Or is it just that all of these men are so bound by the ideas of trickle-down economics that they legitimately do not see the fact that it’s not working?

Because that’s exactly what Rubio, Ryan, and their ilk are saying when they make these claims about the need to “create jobs that pay more”—they are saying that our current system, a system where the minimum wage has kept up with neither inflation nor productivity, but where tax cuts for the wealthy are tipping the scales, isn’t doing what they want it to do.

A Few Brief Thoughts On the Coming Robot Apocalypse

Do you want fries with that?

Would you like fries with that?

Here at Civic Ventures we spend a lot of time thinking about the economic impact of new technologies, and for the most part, we’re pretty damn sanguine about the prospects. Innovation is how we solve human problems and improve living standards, and we see plenty of opportunity to collectively innovate ourselves into a more comfortable, sustainable, and broadly prosperous future.

But a lot of other folks aren’t so sure, especially when it comes to the seemingly inevitable rise of the robots.

There are two competing visions of robotic dystopia, both of which have become familiar mainstays of popular science fiction. The first is the robot apocalypse, in which super-intelligent self-aware machines decide to enslave or eliminate their weaker human creators—think Terminator or The Matrix. And there are some awfully smart people sounding this alarm. Tesla and SpaceX founder Elon Musk calls artificial intelligence mankind’s “biggest existential threat,” while renowned physicist Stephen Hawking warns that “the development of full artificial intelligence could spell the end of the human race.”

Yikes! I mean, I guess. And I agree with Musk that something as potentially world-altering as AI is probably worthy of at least a little government regulation. But from what I understand about the state of the technology, I don’t believe we’re anywhere close to making the leap from self-driving cars to self-aware ones, and to be honest, I’ve never quite understood why we should expect “the singularity” to be malevolent. (A little projection there, humankind?)

My more immediate concern is the second robotic dystopia: a viciously unequal, job-scarce future in which a handful of super-rich people enjoy the fruits of technology while the disempowered masses struggle to scratch out a bleak subsistence—think Blade Runner or The Hunger Games, or that crappy Matt Damon movie, Elysium. This is the future that ubercapitalists like Warren Buffett and Bill Gates warn of (well, sorta) when they publicly fret that a rising minimum wage would only accelerate our relentless drive toward automation, softening labor markets and pushing up unemployment: “Capitalism over time will create more inequality and technology over time will adjust labor demand,” cautions Gates.

If I’m understanding him correctly, Gates is saying that this is simply the inevitable outcome of the interplay between market capitalism and technology: the rich get richer while the poor lose their jobs to machines. Except, I’m pretty sure his bleak vision of the future of market capitalism is definitionally impossible.

You see, the problem with this scenario is that while markets can be very useful tools for efficiently allocating scarcity, they fail utterly when it comes to allocating abundance. I mean, how do you price labor if it is in unlimited supply? You can’t. It’s like dividing by zero. As the cost of job-displacing technology plummets and the rate of unemployment soars, the labor market simply ceases to function.

Without a labor market, there are no wages. Without wages, no income. Without income, no consumer demand. Without demand, no robots!

Jeb Bush and Hillary Clinton Launch Their Campaigns with Two Very Different Takes on Income Inequality


Are you sitting down? What I’m about to say next may surprise you: Jeb Bush just announced he’s running for president. I know. I know. It came as a total shock to me, too. Bush, whose logo conveniently doesn’t include the last name “Bush,” is sticking with his phony concern-trolling about income inequality, saying that he wants to “make opportunity common again.” Of course, his prescription for the problem of income inequality is the same platform as every Republican presidential candidate since 2000: tax cuts for the wealthy, cutting back on regulations, and more of the same policies that got us into this mess in the first place.

Compare this with the speech that Hillary Clinton gave over the weekend at her first large public event since announcing her candidacy. I know a lot of Democrats and progressives who are skeptical about Clinton, but she said exactly the right things in that speech. She identifies the problem at the very beginning:

You see corporations making record profits, with CEOs making record pay, but your paychecks have barely budged.

While many of you are working multiple jobs to make ends meet, you see the top 25 hedge fund managers making more than all of America’s kindergarten teachers combined. And, often paying a lower tax rate.

So, you have to wonder: “When does my hard work pay off? When does my family get ahead?” “When?”

So what do we need to combat inequality? Clinton correctly identifies the solution: “It takes an inclusive society.” She presents a few solutions: “Reward businesses who invest in long term value rather than the quick buck,” focus on infrastructure relief, create universal pre-K, tackle student debt, standardize paid sick leave, promote the right of workers to receive their schedules in advance, support “a constitutional amendment to undo the Supreme Court’s decision in Citizens United,” and make it easier to vote.