The Apple corporation's coffers are overflowing. By a recent count, the tech giant has an overall cash reserve of $135 billion—and, according to Moody's, another $35 billion could be added to that stockpile by the end of the year.
Not all of Apple's investors are pleased with the size of the surplus. While a massive rainy day fund may be good insurance against market fluctuations, it is also money that is not being returned to shareholders. As a result, one hedge fund manager, whose firm owns some $610 million in Apple stock, sued Apple last month to block a new corporate policy that "could limit how the company could return some of its $137 billion cash pile to investors."
The debate over what to do with Apple's reserve money has been "one-dimensional, with a nearly exclusive focus on how the reserves should be used to reward its shareholders," writes the Economic Policy Institute's Isaac Shapiro. The workers who manufacture and sell Apple products, he notes, have been left entirely out of the debate. Despite the notoriously grim conditions at some factories which make Apple products, low-level employees are unlikely to see any benefit from a massive corporate surplus.
Apple is not exceptional in this regard. As of 2012, publicly listed corporations had hoarded roughly $4.75 trillion in surplus income. More recently, even as stock prices have surpassed pre-recession levels and overall economic productivity has continued to climb, workers' wages have stagnated.
Recently, various progressive leaders have called for legislation that would ameliorate this imbalance by raising the minimum wage. While this policy would not directly affect workers making Apple products in China and elsewhere abroad, it could change the market in which domestic Apple salespeople and technicians operate. More to the point, the minimum wage debate clarifies the dividing lines between progressive support for evening out income distribution and conservative faith in a relatively unhindered market.
Libertarians and conservatives often argue that a minimum wage hike would do more harm than good, because, as Mark Wilson argues for the libertarian Cato Institute, "it may be partly passed on in prices" to the consumer. Furthermore, companies might seek to reduce labor costs in the face of a minimum wage hike by laying off workers, thereby boosting unemployment. In other words: An increase in labor costs will automatically trigger a scramble to raise prices or cut costs elsewhere.
If that's the case, one might wonder what those gargantuan cash reserves are for in the first place. Presumably, an increase in the wage floor counts as precisely one of those changes to market conditions for which Apple and other large companies maintain a rainy day fund.
By the same token, it is unlikely that a modest increase in the minimum wage would compel businesses to adopt radical labor-saving measures out of the blue. Declining wages relative to productivity indicate that companies are already wholly dedicated to wringing as much value as possible out of each dollar spent on labor. The drive to extract maximum productivity from minimum labor costs is a fundamental component in the engine of the market, and intrinsic to the constant growth of which market evangelists are so fond. (That may help explain why the research is inconclusive on how raising the minimum wage affects employment.)
From the perspective of both shareholders and workers, leaving trillions of dollars in cash reserves may seem rather inefficient. But there is also reason to doubt the efficiency of distributing a greater share of that money among only the investors. This is due to a principle called diminishing marginal utility.
Diminishing marginal utility is an economic theory adopted from the thought of utilitarian philosopher (and Enlightenment radical) Jeremy Bentham. He summed it up when he wrote that "ten thousand times the quantity of wealth will not bring with it ten thousand times the quantity of happiness."
To put it another way: Imagine that Jim and Carol are sitting on opposite sides of a large pizza. Both are equally hungry, but Jim purchased the pizza himself, so he gets to decide the distribution of slices. Before either Jim or Carol has eaten anything, the marginal utility of a pizza slice, the value of eating one more slice, is very high for both of them. But once Jim eats his first three slices, he's no longer very hungry. The marginal utility of pizza has declined for him.
After a fourth slice, Jim is full, but still thinks pizza sounds tasty, so the marginal utility of a fifth slice hovers slightly above zero for him. A sixth slice of pizza, or, worse, a seventh or an eighth, actually has negative marginal utility. He can't enjoy that much pizza, and all it does is hurt his stomach. Not only does it cost Jim practically nothing in terms of utility to split half the pizza with Carol, but it increases the overall utility for both of them if she gets to have four slices.
Money is unlikely to have negative marginal utility for investors, but one thousand dollars of Apple's money possesses much greater utility for a minimum-wage worker than for a millionaire shareholder.
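The intuition behind diminishing marginal utility can be sketched numerically. The snippet below is a toy model, not anything from the economics literature cited here: it assumes (as economists often do for illustration) that the utility of wealth is logarithmic, and compares the utility gained from an extra $1,000 by a worker with $20,000 to hand versus a shareholder with $1 million.

```python
import math

def utility(wealth):
    """Toy logarithmic utility: each additional dollar adds less value."""
    return math.log(wealth)

def marginal_utility(bonus, wealth):
    """Utility gained from receiving `bonus` on top of existing `wealth`."""
    return utility(wealth + bonus) - utility(wealth)

worker_gain = marginal_utility(1_000, 20_000)        # worker with $20,000
investor_gain = marginal_utility(1_000, 1_000_000)   # millionaire shareholder

print(f"worker gain:   {worker_gain:.4f}")
print(f"investor gain: {investor_gain:.4f}")
print(f"ratio:         {worker_gain / investor_gain:.1f}x")
```

Under this assumed utility function, the same $1,000 is worth dozens of times more, in utility terms, to the worker than to the millionaire, which is Bentham's point in miniature.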
To which a conservative might respond: That's all well and good, but the investors deserve a large return on their risk, in the same way that Jim deserves all the pizza he purchased with his own hard-earned money.
What's interesting about that argument is that it assumes people only “deserve” what they purchase or earn through the market, when the whole reason for proposing redistributive policies is that progressives simply disagree. Carol “deserves” pizza because she is hungry, and because giving her pizza does no real harm to Jim, the progressive might argue. To simply insist that Carol shouldn't have food she didn't buy is to talk past the progressive argument and reiterate the original claim. For those who don't already believe that claim, massive, unused corporate cash reserves might seem ripe for appropriation and redistribution.