Sunday, June 30, 2013

Roosevelt and the "Southern" New Deal

In attempting to understand the polarization that has characterized our political process, it is critical to recognize that the United States was created by merging two separate nations: the nation of non-slave states and the nation of slave states. Such a marriage was destined for hard times. The Civil War put an end to the formal institution of slavery, but it did not change the political, social, and racial attitudes of the former slave states. Much of subsequent national history has concerned their continued attempts to maintain as much as possible of their race-based social order.

David Runciman has produced an excellent article in the London Review of Books that discusses how the polarized politics of the mid-twentieth century shaped our policies as we struggled to emerge from the Great Depression: Destiny v. Democracy. Runciman’s article is a review of the book:

Fear Itself: The New Deal and the Origins of Our Time by Ira Katznelson
Norton, 706 pp, £22.00, April, ISBN 978 0 87140 450 3

While Southern politicians were interested in economic recovery, they were also interested in maintaining their racial caste system and their Jim Crow laws. Their political power gave them outsized influence in policy making, and, in Katznelson’s opinion, this was to the detriment of our country.

"Katznelson persuasively argues that the core features of the New Deal were fashioned out of Northern acquiescence and Southern intransigence. The give-and-take was highly asymmetrical. Roosevelt needed the votes of Southern representatives in Congress, above all in the Senate, to get his legislative programme for reviving the American economy passed. Though in a permanent minority, Southern senators wielded disproportionate influence."

A little historical perspective is required in order to fully understand the political dynamics involved. The South was effectively a one-party nation. By restricting the vote to economically well-off whites, politicians were able to control elections and guarantee the desired results.

"In the 1936 presidential election, FDR’s Democratic ticket won 97 per cent of the vote in Mississippi, 99 per cent in South Carolina. In some counties no votes at all were recorded for Republican candidates."

"The population of Mississippi in the late 1930s was more than two million. Yet the number of people whose votes were counted in the 1938 congressional midterms was barely 35,000."

What provided Southern politicians their undue influence was absolute loyalty to their segregated ways. They formed the most consistently coherent voting bloc in Congress.

"....they were more united than any rival grouping. What united them was race, and a shared sense that they represented the final bulwark against the destruction of the white order. Any local differences were put aside when segregation was on the line. Unity made them disciplined but also highly adaptable: Southern representatives would forge whatever alliances were needed to keep the South intact. If that meant doing deals with Republicans, so be it. Roosevelt knew he couldn’t rely on the Southern politicians in his party if they had any sense that their way of life was under threat. So he was loath to put them to the test."

Having absolute control over elections allowed the Southern states to return the same politicians to office over and over. Seniority rules favored the long-serving Southerners for important committee chairmanships.

"There was no way for a Democratic president to legislate without letting the South get its fingerprints all over his bills."

Our country essentially had a three-party system consisting of a socially and politically conservative bloc of Democrats, a fiscally conservative Republican bloc, and a liberal Democratic bloc. These three groups would form alliances as necessary. Roosevelt had little leverage over his own party.

Katznelson provides another bit of insight into the constraints Roosevelt faced, and indicates why he might hesitate to be perceived as bullying Congress.

"....he arrived in office in 1933 at a moment of acute danger, not just for his country but for democracy worldwide. A widespread feeling had arisen that only dictatorship could tackle the chaos and misery that the Great Depression had unleashed. Mussolini, Stalin and now Hitler were all being cast as men of action; the indecisive democracies seemed to be trailing in their wake."

There was considerable sympathy for strong executive action that might bypass the normal workings of Congress. Consider this advice attributed to the respected and influential Walter Lippmann:

"The pre-eminent columnist Walter Lippmann wrote that ‘strong medicine’ was needed. He advocated a large increase in presidential powers, and a temporary suspension of Congressional checks and balances. In a private visit to the president-elect in February, Lippmann told him: ‘The situation is critical, Franklin. You may have no alternative but to assume dictatorial powers’."

Roosevelt recognized the danger such notions represented for a democracy and chose not to pursue any additional powers not explicitly granted by Congress. This wise decision ensured the ability of the Southern bloc to manipulate legislation in accordance with their goals.

The marriage of the liberal and Southern wings of the Democratic Party worked because they needed each other. The economically dysfunctional South needed the public largesse that a Democratic administration could provide, and the non-Southerners needed the South to win the presidency. This provided an uneasy alliance that could be broken by any attempt to change the social structure of the former slave states.

"If the Democrats controlled Washington, the benefits for the South were enormous, as resources were reallocated from more affluent parts of the country. After 1933, this was the lifeline on which the South depended to survive the Great Depression. The price paid was the uncertainty that came with having Northern and Midwestern politicians pulling the purse-strings. These men (and the occasional woman) were answerable to electorates very different from those of their Southern counterparts; indeed, many of them represented districts in which black voters, even in small numbers, could swing an election. Southern Democrats could never be sure what might be required of them in return for all that federal largesse; they had to be perpetually vigilant for signs of slippage on the race question."

The game the Southern politicians played was to use the poverty of their constituents to gain public funds, but to ensure that those funds did not alter the conditions created by segregation.

"But the Southerners were playing a double game. They wanted to maximise the help the federal government provided to the worst hit parts of the country, while at the same time minimising the amount of control the federal government could exercise over the way the help was distributed within individual states."

The calls for a small federal government and state control that are bellowed by Southern politicians to this day were born not out of political idealism, but out of racism.

"When it came to drawing up the legislation that underpinned the New Deal, Southern members of Congress made sure that segregation was not simply recognised: it was reinforced. So the NIRA [National Industrial Recovery Act of 1933] codes that established minimum wages and maximum hours for workers explicitly excluded domestic and farm labour, thus ensuring that the vast majority of African Americans in the South would not be covered by their provisions."

One of Roosevelt’s embarrassing collapses before Southern intransigence involved the exclusion of "traditional" Southern industries.

"Roosevelt also agreed to exemptions for various Southern industries, including citrus packing and cotton ginning. He knew what he was doing. ‘It is not the purpose of the administration,’ he announced in 1934, ‘by sudden or explosive change, to impair Southern industry by refusing to recognise traditional differentials.’ In order to count as ‘Southern’, an industry needed only to show that the majority of its workers in a given state were black."

Any industry that predominantly hired black workers was excluded from laws designed to protect workers’ rights. The underpinnings of the New Deal were thus blatantly racist.

One of the hallmarks of the South is the hostility to unions and collective bargaining. This tendency did not arise from political or economic theorizing, or from a sense of individual freedom or responsibility, but from racism. Unions might treat blacks and whites equally; therefore they must be suppressed.

"The focus of Southern fears was labour legislation. Plans to strengthen the bargaining position of labour unions and extend the minimum wage were fiercely resisted by many Southern Democrats, who rightly saw that organised labour posed the greatest threat to segregation (unions were almost the only public bodies of the time that welcomed black and white workers together). A 1938 labour standards bill, which would have brought Southern agriculture within the ambit of New Deal regulations, was defeated in the House, despite its overwhelming Democratic majority. ‘Unworkable’, ‘un-American’, ‘impractical’ and, above all, ‘dictatorial’ was how Southern representatives characterised it. They allied with Republicans, who had their own reasons for wanting to curtail the unions, to pass a much watered-down version, which kept the important exemptions intact."

Other accommodations to Southern sensibilities included a segregated military and the repeated failure to make lynching a federal crime.

Katznelson views these compromises with a sense of "regret" at the lost opportunities for greater social and economic justice, and suggests that more could have been done. Runciman is less judgmental and is comfortable with the notion that Roosevelt did what he had to do.

The extent of the compromises with the Southern racists is arguable. What is not arguable is the level of their power and influence.

The Civil War was fought to end slavery as an institution. In its aftermath, slavery was replaced by segregation and other forms of subjugation. The civil rights turmoil of the 1950s and 1960s succeeded in overturning laws supporting segregation and discrimination. Given that this movement also required what amounted to an invasion of the South to attain its goals, it might be considered a second Civil War.

With the passage of civil rights legislation by the Democrats, the Democratic South soon became the Republican South. The goals of this political bloc are little changed: limiting workers’ rights, suppressing the vote of racial minorities, and limiting the ability of federal regulations to modify state behavior are still the focus of its activities.

Will we require a third Civil War before these issues are finally settled?

Monday, June 24, 2013

Bad Pharma: Suppressing Drug Data Is Most Dangerous of Bad Practices

Ben Goldacre is a British physician whose book Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients provides the general reader a detailed and often passionate look at practices that have become common throughout the medical field. Although Goldacre is British, his data, examples, and claims apply to people of all countries.

Goldacre begins with this statement:

"Medicine is broken. And I genuinely believe that if patients and the public fully understand what has been done to them—what doctors, academics and regulators have permitted—they will be angry."

He then provides this explanation:

"We like to imagine that medicine is based on evidence and the results of fair tests. In reality, those tests are often profoundly flawed. We like to imagine that doctors are familiar with the research literature, when in reality much of it is hidden from them by drug companies. We like to imagine that doctors are well-educated, when in reality much of their education is funded by industry. We like to imagine that regulators let only effective drugs onto the market, when in reality they approve hopeless drugs with data on side effects casually withheld from doctors and patients."

Developing drugs for medicinal use is expensive and time-consuming. In an ideal world an independent agency would be involved in testing and validating the effectiveness of a given drug and comparing it against similar drugs to determine which is most effective. We do not live in such a world.

Most testing is performed by the drug companies themselves. There is a lot of money involved in getting a drug into the marketplace, and there is a lot of competition from other drug makers. Without effective regulation of such systems, ethics tend to fall to the lowest feasible level.

"Drugs are tested by the people who manufacture them, in poorly designed trials, on hopelessly small numbers of weird, unrepresentative patients, and analyzed using techniques which are flawed by design, in such a way that they exaggerate the benefits of treatments. Unsurprisingly, these trials tend to produce results that favor the manufacturer."

Biases in favor of a positive result for the drug under investigation are well-documented and can result from intentionally fraudulent practices as well as from mere irrational enthusiasm on the part of researchers and the medical community as a whole. Early testing of a new drug tends to be dominated by the manufacturer. Subsequent testing by independent agencies often obtains different results, but by then it is too late to matter much. For more on this topic see Medical Science and the Vanishing Truth.

When people’s lives are at stake, one should be able to expect strong and effective regulators to be involved. Unfortunately, that is not the case.

"When trials throw up results that companies don’t like, they are perfectly entitled to hide them from doctors and patients, so we only ever see a distorted picture of any drug’s true effects. Regulators see most of the trial data, but only from early on in a drug’s life, and even then they don’t give this data to doctors or patients, or even to other parts of the government. This distorted evidence is then communicated and applied in a distorted fashion."

Incredibly, regulators allow embarrassing data to be buried from view as "valuable proprietary information." Drug regulators are subject to the same perverse incentives as regulators of the financial industry. They are subject to varying political winds: at one time they are told that their main job is to protect the public; at another, that their focus should be on facilitating the drug companies’ business by making the introduction of new drugs easier. Regulators are also subjected to the same lobbying practices as all other government actors. Goldacre believes bureaucratic turf protection is also an important factor in suppressing the dissemination of data to others who would have a stake in the analysis. And last, but not least, the easiest way for a regulator to make a lot of money is to quit regulating and take a remunerative position with a company being regulated. That being the case, it makes no sense to anger a potential future employer.

More on the issue of regulation specific to the FDA can be found in How the FDA Fails to Protect Us.

Medicines are dispensed by doctors. Patients like to believe that their physician is a source of reliable information.

"Doctors spend forty years practicing medicine, with very little formal education after their initial training. Medicine changes completely in four decades, and as they try to keep up, doctors are bombarded with information: from ads that misrepresent the benefits and risks of new medicines; from sales reps who spy on patients’ confidential prescribing records; from colleagues who are quietly paid by drug companies; from ‘teaching’ that is sponsored by industry; from independent ‘academic’ journal articles that are quietly written by drug company employees; and worse."

The drug companies spend astounding amounts of money on marketing. If the prescription of medications were based on knowledge of drug performance, why would marketing be necessary? Goldacre provides an apt perspective: the purpose of marketing is to confuse doctors and to undermine evidence-based decision making.

"....we will see that pharmaceutical companies spend tens of billions of pounds every year trying to change the treatment decisions of doctors: in fact, they spend twice as much on marketing and advertising as they do on the research and development of new drugs. Since we all want doctors to prescribe medicine based on evidence, and evidence is universal, there is only one possible reason for such huge spends: to distort evidence-based practice...."

The regulators facilitate the marketers by not demanding that drug companies show that their product is not only better than nothing (the current low bar for entry into the marketplace) but also better than existing products. The result is a proliferation of similar drugs in the marketplace. Competition is then based not on relative effectiveness, which is unknown, but on advertising campaigns.

"....for several of the most important and enduring problems in medicine, we have no idea what the best treatment is, because it is not in anyone’s financial interest to conduct any trials at all."

Perhaps Goldacre’s biggest concern is the lack of comparative data. If a doctor has twenty drugs to choose from, how is she to make a valid choice? Clearly some will provide better long-term outcomes for patients than others. Knowledge of relative efficacy would produce one or two winners; lack of knowledge produces twenty winners. By not selecting the best-performing drugs, doctors put the health of their patients at unnecessary risk.

Goldacre provides plenty of data and examples to illustrate this sorry state of affairs. He includes lengthy discussions of the sleazy and unethical (by normal standards—or even by political standards) activities of highly regarded members of the medical profession. He also details the criminal marketing activities for which numerous drug companies have admitted guilt. He views these as outrages that can and will be addressed as the public becomes more familiar with the harm that is being done to them.

"Bad behavior in marketing departments is unpleasant, but it’s the one thing that has already received public condemnation, because the issues are tangible, with covert payments, misleading messages, and practices that are obviously dishonest, even to the untrained eye. But for all that they may be disappointing, these distortions can be overcome by any good doctor."

Goldacre is most worried about the practice of suppressing the dissemination of data because it is deemed a business threat, an embarrassment, or just plain uninteresting. He provides an example from his own practice in which he was misled by a company that withheld critical data. Goldacre reviewed the published data on an antidepressant, reboxetine, and the data convinced him that it would be appropriate for a particular patient. He thought he had done everything a good doctor should do. Subsequently, he learned the following:

"Seven trials had been conducted comparing reboxetine against placebo. Only one, conducted in 254 patients, had a neat, positive result, and that one was published in an academic journal, for doctors and researchers to read. But six more trials were conducted, in almost ten times as many patients. All of them showed that reboxetine was no better than a dummy sugar pill. None of these trials was published. I had no idea they existed."

"It got worse. The trials comparing reboxetine against other drugs showed exactly the same picture: three small studies, 507 patients in total, showed that reboxetine was just as good as any other drug. They were all published. But 1,657 patients’ worth of data was left unpublished, and this unpublished data showed that patients on reboxetine did worse than those on other drugs. If all this wasn’t bad enough, there was also the side-effects data. The drug looked fine in the trials which appeared in the academic literature: but when we saw the unpublished studies, it turned out that patients were more likely to have side effects, more likely to drop out of taking the drug, and more likely to withdraw from the trial because of side effects if they were taking reboxetine rather than one of its competitors."

Nothing done by the drug manufacturer (Pfizer) was illegal.

"Because nobody broke any law in that situation, reboxetine is still on the market, and the system that allowed all this to happen is still in play, for all drugs, in all countries in the world."

As an aside, reboxetine was approved for distribution in Europe where, apparently, one good trial can negate any number of bad ones. It was not approved for use in the US, apparently because the FDA demands two positive trials to negate any number of bad ones. One assumes that Pfizer, try as they might, just couldn’t get a second positive trial.
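The reboxetine story is a textbook case of publication bias, and the mechanism is easy to demonstrate with a toy simulation (invented numbers, not Goldacre’s data): if only the trials that happen to favor a drug get published, the published record will show a benefit even when the drug does nothing at all.

```python
import random

random.seed(1)

def run_trial(n, true_effect=0.0):
    """Simulate one placebo-controlled trial of a drug with a given
    true effect; return the observed drug-minus-placebo difference."""
    drug = sum(random.gauss(true_effect, 1.0) for _ in range(n)) / n
    placebo = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
    return drug - placebo

# 200 small trials of a drug that, in truth, does nothing.
results = [run_trial(n=50) for _ in range(200)]

# Selective publication: only trials that happened to favor the drug
# reach the journals; the rest sit in a file drawer.
published = [r for r in results if r > 0]

mean_all = sum(results) / len(results)
mean_published = sum(published) / len(published)

print(f"mean of all trials:       {mean_all:+.3f}")    # hovers near zero
print(f"mean of published trials: {mean_published:+.3f}")  # clearly positive
```

Averaged over all the trials, the drug shows no effect, but the average over the "published" subset shows a solid apparent benefit. A doctor with access only to the published trials, like Goldacre with reboxetine, would be systematically misled.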

We will conclude with Goldacre’s thought on "missing data."

"Missing data is different, because it poisons the well for everybody. If proper trials are never done, if trials with negative results are withheld, then we simply cannot know the true effects of the treatments we use. Nobody can work around this, and there is no expert doctor with special access to a secret stash of evidence. With missing data, we are all in it together, and we are all misled....evidence in medicine is not an abstract academic preoccupation. Evidence is used to make real-world decisions, and when we are fed bad data, we make the wrong decision, inflicting unnecessary pain and suffering, and death, on people just like us."

Wednesday, June 19, 2013

Can the Mind and Brain Be Different? What Does Free Will Mean?

David Brooks wrote a thought-provoking column for the New York Times: Beyond the Brain. The topic is neuroscience. Brooks suggests that workers in the field have developed an exuberance for their research that outruns what it can actually deliver.

"The field is obviously incredibly important and exciting. From personal experience, I can tell you that you get captivated by it and sometimes go off to extremes, as if understanding the brain is the solution to understanding all thought and behavior."

Brooks seems to be suggesting that thought and behavior can be independent of brain function.

He provides this summary of what he views as extreme positions taken by neuroscientists.

"At the lowbrow level, there are the conference circuit neuro-mappers. These are people who take pretty brain-scan images and claim they can use them to predict what product somebody will buy, what party they will vote for, whether they are lying or not or whether a criminal should be held responsible for his crime."

"At the highbrow end, there are scholars and theorists that some have called the "nothing buttists." Human beings are nothing but neurons, they assert. Once we understand the brain well enough, we will be able to understand behavior. We will see the chain of physical causations that determine actions. We will see that many behaviors like addiction are nothing more than brain diseases. We will see that people don’t really possess free will; their actions are caused by material processes emerging directly out of nature. Neuroscience will replace psychology and other fields as the way to understand action."

Then Brooks adds this curious observation:

"These two forms of extremism are refuted by the same reality. The brain is not the mind."

If the brain is not the mind, then what is the mind? Where might it reside? Does Brooks believe in a nonphysical object such as a soul that determines our actions? How about the positions of the stars and planets at the instant of our birth?

Brooks seems captivated by the notion that humans possess "free will," something that supersedes mere brain function.

"....there is the problem of agency, the problem that bedevils all methods that mimic physics to predict human behavior. People are smokers one day but quit the next. People can change their brains in unique and unpredictable ways by shifting the patterns of their attention."

The brain is an extremely complex construct with many operational components. These components and the interactions between them can adjust in an attempt to accommodate changes in the demands placed upon them. One can claim that the brain is changing continually as we progress through each day. An act of "free will" is not required.

In fact, if the brain is the only relevant organ we have, the answer to the question of whether we have "free will" or whether our actions are the outcome of complex neurological processes becomes obvious. Complex neurological processes are all we have to work with; the issue becomes confused only because we have too little respect for the complexity of our brain and too much respect for our conscious processes.

A typical act of "free will" is to make a decision and select one of a number of options. Often there is no obvious choice. How do we proceed? We examine each option until we decide that one of them "feels right." Presumably, at this point the chemical balances at the various pleasure/satisfaction sites in our brain have been mated with emotional responses and optimized by the anticipated solution.

We recognize that this is a neurochemical process without even realizing it. How often do we address the need for a difficult decision by deciding to "sleep on it"? This is recognition that our futile conscious thrashing about can be replaced by subconscious thrashing about. We are essentially getting out of the way in order to allow our subconscious brain to address the problem. And how often do we return to the issue with a new perspective, or suddenly conclude that the appropriate choice has become obvious, and congratulate ourselves for having worked so diligently on the problem?

One can think of this as an exercise of "free will" if one wishes. It certainly is comforting to think so. But in reality we only have neural signaling and chemical levels to work with. That is all there is. And it’s pretty damn impressive!

Monday, June 17, 2013

Peak Water and Food Bubbles

The concept of peak oil has been much discussed over the past several decades. Less note has been taken of the concept of "peak water." Whereas peak oil was related to an anticipated depletion of the total supply of oil, peak water is a more nuanced concept. Peter Gleick provides a discussion in the article Is the U.S. Reaching Peak Water?

"To be clear, "peak water" doesn’t mean the U.S. or the world is running out of water. Overall, there is plenty of water on the planet and it is (mostly) a renewable resource. But there are serious physical, environmental, and economical constraints on water availability that make regional water problems increasingly urgent."

Gleick provides three categories of water usage to illustrate different issues: peak renewable water, peak non-renewable water, and peak ecological water. The first concerns surface or rapidly-renewed subsurface water supplies.

"For a number of major river basins, we have reached the point of peak renewable water limits, including the Colorado River in the United States. All of the water of the Colorado (indeed, more than 100% of the average flow) is already spoken for through legal agreements with the seven US states and Mexico and in a typical year river flows now often fall to zero before they reach their ends. This is true for a growing number of rivers around the world."

The nonrenewable category refers to sources that are accessible, but for geologic reasons are effectively non-rechargeable. Usage of these sources ultimately leads to depletion and increased costs for extraction.

"This kind of unsustainable groundwater use is already occurring in the Ogallala Aquifer in the Great Plains of the United States, the North China plains, parts of California’s Central Valley, and numerous regions in India. In these basins, extraction may not fall to zero, but current rates of pumping cannot be maintained. Worldwide, a significant fraction of current agricultural production depends on non-renewable groundwater. This is extremely dangerous for the reliability of long-term food supplies."

Water use affects not only humans, but all of nature.

"By some estimates, humans already appropriate almost 50% of all renewable and accessible freshwater flows, leading to significant ecological disruptions. Since 1900, half of the world’s wetlands have disappeared. The number of freshwater species has decreased by 50% since 1970, faster than the decline of species on land or in the sea. The term "peak ecological water" refers to the point where taking more water for human use leads to ecological disruptions greater than the value that this increased water provides to humans."

Gleick suggests that the US has already exceeded "peak" conditions in all three categories. He provides this supporting data:

Water use peaked over 30 years ago. One must conclude that if more water could have been economically extracted from the environment, it would have been.

Gleick is interested in business and investment issues. He sounds an alarm by indicating the potential risk to food supplies from over-extraction of water, but finishes on a high note by pointing out that even though water usage has leveled off, our economy has continued to grow. In other words, water productivity has continued to increase and it is not yet clear that water availability has been a hindrance to the economy.

Lester R. Brown, president of the Earth Policy Institute, is directly concerned with the security of the world’s food supply in his book Full Planet, Empty Plates: The New Geopolitics of Food Security. He is greatly concerned about the tendency to use water in an unsustainable manner because the ability to produce food is tied to the water supply. If water usage is unsustainable, then food production is unsustainable. He refers to such a condition as living in a "food bubble."

"We live in a world where more than half the people live in countries with food bubbles based on overpumping. The question for each of these countries is not whether its bubble will burst, but when."

Brown provides this list of the "bubble" countries.

Note that most countries in the Middle East make his list. The causes and political ramifications of food shortages in that region have been discussed in Climate Change, Food Security, and Revolution.

Brown provides a short summary of the water/food supply issues in the major countries. A measure of a country’s capability to utilize water supplies to produce food is its ability to increase or maintain land under irrigation.

"In most of the leading U.S. irrigation states, the irrigated area has peaked and begun to decline. In California, historically the irrigation leader, a combination of aquifer depletion and the diversion of water to fast-growing cities has reduced the irrigated area from nearly 9 million acres in 1997 to 8 million acres in 2007. In Texas, the irrigated area peaked in 1978 at 7 million acres, falling to some 5 million acres in 2007 as the thin southern end of the Ogallala aquifer that underlies much of the Texas panhandle was depleted."

In China, Brown is concerned about the long-term health of agriculture in the North China Plain, an area that produces half of its wheat and a third of its corn. The region receives little rainfall and is dependent on groundwater.

"China has had ample warning. A groundwater survey done more than a decade ago by the Geological Environment Monitoring Institute (GEMI) in Beijing found that under Hebei Province, in the heart of the North China Plain, the average level of the deep aquifer dropped 2.9 meters (nearly 10 feet) in 2000. Around some cities in the province, it fell by 6 meters in that one year alone."

"....concerns are mirrored in the unusually strong language of a World Bank report on China’s water situation that foresees ‘catastrophic consequences for future generations’ unless water use and supply can quickly be brought back into balance."

Of most concern to Brown is India, a country of over a billion people that is barely able to feed itself today, yet is expected to add another half billion before its population peaks.

"In this global epicenter of well drilling, where farmers have drilled 21 million irrigation wells, water tables are dropping in much of the country....The wells, powered by heavily subsidized electricity, are dropping water tables at an accelerating rate. In North Gujarat, the water table is falling by 6 meters, or 20 feet, per year. In some states half of all electricity is now used to pump water."

"In communities where underground water sources have dried up entirely, all agriculture is rainfed and drinking water is trucked in. Tushaar Shah, a senior fellow at the International Water Management Institute, says, ‘When the balloon bursts, untold anarchy will be the lot of rural India’."

Brown expresses this fear:

"For the world as a whole, the near simultaneous bursting of several national food bubbles as aquifers are depleted could create unmanageable food shortages."

And he leaves us with this ominous note:

"Underlying the urgency of dealing with the fast-tightening water situation is the sobering realization that not a single country has succeeded in arresting the fall in its water tables."

A declining water supply joins soil erosion, desertification, population growth, global warming, and the diversion of food stocks to fuel among the threats to world food security. Brown’s book was intended to warn us of the danger of following such an unsustainable path.

"Scientists and many other concerned individuals have long sensed that the world economy had moved onto an environmentally unsustainable path....It now seems that the most imminent effect will be tightening supplies of food. Food is the weak link in our modern civilization—just as it was for the Sumerians, Mayans, and many other civilizations that have come and gone. They could not separate their fate from that of their food supply. Nor can we."

Tuesday, June 11, 2013

Social Security and the Indexing of Benefits

There are a number of minor tweaks to the Social Security system that would render it fiscally sound for the indefinite future without any decrease in promised benefits. These modifications would require some or all workers to pay more into the system. Consequently, any such changes are politically unpopular. 

Recently, the issue of which estimate of the consumer price index (CPI) should be used to increase retirees’ benefits has come up for discussion as a way to lower benefits without actually admitting that benefits are being lowered. Robert J. Shiller provided an interesting perspective on Social Security benefits indexing in an article in the New York Times.

Shiller makes the argument that tying benefits to a CPI is misguided because the CPI has little direct correlation with the availability of revenue to support the increase in benefits. Instead, he recommends an index correlated with changes in per-capita GDP. Such a change makes sense, but not necessarily for the reasons Shiller presents.

"The purpose of Social Security is to help families. It reinforces the intergenerational sharing that families already — though imperfectly — provide. It helps retirees by stabilizing their income, and it helps their grown children, who are relieved of any excessive burden of supporting them. This purpose strongly suggests that the Social Security benefits should be indexed to some measure of the available, aggregate economic pie. That means a formula that looks completely different from the ones being discussed today."

Shiller believes it is necessary that this concept of "intergenerational sharing" be incorporated into the social safety net.

"THE point of G.D.P. indexing is to align the interests of the retired with society as a whole. Older Americans should share both the windfalls and the losses with other generations — with working adults, and with children. It shouldn’t be otherwise. If the system’s rules force us to maintain the real benefits of retirees at all costs, we may find ourselves, in difficult times, taking excessively from our children via cuts in education, or from our working adults. Why should retired people feel no effect at all from the recession while younger people are left suffering? They should be sharing the hardships, if we have familial feelings for one another."

Whether the concept of "family" that shapes intergenerational interactions in private life is a suitable analogy for society-wide interactions is arguable. The notion that all generations should suffer together is also on shaky ground. Shiller’s argument would be more compelling if formulated as "all society should benefit or suffer as the good times come and go."

Shiller supplies critical data that offer a more compelling basis for a GDP-based benefits index: it would be a response to growing income inequality.

" fact, since 1982, Social Security benefits as a fraction of G.D.P. have generally been falling, at least until the latest recession."

When Social Security was last modified in the 1980s, it could not have been anticipated that wages, in real dollars, would fall for many people, particularly those most dependent on Social Security benefits for retirement income. Social Security revenue and benefits depend on wages; falling wages contribute to the anticipated revenue shortfall and produce reduced retirement benefits.

Wages and the level of economic activity have lost their correlation. Wealth continues to be created, but the distribution of that wealth has changed, to the detriment of those who depend most on Social Security.

Shiller is correct in asserting that benefits should be managed in a manner that makes sense to society as a whole. What makes sense is to take the opportunity to help retirees recover some of the ground lost as the business community has relentlessly sought to lower wages and eliminate jobs in order to maximize profits.

"The fact is that per capita G.D.P., in current dollars, has grown 1.2 percent faster a year, on average, than the CPI-W in the last 20 years, despite the Great Recession. If we indexed Social Security benefits to G.D.P. per capita, and not to the C-CPI-U, people who retire today and live 20 additional years would get a third more in real, inflation-corrected benefits — provided, of course, that the next 20 years are like the last."

Instead of creating a society that "shares the pain," why not take a step towards a society that keeps its promises? In our capitalist society one is expected to earn an income sufficient to support oneself. It is society’s responsibility to provide the opportunities to accomplish that. It is also society’s responsibility to provide sufficient retirement income to maintain a life of dignity as a senior citizen. When we say we cannot afford to keep our promises, we are really saying that we are not willing to pay the price. We can afford it.

"Achieving the right benefit formula while fixing Social Security’s solvency problem might require increasing the contribution rate — now in the form of 6.2 percent taxes on employees and employers. But so be it. We need to say what is right, keeping our eyes on the integrity of Social Security, which is crucial to our identity as a civil society."

Robert J. Shiller is Sterling Professor of Economics at Yale.
