Monday, October 28, 2013

Distrusting Science

Science can be a stimulating, and sometimes beautiful, field of endeavor. One constructs theories or hypotheses about how physical systems work; then the theorist or others test these concepts by performing experiments to determine their validity. When well-posed experiments can be performed, in which extraneous variables are controlled and only those of interest are operative, knowledge is gained and the theories or hypotheses are either supported or not. The physical systems being tested are reproducible, so others can verify the results of a single experiment. Progress is made, new products may be developed, and more advanced theories ensue.

This is the ideal situation. However, what happens when the physical systems under study are too complex for variables to be controlled? When theories and hypotheses abound and there is no definitive way to test them, the result is often much less than beautiful. Two such complex systems of great importance are the planet earth and the human body. When complexity combines with critical importance and the potential for power, prestige, and economic gain, one discovers that science is less than rigorous, and that scientists suffer from the same human failings as other professionals.

A recent issue of The Economist examined the reliability of science by devoting two articles to the problem: How science goes wrong, and Trouble at the lab. The articles were meant to refer to science in a general sense, but, tellingly, the examples of interest were all from the medical and social sciences.

The main concern of the articles lies in the growing awareness that research findings that have been published in peer-reviewed journals, and accepted as "scientific truth," cannot be reproduced by other investigators.

"A few years ago scientists at Amgen, an American drug company, tried to replicate 53 studies that they considered landmarks in the basic science of cancer, often co-operating closely with the original researchers to ensure that their experimental technique matched the one used first time round. According to a piece they wrote last year in Nature, a leading scientific journal, they were able to reproduce the original results in just six. Months earlier Florian Prinz and his colleagues at Bayer HealthCare, a German pharmaceutical giant, reported in Nature Reviews Drug Discovery, a sister journal, that they had successfully reproduced the published results in just a quarter of 67 seminal studies."

Such evidence has led to this conclusion:

"When an official at America’s National Institutes of Health (NIH) reckons, despairingly, that researchers would find it hard to reproduce at least three-quarters of all published biomedical findings, the public part of the process seems to have failed."

The articles indicate a number of reasons why shoddy scientific research finds its way into the published literature.

"Various factors contribute to the problem. Statistical mistakes are widespread. The peer reviewers who evaluate papers before journals commit to publishing them are much worse at spotting mistakes than they or others appreciate. Professional pressure, competition and ambition push scientists to publish more quickly than would be wise. A career structure which lays great stress on publishing copious papers exacerbates all these problems. 'There is no cost to getting things wrong,' says Brian Nosek, a psychologist at the University of Virginia who has taken an interest in his discipline’s persistent errors. ‘The cost is not getting them published’."

Considerable discussion is given to the misuse of statistics by researchers. Statistical analysis is rather simple when dealing with something like a flipped coin. There are only two possible outcomes, the initial conditions are well defined, and the final condition (heads or tails) is easily determined. Now consider a test of a medical intervention on a human being. Individuals respond differently to a drug; consequently, the results of a study can depend critically on the characteristics of those chosen for testing. There is also rarely any result as unequivocal as the heads or tails of a coin toss. It is often difficult to determine precisely the effect of a medication on an individual patient.

Professor John Ioannidis addressed the statistical issues inherent in medical research and came to this conclusion:

"In 2005 John Ioannidis, an epidemiologist from Stanford University, caused a stir with a paper showing why, as a matter of statistical logic, the idea that only one such paper in 20 gives a false-positive result was hugely optimistic. Instead, he argued, "most published research findings are probably false." As he told the quadrennial International Congress on Peer Review and Biomedical Publication, held this September in Chicago, the problem has not gone away."
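Ioannidis’s statistical logic can be made concrete with a little arithmetic. The sketch below, in Python, uses illustrative numbers of my own choosing; the prior, power, and significance threshold are assumptions, not figures from the article.

```python
# How most published findings can be false even with p < 0.05.
# Illustrative assumptions (mine, not the article's):
prior = 0.10   # fraction of tested hypotheses that are actually true
power = 0.80   # chance a study detects a real effect
alpha = 0.05   # chance a null effect still yields a "positive" result

true_positives = prior * power          # 0.08 of all studies
false_positives = (1 - prior) * alpha   # 0.045 of all studies

# Of the findings reported as positive, what fraction are real?
ppv = true_positives / (true_positives + false_positives)
print(f"fraction of positive findings that are real: {ppv:.2f}")  # 0.64
```

Even with these generous assumptions, roughly a third of the positive findings are false, far worse than the hoped-for one in twenty; lower statistical power or a bolder set of hypotheses pushes the fraction toward Ioannidis’s "most."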

Ioannidis is a respected expert in evaluating the methods and results of medical science. His work was covered more completely in an article in The Atlantic by David H. Freedman: Lies, Damned Lies, and Medical Science. Ioannidis believes that there is much more wrong with medical science than poor statistical analysis.

"….he worries that the field of medical research is so pervasively flawed, and so riddled with conflicts of interest, that it might be chronically resistant to change—or even to publicly admitting that there’s a problem."

It is too easy for research results to contain inadvertent or purposeful biases.

"‘The studies were biased,’ he says. ‘Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there.’ Researchers headed into their studies wanting certain results—and, lo and behold, they were getting them. We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it’s easy to manipulate results, even unintentionally or unconsciously. ‘At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded,’ says Ioannidis. ‘There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded’."

Ben Goldacre lists at least ten ways in which drug trials can be manipulated by those conducting them to produce a misleading result. As an extreme example of such bias, he includes this comment indicating that trial outcomes can depend strongly on the researchers’ financial interest in them.

"In 2007, researchers looked at every published trial that set out to explore the benefit of a statin. These are cholesterol-lowering drugs which reduce your risk of having a heart attack….This study found 192 trials in total, either comparing one statin against another, or comparing a statin against a different kind of treatment….they found that [drug] industry-funded trials were twenty times more likely to give results favoring the test drug….that’s a very big difference."

Medicine is not the only area in which scientific research has had trouble delivering credible results.

Carl Hart has also pointed out that early research into the effects of narcotics and other drugs was often biased in such a way as to produce the results that government funding agencies wished for. This led to major misunderstandings of the nature of human responses to such drugs. The results of this biased research included the unnecessary criminalization of drug use and the wasteful, ineffective mass incarceration of recreational drug users.

One sees disturbing over-reach in the area of climate modeling. Greenhouse gases are increasing and altering the heat load of the planet. The global temperature is rising. That much is clear. What is uncertain is precisely how the planet will respond to this increase in atmospheric carbon load. It is impossible, as of now, to develop a model that accurately treats all the important physical phenomena. All the important planetary responses have probably not even been identified yet, let alone rendered into accurate models.

The danger is that ambitious scientists, eager to make headlines, are promoting model results and making predictions that are little more than educated guesses. When these predictions fail, the climate naysayers are provided with ammunition to support their dangerous views. This is too serious an issue to allow academic competition to undermine the programs that need to be initiated.

The general public does not read the scientific literature. What it knows of science is generally obtained through the popular media. Newspapers will pick up an article on a scientific result if it is newsworthy or has some entertainment value. Exciting new discoveries are what sell. Subsequent studies demonstrating that an exciting new discovery was actually false are not newsworthy and rarely see the light of day. Let the reader beware.

Ben Goldacre is the author of Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients.

Carl Hart is the author of High Price: A Neuroscientist’s Journey of Self-Discovery That Challenges Everything You Know about Drugs and Society.

Wednesday, October 23, 2013

Raining on Fracking’s Parade: The Red Queen Syndrome

The increase in oil and natural gas production as fracking became more efficient and drilling more widespread caused quite a bit of excitement. Last year the International Energy Agency (IEA) predicted that the US would exceed Saudi Arabia in oil production by 2020. Many articles were issued predicting energy "independence" for both total energy and for oil in the coming years. Unfortunately, not everyone agrees with the cheery projections issued by the IEA. In particular, the US Energy Information Administration (EIA) provides an improving, but more limited, picture of future gains.

EIA has the US approaching total energy independence but not quite making it.

The major reason we don’t quite make it is that the surge in oil production from fracking is expected to peak, after which total oil production will resume its long-term gradual decline.

Given this projection, the conclusion is that oil imports, while lower, will have to continue indefinitely.

An article by Asjylyn Loder in Bloomberg Businessweek explains what is behind these predictions and introduces us to the Red Queen syndrome.

Wells produced by fracking behave differently from traditional wells. They are characterized by a large initial surge, followed by a rapid drop-off and then a more gradual but steady decline.

"Chesapeake Energy’s (CHK) Serenity 1-3H well near Oklahoma City came in as a gusher in 2009, pumping more than 1,200 barrels of oil a day and kicking off a rush to drill that extended into Kansas. Now the well produces less than 100 barrels a day, state records show. Serenity’s swift decline sheds light on a dirty secret of the oil boom: It may not last."

These wells, drilled into tight rock and shale, tend to lose 60 to 70 percent of their production in the first year. This rapid decline means that maintaining a steady supply of oil from fracking requires continuous drilling just to stay even. Thus arises the Red Queen syndrome.

"Shale wells start strong and fade fast, and producers are drilling at a breakneck pace to hold output steady. In the fields, this incessant need to drill is known as the Red Queen, after the character in Through the Looking-Glass who tells Alice, ‘It takes all the running you can do, to keep in the same place’."

The level of sustained drilling is enormous and may be reaching diminishing returns as evidence accumulates that the drilling companies started with the most promising sites, and future efforts may not be quite as productive.

"Global Sustainability’s Hughes estimates the U.S. needs to drill 6,000 new wells per year at a cost of $35 billion to maintain current production. His research also shows that the newest wells aren’t as productive as those drilled in the first years of the boom, a sign that oil companies have already tapped the best spots, making it that much harder to keep breaking records. Hughes has predicted that production will peak in 2017 and fall to 2012 levels within two years."
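The treadmill is easy to see in a toy decline model. The numbers below are illustrative assumptions of mine, not field data: each new well starts at 500 barrels per day, loses 65 percent of its output in its first year (the middle of the cited 60 to 70 percent range), and declines a further 20 percent per year thereafter.

```python
# A toy Red Queen model: constant drilling, rapidly declining wells.
def daily_output(age_years):
    """Barrels/day from one well of the given age (assumed profile)."""
    rate = 500.0                              # assumed initial rate, b/d
    for year in range(age_years):
        rate *= 0.35 if year == 0 else 0.80   # -65% year one, -20% after
    return rate

WELLS_PER_YEAR = 1000                         # constant drilling program
for years_elapsed in (1, 5, 10, 20):
    fleet = [daily_output(age) for age in range(years_elapsed)]
    total = WELLS_PER_YEAR * sum(fleet)
    newest_share = daily_output(0) / sum(fleet)
    print(f"year {years_elapsed:2}: {total:>9,.0f} b/d total, "
          f"{newest_share:.0%} from the newest wells")
```

In this sketch total output roughly plateaus after a decade even though drilling never slows, and the newest wells permanently supply more than a third of production; halt the drilling and output falls by about that much within a year. That is the arithmetic behind Hughes’s 6,000-wells-per-year estimate.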

Loder indicates that the costs for these wells are high and vary considerably with terrain.

"The cost of drilling a horizontal shale well ranges from $3.5 million in the Mississippi lime to $9 million or more in the Bakken. That’s far more than the cost of a similar vertical well, which goes from $400,000 to $600,000, according to Drillinginfo."

Loder also provides this illuminating breakdown of the costs.

"The well, known as Begonia 1-30H, will cost about $3.7 million. One-third of that is the cost of fracking: First, thin pipes loaded with explosives are threaded into the hole to blast the ancient reef. Then, at a cost of about $80,000, the Begonia will consume 50,000 gallons of hydrochloric acid to dissolve the limestone; another $68,000 will pay for 1,000 gallons of antibacterial solution to kill microorganisms that chew up the pipes; $110,000 goes for a soapy surfactant to reduce friction; $10,000 covers a scale inhibitor to prevent lime buildup; and $230,000 purchases 2 million pounds of sand to prop the fractures open so the oil and gas can flow into the well. Then there’s $300,000 in pumping charges, plus the cost of equipment rental, pipe, and water, which brings the price tag for fracking the well to $1.2 million. A host of other things, from cement to Porta Potty rentals, accounts for the rest of the cost."
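As a quick check, the itemized figures in that passage can be tallied; all of the numbers below come straight from the quote.

```python
# Summing the quoted line items for fracking the Begonia 1-30H well.
items = {
    "hydrochloric acid":       80_000,
    "antibacterial solution":  68_000,
    "surfactant":             110_000,
    "scale inhibitor":         10_000,
    "sand (proppant)":        230_000,
    "pumping charges":        300_000,
}
itemized = sum(items.values())       # 798,000
remainder = 1_200_000 - itemized     # equipment rental, pipe, water
print(f"itemized subtotal:       ${itemized:,}")
print(f"equipment, pipe, water:  ${remainder:,}")
print(f"fracking share of well:  {1_200_000 / 3_700_000:.0%}")
```

The line items sum to $798,000, leaving roughly $402,000 for equipment rental, pipe, and water to reach the quoted $1.2 million, which in turn is about 32 percent of the $3.7 million well, matching the quote’s "one-third."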

The fracking industry may yet run into environmental restrictions. It has avoided serious problems thus far, but the constant extraction of fresh water needed for the process, and the large fraction that returns as waste water, could become a long-term problem. Cost is also an issue: the breakeven point is currently around $70 per barrel. If yields fall, or the cost of doing business increases, drilling may decline.

Industry spokespersons will counter that fracking is still a developing technology, and that yields will improve and costs will come down.

We shall see.  Meanwhile, the surest path to energy independence requires lowering demand.

Tuesday, October 22, 2013

The Political Insanity Will Continue—As It Always Has?

Adam Gopnik posted an interesting note on The New Yorker website titled The John Birchers’ Tea Party. The point he made was that the current Tea Party looks a lot like the John Birch Society of 50 years ago.

"The real analogue to today’s unhinged right wing in America is yesterday’s unhinged right wing in America. This really is your grandfather’s right, if not, to be sure, your grandfather’s Republican Party."

What has changed over the past 50 years is not so much the incidence of craziness as its distribution. With the conversion of southern Dixiecrats to Republicans, the crazies are now all in one place.

"Half a century ago, the type was much more evenly distributed between the die-hard, neo-Confederate wing of the Democratic Party and the Goldwater wing of the Republicans, an equitable division of loonies that would begin to end after J.F.K.’s death. (A year later, the Civil Rights Act passed, Goldwater ran, Reagan emerged, and we began the permanent sorting out of our factions into what would be called, anywhere but here, a party of the center right and a party of the extreme right.)"

Gopnik uses the recent book Dallas 1963 by Bill Minutaglio and Steven L. Davis to compare the extremists of 1963 to those of the present day. Those authors

"....demonstrate in luxuriant detail just how clotted Dallas was with right-wing types in the period before Kennedy’s fatal visit. The John Birch Society, the paranoid, well-heeled, anti-Communist group, was the engine of the movement then, as the Tea Party is now...."

"....the continuity of beliefs is plain: Now, as then, there is said to be a conspiracy in the highest places to end American Constitutional rule and replace it with a Marxist dictatorship, evidenced by a plan in which your family doctor will be replaced by a federal bureaucrat—mostly for unnamable purposes, but somehow involving the gleeful killing off of the aged. There is also the conviction....that the man presiding over that plan is not just a dupe but personally depraved, an active collaborator with our enemies, a secret something or other, and any necessary means to bring about the end of his reign are justified and appropriate."

Gopnik pinpoints a few examples of the eerie similarities between the environments faced by Kennedy and Obama.

"Medicare then, as Obamacare now, was the key evil. An editorial in the Morning News announced that ‘JFK’s support of Medicare sounds suspiciously similar to a pro-Medicare editorial that appeared in the Worker—the official publication of the U.S. Communist Party.’ At the same time, Minutaglio and Davis write, ‘on the radio, H.L. Hunt (the Dallas millionaire) filled the airwaves with dozens of attacks on Medicare, claiming that it would create government death panels: ‘The plan provides a near[sic] little package of sweeping dictatorial power over medicine and the healing arts—a package which would literally make the President of the United States a medical czar with potential life or death power over every man woman and child in the country’."

"The whole thing came to a climax with the famous black-bordered flyer that appeared on the day of J.F.K.’s visit to Dallas, which showed him in front face and profile, as in a "Wanted" poster, with the headline "WANTED FOR TREASON." The style of that treason is familiar mix of deliberate subversion and personal depravity. ‘He has been wrong on innumerable issues affecting the security of the United States’; ‘He has been caught in fantastic lies to the American people, including personal ones like his previous marriage and divorce.’ Birth certificate, please?"

Gopnik could have gone further with his analogy between the Birchers of then and the Tea Party of now. The Birchers have at times come out in favor of disassembling the Federal Reserve System and returning to the gold standard. They have always been against the United Nations, claiming it was a conduit through which the Communist conspiracy would form a world government that it could then control. These are all themes that occasionally emerge in Tea Party demonstrations.

Gopnik’s claim that the craziness is passed down from one generation to another is clearly supported by the fact that one of the twelve founding members of the John Birch Society was Fred Koch of Koch Industries, father of the two right-wing activist brothers who now run that company and support Tea Party activities.

Gopnik addresses race as an issue and describes the sentiments of the right-wing extremists as being only marginally based on race.

"The common core belief, then and now, is....the federal government exists to take money from hard-working white people and give it to lazy black people, and the President is helping to make this happen. This conviction, then and now, may not fairly be called racist in the sense that it isn’t just (or always) an expression of personal bigotry; rather, it is more like a resentment at an imagined ethnic spoils system gone wrong."

Gopnik could have made one more comparison between Kennedy and his time and Obama and his. Kennedy was a Catholic and, based on today’s data, to be one of the crazies you have to be a Protestant. Hatred between Catholics and Protestants has a longer and bloodier history than any such animosity between whites and blacks. His Catholicism gave Kennedy a hint of that "otherness" that Obama evokes from his political enemies.

A case can be made that race is a big issue in the confrontation between the right-wing extremists and Obama. The Southern Poverty Law Center (SPLC) keeps track of hate groups, antigovernment "patriot" groups, and armed militia movements. It concludes:

"Since 2000, the number of hate groups has increased by 67 percent. This surge has been fueled by anger and fear over the nation’s ailing economy, an influx of non-white immigrants, and the diminishing white majority, as symbolized by the election of the nation’s first African-American president."

"These factors also are feeding a powerful resurgence of the antigovernment "Patriot" movement, which in the 1990s led to a string of domestic terrorist plots, including the Oklahoma City bombing. The number of Patriot groups, including armed militias, has grown 813 percent since Obama was elected – from 149 in 2008 to 1,360 in 2012."

An article in The Economist provided a chart based on the SPLC data and coined the phrase "The Obama Effect."

The ignorant, the violent, and the bigoted get disturbed when a Democrat is elected president; they really get riled up when that Democrat is also black.

Gopnik’s article provides a timely correlation between the current madness and past madness. But what are we to draw from that correlation? That madness is conserved through time and that the similarly mad have always been among us? But the John Birch Society had a definite beginning, and its main concern was Communist domination. That is a fear that can be firmly placed in time.

Consider this comment by Archibald MacLeish written in 1949 and reproduced in the Introduction to Studs Terkel’s wonderful oral history "The Good War".

"Never in the history of the world was one people as completely dominated, intellectually and morally, by another as the people of the United States by the people of Russia in the four years from 1946 through 1949. American foreign policy was a mirror image of Russian foreign policy: whatever the Russians did, we did in reverse. American domestic politics were conducted under a kind of upside-down Russian veto: no man could be elected to public office unless he was on record as detesting the Russians, and no proposal could be enacted, from a peace plan at one end to a military budget at the other, unless it could be demonstrated that the Russians wouldn’t like it. American political controversy was controversy sung to the Russian tune; left-wing movements attacked right-wing movements not on American issues but on Russian issues, and right-wing movements replied with the same arguments turned round about....."

"All this....took place not in a time of national weakness or decay but precisely at the moment when the United States, having engineered a tremendous triumph and fought its way to a brilliant victory in the greatest of all wars, had reached the highest point of world power ever achieved by a single state."

This is a description of a country having a nervous breakdown, of a country that had lost faith in itself. There is the reek of fear about that quote. What could have caused that fear?

World War II ended with the dropping of two nuclear weapons. The world could never be the same after that. The United States had to face the task of fighting the next war with nuclear weapons. It would take a while before the absurdity of that concept became evident. We also had to face the certainty that the Russians would possess nuclear weapons. In that eventuality, we could no longer be protected by the broad oceans that separated us from potential enemies. A single bomb could destroy an entire city. We could no longer feel quite so "exceptional."

The Europeans suffered mass devastation and patiently went about rebuilding their nations and developing kinder, gentler versions of their former selves. We merely feared mass devastation and went bonkers, descending into wave after wave of destructive paranoia.

It is comforting to view our madness as having a specific beginning. One can then more easily believe that it can be dissipated or subdued. Rather than being passed from generation to generation as a genetic defect, it is a cultural defect that is transmitted. Those are a bit easier to deal with.

Saturday, October 19, 2013

Race-Based Affirmative Action: When It Was for Whites

Affirmative action became an issue after the civil rights gains made by blacks in the 1960s. It was recognized that centuries of slavery followed by generations of discrimination had put the black population at a severe disadvantage, and that the mere elimination of discrimination would not be sufficient to overcome it. The path affirmative action took was to legislate nondiscriminatory policies in hiring and other social and economic transactions, and to attempt to augment opportunities for blacks and other minorities in activities where it was deemed appropriate for the engaged population to be representative of the local demographics. Higher education and public service occupations such as policing and firefighting were obvious diversity targets. Such race-based considerations are becoming less popular and have an uncertain legal future.

Ira Katznelson makes the case that to understand the issues surrounding affirmative action one must begin not from the perspective of the 1960s, but from the New Deal legislation of the 1930s. In his book When Affirmative Action Was White, he describes how the social legislation of the 1930s and 1940s, intended to help the large fraction of the population in need, was constructed in such a way as to exclude most blacks and some other minorities. The result was, in effect, affirmative action for whites. As a result of these government actions, the divergence in the economic health of the white and black populations grew even greater.

This part of our history is unfamiliar to the typical citizen. If affirmative action is intended to redress grievances for past behavior, then it should be cast with a full understanding of all the grievances involved. Katznelson’s study leads him to conclude that a totally different program is required. As we shall see, his most viable approach need not be based on race at all.

To understand the structure of the New Deal era legislation it is necessary to understand the political realities of the period. Roosevelt’s party, the Democratic Party, was made up of two components: a southern bloc that was liberal in terms of social legislation as long as its racial caste system was left untouched, and a non-southern bloc that was eager to pursue progressive legislation. The Republican Party was fiscally conservative and pro-business. The southern bloc would align with the other Democrats when convenient, or with the Republicans when necessary to protect its racial system. Roosevelt could not be elected president without the votes of the southern states, and he could not pass any of his social legislation without the support of the southern legislators. Consequently, any legislation passed had to be consistent with the maintenance of the "southern way of life."

The "southern way of life" has always been based on maintaining a low-wage work force—and it still is today. The region had a mostly agricultural basis that required a large supply of manual labor. Slavery was the initial solution. When that was prohibited, it turned to convict labor and sharecropping as means of keeping the cost of labor low. The South was actually pro-union, or at least neutral to labor unions until they began to make inroads in the region and people began to worry about the tendency to treat whites and blacks more like equals. At that point the southern politicians teamed with the pro-business Republicans to pass legislation that crippled the union movement.

It should be noted that all the efforts to maintain low wages for blacks also hindered the southern white population. The caste system required that blacks be kept in poverty, but it also required that a population of whites must be maintained at a level just above that of the blacks. The role of the poor whites was to serve as a ceiling on the aspirations of the blacks. Racial hatred was the unofficial policy of the region. It was assumed that these poor whites would not allow significant numbers of blacks to surpass them economically or socially.

Katznelson describes a series of laws and policy decisions from the New Deal period, detailing how racial bias was built into them. We will briefly discuss the Social Security Act, the Fair Labor Standards Act (FLSA), and the GI Bill to illustrate the basis for his claim of "affirmative action for whites."

The Social Security Act was passed in 1935. From Wikipedia:

"The Act provided benefits to retirees and the unemployed, and a lump-sum benefit at death. Payments to current retirees are financed by a payroll tax on current workers' wages, half directly as a payroll tax and half paid by the employer. The act also gave money to states to provide assistance to aged individuals (Title I), for unemployment insurance (Title III), Aid to Families with Dependent Children (Title IV), Maternal and Child Welfare (Title V), public health services (Title VI), and the blind (Title X)."

The southern politicians were in favor of this legislation. They badly needed an influx of federal money to support their states, but they also wanted to control how the money was spent and who would receive the funds.

"At the hearings in the Senate, Harry Byrd, the leader of Virginia’s powerful Democratic machine, cautioned that unless adequate protections were introduced, Social Security could become an instrument by which the federal government would interfere with the way white southerners dealt with ‘the negro question’."

The strategy the southerners adopted and used throughout this period was to exclude from the provisions of the law those work descriptions that were predominantly populated by blacks, and to insist that social welfare programs be administered at the state and local levels rather than having national standards imposed.

They succeeded in having agricultural and domestic workers excluded from coverage under Social Security legislation.

"Across the nation, fully 65 percent of African Americans fell outside the reach of the new program; between 70 and 80 percent in different parts of the South. Of course, this excision also left out many whites; indeed, some 40 percent in a country that was still substantially agrarian. Not until 1954, when Republicans controlled the White House, the Senate, and the House of Representatives, and southern Democrats finally lost their ability to mold legislation, were occupational exclusions that had kept the majority of blacks out of the Social Security system eliminated."

The social support aspects of the Social Security legislation, namely aid to dependent children (ADC), support for the elderly, and unemployment compensation, were given over to state administration.

"These were federal programs whose costs were to be shared between the federal government and the states; even more important, these policies would be decisively shaped and administered by the individual states, which were granted a great deal of discretion in setting benefit levels."

In program after program, and state after state, blacks were discriminated against as whites received preferential treatment in the South.

"Consider the situation in Georgia. Of the nearly 24,000 white and 23,000 black children eligible for aid [ADC] in 1935, the state offered funds to only a small fraction. There was a huge disparity by race. Drawing on both a Social Security survey and an account by the State Department of Public Welfare, Swedish demographer Richard Sterner found that ’14.4 percent of white eligibles but only 1.5 percent of the Negro eligibles’ were funded."

Note how the intention to discriminate against blacks caused harm to many poor whites as well.

The Fair Labor Standards Act was passed in 1938. From Wikipedia:

"The FLSA introduced a maximum 44-hour seven-day workweek, established a national minimum wage, guaranteed "time-and-a-half" for overtime in certain jobs, and prohibited most employment of minors in "oppressive child labor", a term that is defined in the statute."

This was a major piece of legislation that revolutionized the relationship between workers and employers. The enactment of a national minimum wage set a floor on the compensation that employers must provide. The southern politicians again insisted that agricultural workers and domestics be excluded from the reach of this law before they would allow it to pass. Once again the majority of blacks were excluded, along with many others.

The G.I. Bill was passed in 1944. From Wikipedia:

"The Servicemen's Readjustment Act of 1944.... known informally as the G.I. Bill, was a law that provided a range of benefits for returning World War II veterans (commonly referred to as G.I.s). Benefits included low-cost mortgages, low-interest loans to start a business, cash payments of tuition and living expenses to attend college, high school or vocational education, as well as one year of unemployment compensation. It was available to every veteran who had been on active duty during the war years for at least ninety days and had not been dishonorably discharged; combat was not required."

This act excluded no veteran who had served. There was no overt racial bias indicated. However, the southern politicians again insisted that the administration of the act be performed locally rather than nationally. This would be deemed one of the biggest and most successful federal programs of all time, but most black veterans would be unable to take advantage of it.

If a black man wished to go to a college or a vocational school, he most likely lived in a state with segregated schools, where only a few all-black schools existed. They were of poorer quality than the all-white schools and were in no position to accommodate large numbers of new applicants. Outside the South, most universities had racial quotas limiting the number of blacks. Some veterans did manage to take advantage of the opportunity, but most could not.

If a black man wished to take advantage of the low-interest, zero-down-payment home loans being offered, he had to convince a white banker somewhere to find him loan-worthy. Banks were not making mortgage loans to blacks before the war; why should they change policy after it? With northern and western cities even more segregated in housing than the South, there was no place to go anyway.

This embedded discrimination severely limited blacks’ access to the many provisions of the G.I. Bill and prompted Katznelson to conclude:

"On balance, despite the assistance that black soldiers received, there was no greater instrument for widening an already huge racial gap in postwar America than the GI Bill. As southern black veterans attempted to gain from these new benefits, they encountered many well-established and some new restrictions. This combination of entrenched racism and willful exclusion either refused them entry or shunted them into second-class standing and conditions."

Katznelson believes an effective—and legally defensible—affirmative action effort should be directed at correcting specific harm to specific individuals or their descendants. The differences in treatment accorded blacks under the programs described above form a basis for justifying some form of compensation. If it is too cumbersome to track individuals and their claims for compensation, the next best approach is to initiate a broad assault on inequality and poverty among those affected by discrimination.

Katznelson focuses almost exclusively on the harm done to blacks. However, the tools of discrimination were quite blunt and also harmed large numbers of poor whites. Many Latinos would have also been subject to the same forms of discrimination as blacks. It would seem that the best approach to redressing wrongs through affirmative action is an effort to reduce poverty and inequality irrespective of race. That approach would capture the greatest number of those who had a discrimination-based claim, and do some good in general.

Is such a thought ridiculous? Perhaps not. Katznelson refers to the G.I. Bill as a social revolution.

"More than 200,000 used the bill’s access to capital to acquire farms and start businesses. Veterans Administration mortgages paid for nearly 5 million new homes.

"By 1950, the federal government had spent more on schooling for veterans than on expenditures for the Marshall Plan...."

"By 1948, 15 percent of the federal budget was devoted to the GI Bill...."

Today, 15 percent of the federal budget would amount to about $570 billion per year. That is a scale big enough to accomplish great changes within our society. If we could have a social revolution while emerging from the Great Depression and World War II, why can’t we do it again—now?

Wednesday, October 16, 2013

The G.I. Bill of Rights, Social Revolution, and the Future

In 1944 Congress passed the Servicemen’s Readjustment Act, commonly referred to as the G.I. Bill of Rights or, more simply, the G.I. Bill.

"Benefits included low-cost mortgages, low-interest loans to start a business, cash payments of tuition and living expenses to attend college, high school or vocational education, as well as one year of unemployment compensation. It was available to every veteran who had been on active duty during the war years for at least ninety days and had not been dishonorably discharged; combat was not required."

The G.I. Bill is considered possibly the most effective piece of legislation ever passed by Congress. Ira Katznelson devotes a chapter to it in his book When Affirmative Action Was White. Katznelson’s subject was the systematic way in which New Deal legislation was biased to favor whites over minorities in general, and blacks in particular. While the G.I. Bill was unique in being written in a race-neutral manner, its implementation was left to local customs and situations where discrimination was common, particularly in the South, but also in other parts of the nation. The topic of race-based affirmative action for whites is important and will be discussed in a subsequent post.

Of interest here is Katznelson’s reference to the G.I. Bill as a "social revolution." Political scientists might argue about the application of such a term in this case, but to the uninitiated it seems quite appropriate.

The government faced the prospect of 16 million military personnel returning to a home that no longer existed. The men had been changed by the war experience, and the homeland had also been changed. The war had altered the economy and society in so many ways that a social revolution was inevitable. The task for the government was to devise a means whereby the returning soldiers could be accommodated in this new world.

The main features of the law were designed to provide time and resources to help those returning make the transition to civilian life. Katznelson provides this perspective:

"Even today, this legislation, which quickly came to be called the GI Bill of Rights, qualifies as the most wide-ranging set of social benefits ever offered by the federal government in a single comprehensive initiative....it reached eight of ten men born during the 1920s."

"One by one, family by family, these expenditures transformed the United States by the way they eased the pathway of the soldiers—the generation that was marrying and setting forth into adulthood—returning to civilian life. With the help of the GI Bill, millions bought homes, attended college, started business ventures, and found jobs commensurate with their skills....this legislation created middle class America. No other instrument was nearly as important."

The scale of the investment in human capital is staggering to those accustomed to today’s parsimonious legislators.

"More than 200,000 used the bill’s access to capital to acquire farms and start businesses. Veterans Administration mortgages paid for nearly 5 million new homes. Prior to the Second World War, banks often demanded that buyers pay half in cash and imposed short loan periods, effectively restricting purchase to the upper middle class and upper class. With GI Bill interest rates capped at modest rates, and down payments waived for loans up to thirty years, the potential clientele broadened dramatically."

The government spent more on educating its returning soldiers than it spent on rebuilding devastated Europe.

"By 1950, the federal government had spent more on schooling for veterans than on expenditures for the Marshall Plan....On the eve of the Second World War, some 160,000 Americans were graduating from college each year. By the end of the decade, this number had tripled, to some 500,000. By 1955, about 2,250,000 veterans had participated in higher education. The country gained more than 400,000 engineers, 200,000 teachers, 90,000 scientists, 60,000 doctors, and 22,000 dentists....Another 5,600,000 veterans enrolled in some 10,000 vocational institutions to study a wide array of trades from carpentry to refrigeration, plumbing to electricity, automobile and airplane repair to business training."

"For most returning soldiers, the full range of benefits—the entire cost of tuition plus a living stipend—was relatively easy to obtain...."

The numbers quoted above indicate the scale of the investment in a population that was a bit less than half of the nation’s current population. Another way to examine the immensity of the program is by looking at expenditures.

"By 1948, 15 percent of the federal budget was devoted to the GI Bill...."

Today, 15 percent of the federal budget would amount to about $570 billion per year.

One could summarize the goal of the G.I. Bill as guaranteeing the returning soldiers the means to develop their skills to the limit of their abilities in order to earn a living. Note that popular opinion appended the term "Rights" when referring to the program. That was a noble goal and sentiment then. Why has this sentiment, and the concept of a "right" to earn a living, been allowed to evaporate rather than be applied to successive generations?

One could argue that the economic situation facing young adults today is just as unnerving as the one soldiers would have encountered after World War II without the G.I. Bill. The economy is not providing enough jobs for everyone, and most of the jobs being created are of the low-wage, non-living-wage variety. Today’s economy is not the one the returning soldiers faced, so many of the actions taken then would not be appropriate now. But surely some action is called for.

The phrase "full employment economy" occasionally emerges in political and economic dialogue as a specific national goal. Over the last several decades economic conditions have changed dramatically. During that period public economic policy has hardly changed at all. We have one political party that is paralyzed by nostalgia for past solutions, and one party that believes the best way to solve a problem is to do nothing.

The nation exited World War II with a more dire debt problem than the one we face today. Yet it managed to fund a very ambitious and very expensive program to revolutionize our society. The return on investment was huge, and an age of prosperity ensued.

It is time for another social revolution.

Sunday, October 13, 2013

Gun Control in Britain

The precise meaning of the Second Amendment to the Constitution is debatable. One of the arguments made in support of a broad right to possess guns is based on the assumption that this right had been firmly established in English legal tradition. Since the writers of the Constitution were predominantly of English heritage, they must have intended this tradition to continue. That makes sense. However, "the right to bear arms" can mean quite different things to different countries.

An article in The Economist titled Guns for hire indicated that when it comes to guns we in the US live in a quite different world from the British. Guns that could be of use to a criminal are so rare that when one wishes to have a gun to commit a crime, the principal option available is to rent one from another criminal.

"In A turf war in Birmingham, the two gangs involved used the same gun for their tit-for-tat shootings, renting it in turn from the same third party, says Martin Parker, head of forensics at the National Ballistics Intelligence Service (NABIS). The paucity of guns in Britain is both testament to the success of its gun-control regime and one of the reasons for it."

The NABIS organization is part of the anti-crime establishment. It creates a database on the ballistic characteristics of all guns encountered in an attempt to be able to correlate any bullet recovered with a given weapon. Since essentially all handguns are illegal in Britain, this is a practical task; and since scarce weapons tend to be reused by criminals, it is useful in establishing evidence in criminal cases.

"Shortages also mean criminals use weapons repeatedly, leaving a useful trail of evidence. Some, like the Birmingham gangsters, hire them from others. Clean guns—ones that have not been used before—are both rare and expensive. Other countries, such as Ireland and Spain, use the same database system, allowing NABIS to share information and track guns beyond Britain’s borders. America uses it too, but tracking the guns used in crimes there would be a Sisyphean task: few guns are used repeatedly there because it is so easy to buy new ones. Gun-starved Britons cannot be so cavalier."

It is informative to view how "the right to bear arms" has evolved in Britain. Wikipedia provides an excellent summary of that history.

Yes, British subjects did have the right to bear arms, and that right was provided so that guns could be used in self-defense, but the right of the crown to limit access to weapons was at least as strong as the right of ownership. During one of the main squabbles over succession, the then-ascendant Catholics tried to confiscate the arms of the Protestants. That did not end well. As a result, the right to possess weapons for self-defense was restated in the Bill of Rights in this manner:

"Declare,....That the Subjects which are Protestants may have Arms for their Defence, suitable to their Condition, and as allowed by Law."

This right was similarly noted in Blackstone’s Commentaries, the recognized basis for common law in England and also in the American colonies—soon to become the United States of America.

"The fifth and last auxiliary right of the subject, that I shall at present mention, is that of having arms for their defence, suitable to their condition and degree, and such as are allowed by law."

A law based on such a statement might be ruled unconstitutional in the US because it could be said to be inconsistent with the intent of the country’s founders, even though those founders based the initial US legal system on English common law, and particularly on Blackstone’s description of it.

The British approach to the possession of weapons is tied closely to historical concerns about insurrection. The potential for class-based strife has always been an issue, and the state has often been threatened with internal acts of violence as it tried to keep England, Wales, Scotland, Northern Ireland, and Ireland under one big tent. The result has been a populace willing to accept limitations on access to weapons. Even with strict controls in place, the British have endured significant acts of violence. The response has always been to institute ever more stringent restrictions.

This is what is involved in acquiring a gun license in Britain.

"With a few specialised exceptions, all firearms in the United Kingdom must be licensed on either a 5-year firearm certificate (FAC) or a shotgun certificate (SGC)issued by the police for the area in which they normally reside."

"When applying for a firearm certificate, justification must be provided to the police for each firearm, and they are individually listed on the certificate by type, calibre, and serial number. A shotgun certificate similarly lists type, calibre and serial number, but permits possession of as many shotguns as can be safely accommodated. To gain permission for a new firearm, a "variation" must be sought, for a fee, unless the variation is made at the time of renewal, or unless it constitutes a one-for-one replacement of an existing firearm that will be disposed of. The certificate also sets out, by calibre, the maximum quantities of ammunition someone may buy or possess at any one time, and is used to record ammunition purchases (except where ammunition is bought to use immediately on a range under s11 or s15 of the Firearms Act)."

Not only are weapons controlled but ammunition as well.

Obtaining this firearm certificate is a process rather more involved than simply checking a person’s name against a national database.

"To obtain a firearm certificate, the police must be satisfied that a person has "good reason" to own each firearm, and that they can be trusted with it "without danger to the public safety or to the peace". Under Home Office guidelines, firearms certificates are only issued if a person has legitimate sporting, collecting, or work-related reasons for ownership. Since 1968, self-defence has not been considered a valid reason to own a firearm. The current licensing procedure involves: positive verification of identity, two referees of verifiable good character who have known the applicant for at least two years (and who may themselves be interviewed and/or investigated as part of the certification), approval of the application by the applicant's own family doctor, an inspection of the premises and cabinet where firearms will be kept and a face-to-face interview by a Firearms Enquiry Officer (FEO) also known as a Firearms Liaison Officer (FLO). A thorough background check of the applicant is then made by Special Branch on behalf of the firearms licensing department. Only when all these stages have been satisfactorily completed will a licence be issued, which must be renewed every 5 years."

The notion that people in the US believe they should have the right to walk around openly carrying loaded weapons must be viewed with astonishment in Britain. This occurred in December 2012 in Connecticut:

"A 20-year-old man wearing combat gear and armed with semiautomatic pistols and a semiautomatic rifle killed 26 people — 20 of them children — in an attack in an elementary school in central Connecticut on Friday. Witnesses and officials described a horrific scene as the gunman, with brutal efficiency, chose his victims in two classrooms while other students dove under desks and hid in closets."

"Hundreds of terrified parents arrived as their sobbing children were led out of the Sandy Hook Elementary School in a wooded corner of Newtown, Conn. By then, all of the victims had been shot and most were dead, and the gunman, identified as Adam Lanza, had committed suicide. The children killed were said to be 5 to 10 years old."

The government’s response to this act has yet to arrive. Gun-rights activists, meanwhile, were moved to conclude that such events would be less likely to occur if every citizen were armed at all times (5 to 10 year-olds?).

This incident, referred to as the Dunblane Massacre, occurred in Britain in 1996.

"On 13 March 1996, Thomas Hamilton, aged 43, a former scout leader who had been ousted by The Scout Association in 1974, shot dead sixteen young children and their teacher, Gweneth Mayor, in Dunblane Primary School's gymnasium with two Browning Hi-Power 9 mm pistols and two S&W .357 Magnum revolvers. He then shot himself."

As a result of this incident the police improved their communications between different departments and instituted more complete background checks. Then the nation went much further and passed the 1997 Firearms Act.

"Following the Dunblane massacre, the government passed the Firearms (Amendment) (No. 2) Act 1997, banning private possession of handguns almost completely. Exceptions to the ban include muzzle-loading "black powder" guns, pistols produced before 1917, pistols of historical interest (such as pistols used in notable crimes, rare prototypes, unusual serial numbers and so on), starting pistols, pistols that are of particular aesthetic interest (such as engraved or jewelled guns) and shot pistols for pest control."

So complete is this ban on handguns that British shooting teams must leave the country to practice for international competitions. Special dispensation had to be issued by the government before the relevant shooting events in the 2012 Olympics could proceed. Handguns are intended for killing, not for sport.

How can two nations with so much in common develop such different approaches to handling firearms? One could make a good argument that life in the USA as it expanded and matured required a more robust relationship with weapons than life in closed and contained Britain. However, there are times when one might suspect that the British kicked all their crazies out of the country and sent them to America to breed their little crazies over there. Once the transmission was complete, they encouraged the American colonies to revolt so that they could be done with them forever.

Wednesday, October 9, 2013

Law, Memory, and the Memory Wars

Our system of law entitles those accused of a crime to be judged by a jury of peers. Jury members are often asked to judge the veracity of various witnesses’ recollections of events. Two issues must be considered: whether the witness is being truthful, and whether the witness’s recollections are accurate. People have much experience in dealing with those who seek to deceive them, and might feel comfortable judging whether the truth is being told. But what about the accuracy of memory? What guidance do ordinary people have in assessing whether a person’s memories are accurate?

Alison Winter has produced an excellent discussion of these issues in her book Memory: Fragments of a Modern History. She provides a history of the various fads and beliefs regarding memory that have come and gone over the past century. Two contending views of memory have battled for supremacy over the years. One view has it that:

"....in principle one is capable of remembering every scrap of experience—such as the shape of a crack in the ceiling in the bedroom where you slept when you were five....This kind of claim has in fact been a theme of many of the projects discussed....It is no less present now."

Not everyone can recover these memories on demand, but there were always those who claimed to be able to assist in recalling these inaccessible experiences using hypnosis, drugs, psychoanalysis, or combinations of all three.

The opposing school of thought viewed memory as a complex phenomenon highly subject to conscious and, more importantly, subconscious biases and mechanisms that confounded any claim to accuracy. These people liked to use the term "confabulation" to describe the process in the context of legal proceedings.

."Confabulation was a psychoanalytic term dating back to the early years of psychoanalysis, referring to a process that knitted memories of past experiences together with fantasies or suggestions to form an imaginative construction that appeared to be a memory. In this context it meant a constructive, imaginative filling-in-the-blanks in which incoming suggestions from an interlocutor played a key role."

Winter does not come out and declare a winner between these two views, but she presents a sequence of confrontations between the two in which the latter view always seems to come out the victor. Given the important role memory can play in legal proceedings it is not surprising that the battle between these two sides was often waged in that arena.

This struggle for dominance reached an apex in the 1980s and 1990s when a rash of claims by adults of "repressed" memories of childhood sexual abuse, usually by parents, emerged as an important legal issue. One view of memory supported the accuser, the other supported the accused. In such an emotionally-charged context the confrontation was intense and came to be referred to as "the memory wars."

A typical case might involve a woman in her twenties or thirties who had no memory of such a childhood experience until it suddenly emerged in adulthood. The construct that supported the validity of such recovered memories emerged from considerations of posttraumatic stress disorder (PTSD). In these cases, the traumatic memories were supposedly suppressed or consciously rejected and stored in a "dissociated" form until brought back to consciousness at a later date, when recall was triggered in some way.

As more people came forward with similar stories of sexual abuse, this phenomenon acquired the trappings of a medical disease. Support groups were formed. Diagnoses and cures were proposed. A political dimension was added as feminist groups got involved and described these cases as the outcome of a patriarchal society.

The net effect was to provide a great deal of publicity and to suggest to other women that such an instance of abuse could have happened to them even though they had no memory of it. Psychologists and psychoanalysts began to claim that such suppressed memories had specific physical manifestations in later life. Eating disorders became a convenient choice as a symptom that a therapist could use to pursue a connection to parental abuse in childhood.

The heart-rending stories that emerged elicited much sympathy for the presumed victims. They were encouraged to confront their parents and even take legal action against them as a means of bringing their therapy to a conclusion. Needless to say, the courts had a problem dealing with these suddenly-appearing memories of past events.

Statutes of limitations initially barred most attempts to seek legal punishment of an abusive parent. The growing number of cases, and the sympathy the victims aroused, generated public pressure to provide a path for redress. The courts had wisely been conservative in the face of unusual feats of memory, yet many states eventually modified their legal codes to allow memories of long-past abuse to be entered as evidence in civil cases, but only if they were of the recently recalled variety.

This tortuous logic denied adults with definite and continuous memories access to the courts while providing a path for those who claimed only a recent recollection. Courts also usually assume that old memories are less reliable than fresh ones. This legal reasoning implicitly acquiesced to the controversial notion that repressed memories were somehow equivalent to fresh memories, as if they had been kept stored in some safe container for all those years.

As the number of women claiming abuse grew dramatically, those who doubted the validity of such memories became more alarmed. Certainly, such cases were not impossible, but the possibility that they were generated by the power of suggestion seemed the more likely explanation. Organizing an opposition was difficult because casting doubt was often interpreted as an attack on a helpless victim.

Eventually the accused began to fight back. Peter and Pamela Freyd found themselves suddenly estranged from an adult daughter who accused Peter of having sexually abused her as a child. They convinced themselves that the accusation could not be true, and concluded that if this could be happening to them, then it could also be happening to others. They responded by forming an organization that was intended to provide the exact counter to the ever more prevalent recovered memory syndrome. They called their organization the False Memory Syndrome Foundation (FMSF). It became active in the early 1990s. The memory wars were on!

FMSF formed support groups, encouraged other accused parents to share their stories with one another and with the public, and provided a rallying point for specialists who were dubious of recovered memories and were interested in research that would demonstrate the ways in which false memories could be generated. Since memory is something about which everyone has an opinion, and popular conceptions were important in fashioning legal standards, their cause also took on the features of a public relations campaign. The idea was to wage the battle on the same platforms where recovered memory syndrome had received its public blessing years earlier.

There was a legal aspect to the FMSF strategy, and it turned out to be a very effective one. Those who believed themselves falsely accused were encouraged to countersue, targeting not only their own children but also the therapists who had treated them.

A famous case, and a turning point in the memory wars, involved Gary Ramona, who had been accused of incest by his daughter.

"Holly Ramona had been treated by psychotherapist Marche Isabella in 1989. Isabella gave Ramona the old truth drug sodium amytal in order to recover memories of suspected abuse. After the memories returned, Holly Ramona decided to sue her father. The initial suit was unsuccessful, but this was not the end of the matter. The accusations cost Ramona his marriage, his job, and his reputation. He blamed her therapist. In 1991 he brought suit against Isabella and her medical center, alleging that their irresponsible treatment of his daughter had injured him."

Such third-party suits are difficult to win. Ramona had an advantage in that the therapist's records concerning his daughter had already been made publicly accessible in the course of his daughter's suit against him.

The jury ruled in Ramona’s favor and held the therapist liable. They were uncomfortable with the fact that Holly Ramona went to a therapist looking for help with bulimia and depression and ended up being guided towards a diagnosis of childhood incest. Isabella defended her actions by claiming that bulimia and depression were the body’s response to childhood incest and she was merely following where the symptoms led her.

Isabella was being guided by an unproven assumption that she felt no need to justify; as she saw it, her only responsibility was to her patient. The result of Ramona's victory in court was to radically alter the terms of engagement for therapists.

"....it was assumed that psychotherapists did not have formal obligations to the wider world surrounding their clients. The Ramona decision asserted that they did. It was now impossible to separate the individual biography being developed in the consulting room from the collective memory of the family, or indeed the broader collective memory of the patient’s wider community. The ruling legitimized a popular assumption that all these kinds of memory must align before their representation of the past could be considered ‘true.’ And if they did not, the therapist could be held responsible for injuries."

More cases like that of Ramona were to come. Public, legal, and scientific opinions were beginning to shift.

"In 1993 the American Medical Association declared that recovered memories were ‘of uncertain authenticity’ and needed ‘external verification.’ The American Psychiatric Association warned that it was impossible to distinguish false memories from true ones."

The combination of a less welcoming public response to claims of victimhood and a riskier environment for therapists led to a calming of what some had described as "mass hysteria."

"In the late 1990s, repressed memory syndrome was in disgrace. Therapists struggled to defend themselves as parents sued them for damages. Survivors who had once found it comforting and energizing to speak out publicly now felt isolated. Indeed, while professional false memory proponents continued to be outspoken, it became all but impossible to connect with survivor communities. These communities—or at least individual survivors and probably informal networks of mutual support—still exist, but they are reclusive. It seems that the memory wars ended, and that false memory won."

The false memory believers won, but surely it must feel like a hollow victory. Given that all things are possible, there are likely some genuine victims of childhood sexual abuse who are now denied even the sympathy that society would otherwise accord them. "Victory" may merely mean that society has moved from one extreme to the other.

The lesson to be learned from the memory wars is that memory is unreliable and easily manipulated.

If one takes the admonition of the American Psychiatric Association at face value, no individual's memory can be verified as "true" in a court of law. Yet decisions of life and death are based on such judgments. It is difficult to see how society could proceed without making such decisions, but it should proceed under the assumption that any legal decision might later be proven wrong. The elusiveness and unreliability of memory would seem to make the death penalty unwise, if not absurd.

Winter’s book begins with the story of a man who confessed to murder under police questioning. He continued to profess his guilt even though twelve people provided an alibi indicating he could not have committed the crime, another example of false memory syndrome. His case is not unique. If even confessions of guilt cannot be trusted, the legal system has a responsibility to recognize that fact.

Why make an irreversible decision if you don’t have to?

Thursday, October 3, 2013

Obamacare: The Sum of All Republican Fears

Obamacare is up and running, or at least crawling, and the Republicans are madder than hell, swearing they will take down this evil scheme to provide healthcare for all. Even for those who believe that intelligence and learning are hindrances for Republican legislators, the current stance of the party is exceptional. Why are they acting as if Armageddon is upon us and the nation might have to be destroyed in order to save it? What exactly is so frightening to them about Obamacare?

Ross Douthat tries to provide an intellectual underpinning for this fight to the death in a New York Times column titled Why the Right Fights. He claims that neither the left nor the right appreciates the long-term historical viewpoint of the other. The left views social programs such as Social Security, Medicare, and Medicaid as fundamentals of national life, not as progressive victories; any attack on them is on a par with restricting access to food and water. The left also views the Reagan/Bush years as a period when conservatism had its way and caused untold harm to society.

On the other hand, the small-government forces view the Reagan/Bush years as a failed opportunity. Government growth was not turned around; it was barely slowed down. The current wave of Tea Party enthusiasm has also been unsuccessful, thus far, in reining in big government.

"But to many conservatives, the right has never come remotely close to getting what it actually wants, whether in the Reagan era or the Gingrich years or now the age of the Tea Party — because what it wants is an actually smaller government, as opposed to one that just grows somewhat more slowly than liberals and the left would like. And this goal only ends up getting labeled as "extreme" in our debates, conservatives lament, because the right has never succeeded in dislodging certain basic assumptions about government established by F.D.R. and L.B.J. — under which a slower rate of spending growth is a "draconian cut," an era of "small government" is one which in which the state grows immensely in absolute terms but holds steady as a share of G.D.P., and a rich society can never get rich enough to need less welfare spending per capita than it did when it was considerably poorer."

Douthat suggests that the current standoff stems less from anger at Obamacare as a specific program than from generations of accumulated frustration finally bringing the Republicans to a breaking point.

"So what you’re seeing motivating the House Intransigents today, what’s driving their willingness to engage in probably-pointless brinksmanship, is not just anger at a specific Democratic administration, or opposition to a specific program, or disappointment over a single electoral defeat. Rather, it’s a revolt against the long term pattern I’ve just described: Against what these conservatives, and many on the right, see as forty years of failure, in which first Reagan and then Gingrich and now the Tea Party wave have all failed to deliver on the promise of an actual right-wing answer to the big left-wing victories of the 1930s and 1960s — and now, with Obamacare, of Obama’s first two years as well."

Douthat points out that previous Republican generations failed because what they wanted to do was unpopular, and they did not dare try to implement the changes they wished for.

"....when it comes to the question of whether the state should meaningfully shrink its footprint in our society, American political reality really does seem to have a liberal bias. And so the process....has played out repeatedly in our politics: Conservative politicians take power imagining that this time, this time, they will finally tame the New Deal-Great Society Leviathan … and then they make proposals and advance ideas for doing so, the weight of public opinion tilts against them, and they end up either backpedalling, getting defeated at the polls, or both."

The Republican right—millennial version—comes across in Douthat’s description as a band of eccentric—perhaps even romantic—idealists determined to joust with one more windmill.

Douthat is perhaps too kind to this merry band of men. There are other interpretations of their actions and motives.

Eduardo Porter provides another perspective in a New York Times article: Why the Health Care Law Scares the G.O.P. In his view, the Republican Party’s attempt to deny healthcare coverage to tens of millions is based on the most craven of political calculations.

Porter uses the politics of the state of Missouri to illustrate what only appears to be irrational behavior. There was considerable support for state expansion of Medicaid for the poor and disabled under Obamacare. The Chamber of Commerce favored expansion, pointing out that Missouri hospitals would otherwise lose up to $4.2 billion in federal support over the next six years. Roughly half of the 800,000 uninsured in Missouri would have become eligible for Medicaid.

"Pragmatism suggested accepting the expansion. Washington would pay the extra cost entirely for three years and pick up 90 percent of the bill thereafter."

"And it would expand health coverage in the state’s poor, predominantly white rural counties, which voted consistently to put Republican lawmakers into office."

"Missouri’s Republican-controlled Legislature — heavy with Tea Party stalwarts — rejected Medicaid’s expansion in the state anyway."

Are these Tea Party types a band of idealists standing up for a principle they hold dear? Porter thinks not. In his view, Obamacare could be successful enough to join Social Security, Medicare, and Medicaid as popular programs that become untouchable.

"....the argument that half the Republican Party has simply lost its mind has to be an unsatisfactory answer, especially considering the sophistication of some of the deep-pocketed backers of the Tea Party insurgency."

"There is a plausible alternative to irrationality. Flawed though it may turn out to be, Obamacare, as the Affordable Care Act is popularly known, could fundamentally change the relationship between working Americans and their government. This could pose an existential threat to the small-government credo that has defined the G.O.P. for four decades."

The greatest benefit from Obamacare will be felt by those whose incomes have been too high to qualify for Medicaid but too low to afford health plan premiums. These are exactly the people, mostly white, whom the Republicans depend on to vote them back into office.

"As it turns out, the core Tea Party demographic — working white men between the ages of 45 and 64 — would do fairly well under the law."

"Take Missouri. It has about 800,000 uninsured. Almost half of them would have been eligible for expanded Medicaid benefits, had the Legislature not rejected them. Many of the rest — including families of four making up to $94,000 — will be eligible to get subsidized health insurance."

"In St. Louis, for instance, a family of four making $50,000 a year will be able to buy a middle-of-the-road "silver" health plan for $282 a month and a bottom-end "bronze" plan for $32. Even Medicare recipients will get a benefit worth a few hundred dollars a year."

"This could justify conservative Republicans’ greatest fears."

The Missouri Republicans chose to withhold needed medical care from their own most faithful partisans in order to make sure those voters did not receive a benefit for which they might be grateful. They might even come to think kindly of Democrats for saving a life or two here and there. That can never be allowed to happen.

Douthat seems to view the current Republican Party as an evolution of traditional conservatism, but these people are not related to his New England examples of years past. These aren’t small-government enthusiasts; they are no-government enthusiasts. They do not want to take the country back to the glory days of unfettered capitalism; these people have a more Southern heritage and would rather reinstate the eighteenth century.

They seem to yearn to return to a day when wealth could be accumulated with no restrictions on methods used. If slavery was no longer practical, then low-wage serfdom was just fine. Pensions and healthcare? Who needs them? In the good old days people worked until they died, and the minimum wage was what God intended: the wage below which a person was too weak to show up for work.

Let these people have their last stand. When they fail, one hopes they have the decency to fall on their swords and disappear forever.

Tuesday, October 1, 2013

Debtor Imprisonment Persists in the United States

The latest newsletter from the Southern Poverty Law Center (SPLC) contained an article with this headline:

"Sent to debtors’ prison for trash bill, man freed after SPLC intervention"

Debtors’ prison? Weren’t debtors’ prisons ruled illegal and terminated long ago? What is going on?

The article describes the fate of an unemployed Alabama man named Richard Van Horn who could not pay an $88 trash bill. This was deemed a misdemeanor, and Van Horn was scheduled for a court hearing two months in the future. The judge demanded that he put up a $500 bond to guarantee his return for the hearing. When Van Horn announced that he was indigent and could not provide the bond, he was put in jail, destined to remain there for the next two months. Van Horn was denied the legal representation normally provided to the poor because the charge against him could not result in a prison sentence; yet he ended up in prison anyway. The system was determined to house a man in jail for two months in its quest to collect an $88 debt!

When SPLC became involved, the district attorney and judge immediately agreed to release the man without bail.

The article contained this comment:

"’It’s unconstitutional to jail poor people simply because they cannot pay,’ said Sara Zampierin, SPLC staff attorney. ‘We’re happy that Mr. Van Horn has been reunited with his family, but we know that many more people serve jail time because they have no other option’."

Many more people? Just how common is this practice?

An article by Tina Rosenberg in the New York Times sheds light on the issue: Out of Prison, Into a Vicious Circle of Debt.

"....the Brennan Center for Justice studied Mecklenburg County in North Carolina, which in 2009 arrested 564 people for failing to pay their debts and jailed just under half of them for several days before their hearings."

What is the legal justification for this practice?

"Debtor’s prisons were abolished in the United States in the early 1800s and the Supreme Court has ruled it is unconstitutional to jail someone for failing to pay a debt.  Courts get around the ruling by arguing that they are jailing people not for debt but for violating a court order."

Van Horn’s treatment apparently is not uncommon. One has to wonder what perverse incentive would encourage law enforcement to jail people at great public expense to recover such tiny amounts.

The subject of Rosenberg’s article is actually a bigger and even more troubling issue: the growing practice of charging people caught up in our legal system fees to cover expenses involved in the legal process. As she points out, it is possible for an innocent person to be charged with a crime, be acquitted of all charges, and then be forced to pay a fee to cover the state’s expenses. And, as we have seen, those who cannot pay that fee can end up in prison. People actually convicted of a crime can exit prison with no money and large debts to pay. The price for nonpayment of the debt can be a return to prison.

An editorial from 2007 in the New York Times puts the issue in focus.

"States must also end the Dickensian practice of saddling ex-offenders with crushing debt that they can never hope to pay off and that drives many of them right back to prison."

"Often, the lion’s share of the debt is composed of child support obligations that continue to mount while the imprisoned parent is earning no money. The problem does not stop there. The corrections system buries inmates in fines, fees and surcharges that can amount to $10,000 or more. According to the Justice Center study, for example, a person convicted of drunken driving in New York can be charged a restitution fee of $1,000, a probation fee of $1,800 and 11 other fees and charges that range from $20 to nearly $2,200."

Rosenberg illustrates how this system takes people who have served their time for committing a crime and sends them back to jail as debtors.

"These debts would seem to drive more than a few back into the system.  Probation officers are the front line people pushing probationers to pay, and one of their most effective weapons is the threat of arrest.  But this drives probationers into hiding if they don’t have the money.  "They end up going underground, not fulfilling their probation requirements because they can’t fulfill the court fees," said Abrigal Forrester, a program coordinator for StreetSafe Boston, which does gang intervention and other work to help reduce crime in tough neighborhoods.   If you skip your meetings with probation, you are probably going back to prison."

"In most cases, you can’t get off probation until the debt is paid.  Here’s a vicious circle: the longer probation lasts, the more money you owe.  Nor are fees waived for indigence in the vast majority of cases.  The Brennan Center found that of the 15 states with the largest number of prisoners, 13 of them charge fees for using public defenders — charging clients who are indigent by definition."

The legality of these procedures is doubtful, but who is going to worry about the mistreatment of criminals or former criminals?

"....the Supreme Court has ruled it is unconstitutional to jail someone for failing to pay a debt.  Courts get around the ruling by arguing that they are jailing people not for debt but for violating a court order.   Charging people for using a public defender violates the spirit, if not the letter, of Gideon v Wainwright, the 1963 decision requiring state courts to provide counsel for criminal defendants who cannot afford it.  Many states also block ex-offenders from recovering the right to vote until all fees are paid — making the fee effectively a poll tax."

A procedure intended to raise money, but which instead funnels people with small debts into an expensive prison system, seems absurd. One suspects the explanation is the standard one: somewhere, somehow, people are making money at state expense via this process.