Thursday, August 29, 2013

Capitalism and the Jobs Problem

It has become an article of faith that much of our problem with unemployment would be solved if only we had a better-educated work force. This belief leads to a call for sending ever more students off to expensive four-year colleges where they can obtain a bachelor's degree and gain access to better and higher-paying jobs. The position is supported by data consistently showing that college graduates are much less likely to be unemployed than those with less education. Clearly it is better for any individual to pursue a college degree in order to raise the odds of employment success. But does it follow that what is good for a given individual is necessarily good for the country as a whole?

The answer to the above question might be a resounding "yes" if there were, in fact, an unmet demand for college graduates. What does the data tell us? Jordan Weissmann addresses that issue in a note in The Atlantic. He discusses a report by the Federal Reserve Bank of New York which found that, as of 2012, 44% of young college graduates (ages 22-27) classified themselves as underemployed. The term "underemployed" means that they were working in a job for which their college degree was not required. Weissmann also provides this chart, presumably from the same New York Fed report.

[Chart from the New York Fed report: underemployment rates over time for recent college graduates and for all college graduates.]

Weissmann tries to put a good face on this data (lipstick on a pig?), claiming that the current situation is similar to what has occurred in the past, so we are just returning to the norm for a troubled economy.

"As New York Fed President William Dudley noted yesterday, college graduates during the 80s and early 90s were just as likely to be overqualified for their jobs as the young BA's of today. And then, as now, most grads eventually made it into jobs appropriate for their skills. Our memory of the tech boom makes the problems facing contemporary college grads seem particularly painful. But they're not entirely without precedent. And it's not clear they're a product of anything other than a bad jobs cycle."

Through good times and bad, the underemployment rate for all college graduates has been between 30% and 36%. How is that in any way consistent with the claim that "most grads eventually made it into jobs appropriate for their skills"? The only way the chart and Weissmann’s claim can be consistent is if we are pushing through the college experience a lot of people whom employers do not recognize as having college-level skills. Except for some technical areas where years of intense learning are a prerequisite, capable college graduates should be able to move from a general degree area to most others if they can convince employers they have the capability to learn and work.

There are two possible explanations for the underemployment data: we are sending the wrong people to college, and/or we don’t have as many jobs as we think for college graduates.

The Bureau of Labor Statistics (BLS) is tasked with making projections of employment data based on current trends and best-guess extrapolations. It has projected job growth or loss by occupation type for the period 2010-2020, and it provides many useful ways to look at this data. Consider this chart:

[Chart from the BLS: projected 2010-2020 employment growth rates by level of education required.]

Job growth is greatest in the areas where higher education is required, and lowest in areas where a high school diploma or less is adequate. However, the difference in growth rates is not that dramatic. To recognize where the jobs are going to be, one has to look at where they are now. Where the new jobs will actually be found at the end of this decade is more obvious from this chart.

[Chart from the BLS: projected share of new jobs, 2010-2020, by level of education required.]

Work requiring a high school diploma or less will provide 63% of the new jobs, while work requiring a college degree will contribute 24%.

We are currently seeing about 32% of students gaining a bachelor's degree or greater. Would this data, coupled with the data on underemployment, in any way suggest that there is a need to ship 40% or 50% of students through the college mill?
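
A rough way to see what these percentages imply is to combine the two charts: what matters for the job market is not each category's growth rate but the rate multiplied by the category's current size. The sketch below is my own back-of-the-envelope illustration in Python, not BLS output; the base employment figures and growth rates in it are invented placeholders chosen only to roughly reproduce the 63%/24% split quoted above.

    # Back-of-the-envelope: absolute job creation is growth rate times
    # current base, so a small category can grow fastest and still
    # supply few new jobs. All numbers below are invented placeholders.
    sectors = {
        # category: (current jobs in millions, projected 10-year growth)
        "high school diploma or less": (95.0, 0.12),
        "some college or associate degree": (20.0, 0.14),
        "bachelor's degree or higher": (30.0, 0.15),
    }

    new_jobs = {name: base * rate for name, (base, rate) in sectors.items()}
    total = sum(new_jobs.values())
    for name, jobs in sorted(new_jobs.items(), key=lambda kv: -kv[1]):
        print(f"{name:34s} {jobs:4.1f}M new jobs ({jobs / total:.0%})")
    # The degree category has the highest growth rate yet contributes
    # only about a quarter of the new jobs, because its base is smaller;
    # this is the 63% vs. 24% pattern the BLS charts show.

Set against the roughly 32% of students earning degrees, a 24% share of new degree-requiring jobs makes the underemployment numbers above less surprising.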

This country likes to assume that free market approaches will somehow lead to the best solution. We cling to this belief in spite of example after example of markets running amok and causing grief and havoc.

We treat higher education as a marketplace. This gives distinct advantages to those who are wealthy and those who have highly educated parents, a fairly narrow slice of the population. Cognitive skills, by contrast, seem to be more uniformly distributed throughout the population. Our approach thus means that many less-capable people will have access to educational opportunity while many more-capable people are denied it.

Our system is at best wasteful and inefficient. If we do have a need for more skilled people with a higher level of educational attainment, we should take greater pains to match the right people with the appropriate opportunity. The market approach we have been following has not worked. In such a situation, some form of government intervention is necessary. What form that intervention might take is a long-term and complex issue that will have to be addressed elsewhere.

There is a major issue associated with job creation. There is no economic law or principle that suggests a free market economy will provide a job for every person. And there is definitely no way to ensure that jobs will be created at the appropriate skill level to match each individual’s capabilities.

Unfettered capitalism seems best at accumulating wealth in the hands of a few and at converting semi-skilled and low-skilled work into even lower-skilled work. The employment projections suggest a nearly two-tiered economy, with most people dropping into the poorly paid bottom tier.

Capitalism has always been a collaboration between business and government. What our society is lacking at this moment is a means of creating (recreating) the type of semi-skilled and craft jobs that have traditionally been appropriate and satisfying for a large segment of our workers. If business cannot create them it is necessary for government to step in.

Every year we hear a new estimate of the trillion or so dollars' worth of effort by which we have fallen behind in maintaining our current infrastructure. This issue must be addressed eventually if we are to maintain a healthy economy. And whether we seriously try to contain global warming or merely respond to the warming that is now inevitable, a large additional investment in infrastructure will be required. The need for this type of effort is unending.

Infrastructure investment provides work in the crafts and in semi-skilled tasks that supported a middle class in the past. If we, as a society, decide we want jobs, income, and wealth more uniformly distributed, initiating a commitment to providing world-class infrastructure within which to work and live is a place to start.

Sunday, August 25, 2013

Felony Disenfranchisement: Jim Crow Lives On

People who commit crimes deserve punishment, but after submitting to punishment, shouldn’t they be encouraged to return to being full members of society? Instead, many states place severe restrictions on felons; as a result, the number of disenfranchised citizens is growing and is affecting voting patterns and election results. What might be the justification for depriving ex-felons of the right to vote? Jamelle Bouie addresses these issues in an important article in The American Prospect: The Ex-Con Factor.

Bouie provides these data from 2010: 5.85 million US citizens (1 in 40 adults) are currently disenfranchised; 2.2 million of those are black, including 1.4 million males.

The Constitution gives states the right to define restrictions on voting for those who have committed a crime. The result is an array of different approaches that vary from lenient to quite severe. Bouie describes a number of these state restrictions. Rather than delving into details, let’s focus on effects.

Florida has the largest population of the disenfranchised. Bouie provides these numbers from 2010: 10.4% of the voting population is disenfranchised, including 23.3% of the black voting population. Can anyone doubt that these numbers are large enough to affect results in such a narrowly divided state? Would George W. Bush have become president without these voting restrictions in place? Do Florida politicians look on these numbers with dismay—or with smug satisfaction?

Bouie provides this insight:

"Disenfranchised felons are concentrated largely in the South. Which means that if you’re a voting-age American who’s forbidden to vote, odds are good that you’re living in the former Confederacy. You’re probably black, probably a man, and likely went to prison for a nonviolent offense like drug possession or theft. Racism didn’t put you in jail, necessarily, but the legacy of discrimination put you in the class of people who are more likely to be arrested, more likely to be convicted, more likely to be sentenced—and more likely to have basic rights of citizenship taken away."

The disproportionate effect on black voters was part of a plan initiated after the end of Reconstruction.

"Felony disenfranchisement laws trace back to the post-Reconstruction era when former Confederates and white Southern Democrats rolled back the political gains made by free slaves after the war. The whole point of these laws was the mass exclusion of black men from mainstream civic life. It still is."

"They were written as race-neutral but were racist in their effects, as Middle Tennessee State University history professor Pippa Holloway documents in her book Living in Infamy: Felon Disenfranchisement and the History of American Citizenship. In just the period between 1874 and 1882, every Southern state but Texas found ways to disenfranchise those convicted of minor crimes like petty theft. ‘Some Southern states changed their laws to upgrade misdemeanor property crimes to felonies,’ Holloway explains, ‘and finally, Southern courts interpreted existing laws to include misdemeanors as disenfranchising crimes’."

If one should remain dubious of the intent of these laws, Bouie provides this relevant quote:

"Lawmakers made no secret of their motivations. During the Virginia Constitutional Convention of 1901, Delegate Carter Glass—who would later become Senator Glass, co-sponsor of the Glass-Steagall Act—praised felon disenfranchisement as a plan to ‘eliminate the darkey as a political factor in this state in less than five years’."

Bouie refers to the current versions of these laws as "this last vestige of Jim Crow." But what if more of Jim Crow is still around, traveling under different guises? That is what Michelle Alexander believes. She makes her argument in The New Jim Crow: Mass Incarceration in the Age of Colorblindness. Consider this quote from her book:

"....the seeds of the new system of control—mass incarceration—were planted during the civil rights movement itself, when it became clear that the old caste system was crumbling and a new one would have to take its place."

Under traditional Jim Crow it was simple to charge southern blacks with some sort of crime if one wished to, but under a New Jim Crow one must be more circumspect.

"A new race-neutral language was developed for appealing to old racist sentiments, a language accompanied by a political movement that succeeded in putting the vast majority of backs back in their place. Proponents of racial hierarchy found they could install a new racial caste system without violating the law or the new limits of acceptable political discourse, by demanding ‘law and order’ rather than ‘segregation forever’."

The machinery for converting blacks into second-class citizens was to be the War on Drugs.

"More than 2 million people found themselves behind bars at the turn of the twenty-first century, and millions more were relegated to the margins of mainstream society, banished to a political and social space not unlike Jim Crow, where discrimination in employment, housing, and access to education was perfectly legal, and where they could be denied the right to vote....The New Jim Crow was born."

Alexander points out that felons have more rights taken away than just the right to vote.

"It is not necessary to serve time in prison in order to be labeled forever a ‘felon,’ it is the conviction that counts. Many who plead guilty to a felony charge of marijuana possession are unaware of what that admission will cost them. This quote from the American Bar Association describes the fate awaiting these individuals."

‘[The] offender may be sentenced to a term of probation, community service, and court costs. Unbeknownst to this offender, and perhaps any other actor in the sentencing process, as a result of his conviction he may be ineligible for many federally funded health and welfare benefits, food stamps, public housing, and federal education assistance. His driver’s license may be automatically suspended, and he may no longer qualify for certain employment and professional licenses....He will not be permitted to enlist in the military, or possess a firearm, or obtain a federal security clearance. If a citizen, he may lose the right to vote....’

Alexander implies that there was an active, plotting intelligence behind it all: one that created the War on Drugs, produced legislation imposing incredibly harsh penalties on even recreational drug users, financially encouraged local law enforcement agencies to arrest and incarcerate as many people as possible, obtained from the Supreme Court permission to preferentially arrest blacks, and then passed laws and instituted policies guaranteeing second-class status for life for anyone caught up in this system.

When I first read Alexander’s book, I agreed that the net effect of what had transpired was as she described, but I had trouble with the concept of assigning a hidden, guiding hand to its implementation. However, the more I have learned about the manufactured hysteria connecting drugs and crime, drug legislation, voting rights restrictions, the Supreme Court, and our criminal justice system, the more I have begun to wonder whether I was the one who was wrong.

I continue to wonder. We all should be wondering.

Friday, August 23, 2013

The Great Migration and Northern Segregation

I learned much from reading Isabel Wilkerson’s magnificent rendering of the twentieth-century migration of southern blacks to northern and western cities: The Warmth of Other Suns: The Epic Story of America's Great Migration. Organizing all the material she presents into a few pages of discussion was becoming a difficult task. Then I happened upon a note about a court case in a document published by The Southern Poverty Law Center, an organization that tries to keep track of hate groups and other extremists and alert us to their activities. This report caught my eye:

"Brian Moudry, 36, who claims to have been the Illinois leader of the neo-Nazi "World Church of the Creator" (since renamed "The Creativity Movement"), pleaded guilty to burning down the home of an African-American family who moved into his Joliet, Ill. neighborhood. Nine people were asleep in the home when Moudry, whose body is covered in racist tattoos, torched it in June, 2007."

This incident, coming more than a half century after a similar incident in another town near Chicago that gained worldwide attention, suggests that the type of segregation blacks encountered as they moved into northern and western urban areas is not an old story; rather, it is an ongoing story. One cannot understand where we are today unless one understands how we got here. That history is part of what Wilkerson has provided.

The Great Migration extended roughly from around World War I to the 1970s. Wilkerson illustrates the history of the black migrants by recounting the lives of three representative individuals who left at different times from different locations and settled in different places. Ida Mae Brandon Gladney, one of the three, left Mississippi with her children and husband in the 1930s and settled in Chicago. When she arrived, the city was already organized into black and white sections and the black sections were already overcrowded. And there were many more yet to come.

"By the time it was over, no northern or western city would be the same. In Chicago alone, the black population rocketed from 44,103 (just under three percent of the population) at the start of the Migration to more than one million at the end of it. By the turn of the twenty-first century, blacks made up a third of the city’s residents, with more blacks living in Chicago than in the entire state of Mississippi."

What Gladney encountered in Chicago was typical of what was occurring in other urban centers: a type of segregation that was, in some ways, more extreme than what she had encountered in the South.

"By the time the Migration reached its conclusion, sociologists would have a name for that kind of hard-core racial division. They would call it hypersegregation, a kind of separation of the races that was so total and complete that blacks and whites rarely intersected outside of work."

"They were confined to a little isthmus on the South Side of Chicago that came to be called ‘Bronzeville,’ the ‘black belt,’ ‘North Mississippi.’ It was a ‘narrow tongue of land, seven miles in length and one and one-half miles in width,’ as the midcentury historians St. Claire Drake and Horace Clayton described it, where a quarter million colored people were packed on top of one another by the time Ida Mae and her family arrived."

Since blacks were forced to live within this confined area, they had little recourse but to pay exorbitant rents to absentee landlords. Edith Abbott of the University of Chicago is referenced.

"’The rents in the South Side Negro district were conspicuously the highest of all districts visited,’ Abbott wrote. Dwellings that went for eight to twenty dollars a month to white families were bringing twelve to forty-five dollars a month from black families, those earning the least income and thus the least able to afford a flat at any rent, in the early stages of the migration."

"The story played out in virtually every northern city—migrants sealed off in overcrowded colonies that would become the foundation for ghettos that would persist into the next century. These were the original colored quarters—the abandoned and identifiable no-man’s lands that came into being when the least-paid people were forced to pay the highest rents for the most dilapidated housing owned by absentee landlords trying to wring the most money out of a place nobody cared about."

Chicago was a city of immigrants. Many of the whites preceded the blacks and had economic stakes that were threatened by the new arrivals. Employers took advantage of needy blacks, using them as strikebreakers and warning white workers that their jobs could be lost to blacks willing to work for less pay. This did nothing to benefit race relations. Many of the white immigrants came from regions where they had little or no exposure to black people. Their attitude toward blacks was difficult to describe as anything other than hatred.

"The color line in Chicago confined them to a sliver of the least desirable blocks between the Jewish lakefront neighborhoods to the east and the Irish strongholds to the west, while the Poles, Russians, Italians. Lithuanians, Czechs and Serbs, who had only recently arrived themselves, were planting themselves to the southwest of the colored district."

What is remembered today about our larger cities are the more recent riots in which blacks rampaged, often through their own neighborhoods. Over the longer view of history, however, riots tended to be acts of angry whites.

"....from the Draft Riots of the 1860s to the violence over desegregation a century later—riots were often carried out by disaffected whites against groups perceived as threats to their survival. Thus riots would become to the North what lynchings were to the South, each a display of uncontained rage by put-upon people directed toward the scapegoats of their condition."

Harvey Clark and his wife were college-educated blacks who were living, as a family of five, in half of a two-room apartment and paying $56 a month in rent. They found a modern five-room apartment that they could rent for $60 a month, only four dollars more, and decided to move. The new apartment was located in Cicero, just across the Chicago line. It was 1951.

Cicero was a working-class town inhabited mostly by first- and second-generation white immigrants. When the Clarks first attempted to move in, they were prevented from doing so by the Cicero police and told to go away and not come back. With legal permission in hand, they returned to try again. This time they were met by a group of women who had gathered to heckle them as they moved their furniture. Over the course of the day the crowd grew larger, and the Clarks were forced to flee as it got out of control.

A mob entered the third-floor apartment, threw out a window all the possessions small enough to toss, and smashed everything else. The Clarks’ belongings were then burned in a pile on the grass as the crowd cheered.

"The next day, a full-out riot was under way. The mob grew to four thousand by early evening....They hurled rocks and bricks. They looted. Then they firebombed the whole building. The bombing gutted the twenty-unit building and forced even the white tenants out. The rioters overturned police cars and threw stones at the firefighters who were trying to put out the blaze."

"Illinois Governor Adlai Stevenson had to call in the National Guard....It took four hours for more than six hundred guardsman, police officers, and sheriff’s deputies to beat back the mob that night and three more days for the rioting over the Clarks to subside."

"The Cicero riot attracted worldwide attention. It was front-page news in Southeast Asia, made it into the Pakistan Observer, and was remarked upon in West Africa."

Adlai Stevenson was moved to refer to being constrained within segregated housing as being trapped within an "iron curtain."

Racial hatred was not the only motivation for keeping blacks out of an all-white neighborhood. There was also the very real fear that the homes in the area would lose economic value. The accepted truth was that as soon as a single black moved in, property values would plummet. Wilkerson points out that the situation was actually more complicated.

"Contrary to conventional wisdom, the decline in property values and neighborhood prestige was a by-product of the fear and tension itself, sociologists found. The decline often began, they noted, in barely perceptible ways, before the first colored buyer moved in."

"The instability of a white neighborhood under pressure from the very possibility of integration put the neighborhood into a kind of real estate purgatory. It set off a downward cycle of anticipation, in which worried whites no longer bought homes in white neighborhoods that might one day attract colored residents even if none lived there at the time. Rents and purchase prices were dropped ‘in a futile attempt to attract white residents,’....With prices falling and the neighborhood’s future uncertain, lenders refused to grant mortgages or made them more difficult to obtain. Panicked whites sold at low prices to salvage what equity they had left, giving the homeowners who remained little incentive to invest any further to keep up or improve their properties."

Note that this death spiral required no action on the part of a black person to occur. The threat of blacks moving in, whether real or contrived by real estate speculators, allowed speculators to buy up property cheaply from whites and resell it at a premium to blacks willing to pay above-market prices to get into the neighborhood.

As the number of blacks grew, the black footprint had to grow also. Sometimes the expansion was met with violence or threats of violence, sometimes not. But it was always met with white flight.

In 1968 Gladney and her extended family bought a house in a formerly all-white area called South Park.

"The turnover was sudden and complete and so destabilizing that it even extended to the stores on Seventy-fifth Street, to the neighborhood schools and to the street sweeping and police patrols that could have kept up the quality of life. It was as if the city lost interest when the white people left."

It took only a few years for the neighborhood school to go from essentially all white to essentially all black.

Chicago would go on to earn—and deserve—the title of the most segregated city in America. However, it was only one of many that struggled to keep their blacks behind an "iron curtain."

Is progress being made? Perhaps, but it often seems to be imperceptibly slow.

"In 2000, the U.S. Census found that, of Cicero’s population of 85,616, just one percent of the residents were black, nearly half a century after the riots that kept the Clarks from moving in."

Interestingly, there was one location that integrated with little difficulty: Hyde Park. Hyde Park was an oasis in a sea of black neighborhoods. As the home to the University of Chicago, the residents likely had little in common with those of Cicero. It seems that it was such a desirable location that the white residents could make no sense out of leaving. Where could they go that was comparable? It also helped that it was a very expensive place to buy into. Anyone who could afford to move in was probably "good people" no matter the color of their skin. Hyde Park would become home to a black man who would one day become quite famous.

Ida Mae Brandon Gladney attended a neighborhood meeting in 1997 where the featured speaker was her Illinois State Senator, a young black man named Barack Obama. She and the others listened politely, asked a few questions, and then he was gone.

Ida Mae Brandon Gladney died in 2004. One has to wonder at what she might have thought if she were told that a black man—from Chicago no less—would soon be President of the United States.

Monday, August 19, 2013

Challenging Everything You Know About Drugs and Society

Carl Hart is a professor in the departments of Psychology and Psychiatry at Columbia University. His research specialty has been illegal drugs, their effects on the human body, and the implications for society. He is co-author of a textbook on these subjects. 

Hart has also written a book for a general readership: High Price: A Neuroscientist’s Journey of Self-Discovery That Challenges Everything You Know About Drugs and Society. He provides this description of his motivation.

"The primary reason I wrote this book was to show the public how the emotional hysteria that stems from misinformation related to illegal drugs obfuscates the real problems faced by marginalized people. This also contributes to gross misuses of limited public resources."

Hart is also a black man who came to adulthood about the time of the surge in usage of crack cocaine. He grew up in a segregated area of Miami, the type of location that became ground zero in the "War on Drugs." Hart had left to join the Air Force before this surge actually hit, but he was able to observe the effects of cheap cocaine on his family, friends, and neighborhood. He could have written a book that focused solely on drug and drug effects on humans, but feared that such an approach could not deal adequately with the importance of a person’s individual circumstances in determining the difference between no drug use, recreational drug use, and drug addiction.

Hart tells his story about drugs and society from the perspective of his own life and experiences.

"To truly understand where I came from, you have to understand where I wound up—and how mistaken ideas about drugs, addiction, and race distort the way we see lives like mine and therefore, how society addresses these questions."

Hart’s autobiography provides valuable insights into what it is like to be a young black man in our society. He makes clear the historical tendency to link drug use and race, and to use that link to argue for strict drug laws and tough mandatory sentences, even for mere possession. The result has been a situation in which illegal drug use and sales are roughly equally common across the races, but arrests and incarceration fall far more heavily on blacks and Hispanics. This is too important a topic to treat casually. Consequently, the focus here will be on drugs and physical response, with racial issues mostly left for a later date.

Hart focuses on cocaine. At some point drug producers concluded that marijuana was too bulky and not sufficiently profitable and switched to cocaine as their drug of choice. The characteristics of cocaine are summarized in Wikipedia.

"Cocaine increases alertness, feelings of well-being and euphoria, energy and motor activity, feelings of competence and sexuality. Athletic performance may be enhanced in sports where sustained attention and endurance is required. Anxiety, paranoia and restlessness can also occur, especially during the comedown. With excessive dosage, tremors, convulsions and increased body temperature are observed."

"Occasional cocaine use does not typically lead to severe or even minor physical or social problems."

Much the same could be said of alcohol use and cigarette smoking. Why, then, has cocaine long been viewed as a dangerous drug that must be declared illegal? Hart references a study by David Musto, The American Disease: Origins of Narcotic Control, to arrive at this observation.

"....between 1898 and 1914 numerous articles appeared in the scientific literature and popular press exaggerating the association of heinous crimes and cocaine use by blacks.....As Musto has detailed, ‘experts’ testified before Congress that ‘most of the attacks upon white women of the South are the direct result of a cocaine-crazed Negro brain.’ As a result, it was not difficult to get passage of the Harrison Narcotics Tax Act of 1914, which effectively prohibited the drug."

It was understandable at the time that stories like this would emanate from the South, where it was important to maintain the myth that blacks were inferior to whites in as many ways as possible. It is disturbing to realize that so many historians and others of influence in the North bought into this myth and propagated it.

Does the use of the drug actually produce "cocaine-crazed" brains? First some facts about the drug and its usage are required. For those who are not already conversant with the details:

"Powder cocaine is chemically known as cocaine hydrochloride....This form of cocaine can be eaten, snorted, or dissolved in water and injected. Cocaine hydrochloride cannot be smoked, however, because it decomposes under the heat required to vaporize it. Smoking requires chemically removing the hydrochloride portion, which does not contribute to cocaine’s effects anyway. The resulting compound is just the cocaine base (aka freebase or crack cocaine), which is smokable. The important point here is that powder and crack cocaine are qualitatively the same drug."

What is important to a drug user is that the effect be delivered reliably and quickly. Ingesting the drug is a slow and unreliable process. Snorting powder delivers an effect in a few minutes. The fastest paths to a high are getting the drug into the bloodstream via injection or inhaling vapor into the lungs. Smoking also allows for an efficient high while requiring only a small amount of the drug. The discovery of a simple and safe way to mass-produce crack cocaine meant that the drug could be sold in small amounts and at low prices.

Powder continued to be the drug of wealthier whites, while crack provided a high cheap enough to become popular in the black community. This upsurge in usage led to an upsurge in scare stories about crime and family dissolution, similar to the "cocaine-crazed Negro brain" tales of an earlier era. The War on Drugs apparatus was already in place; it pounced on this phenomenon and declared "black cocaine" to be more dangerous than "white cocaine." Much more severe penalties were attached to crack than to powder cocaine. This allowed, and even encouraged, law enforcement to focus on arresting and incarcerating blacks even though usage was just as common among whites. And that is exactly what happened.

Strict laws against the possession, usage, and sale of a drug seem reasonable if the drug is actually highly destructive in leading to addiction, criminal behavior, and social disruption. The point Hart makes is that this condition does not exist for cocaine—nor for many other drugs we criminalize.

A definition of addiction is needed. Hart uses the one described by psychiatrists in their catalogue of mental disorders.

"....a person’s drug use must interfere with important life functions like parenting, work, and intimate relationships. The use must continue despite ongoing negative consequences, take up a great deal of time and mental energy, and persist in the face of repeated attempts to stop and cut back."

Given this definition and actual data on drug dependence, such behavior is relatively rare.

"But more than 75 percent of drug users—whether they use alcohol, prescription medications, or illegal drugs—do not have this problem. Indeed, research shows repeatedly that such issues affect only 10-25 percent of those who try even the most stigmatized drugs, like heroin and crack."

Even the person who becomes a regular user of the drug retains the ability to choose whether to take it, depending on the circumstances. The image of the addict driven mad with desire for his drug simply does not match reality. Hart suggests that the desire for the drug is more closely analogous to the desire humans feel for sex and food, both cravings that are difficult but possible to control.

Destructive use of a drug does not spring from an uncontrollable physical craving, although the craving is a factor.

Further enlightenment comes from animal studies. Hart describes an experiment in which a collection of rats was divided: one group lived in a shared space (referred to as Rat Park) where they could interact and socialize in whatever manner rats do; the others were placed in isolation in individual cages. All the rats had access to water doped with morphine at a level that should have produced a physical response.

"They found that while the isolated rats quickly took to morphine drinking, the Rat Pak rats did not. Indeed, even when the morphine solution was so sweet as to be overwhelmingly attractive to rats, the Rat Pak residents still drank much less of it than the solitary animals did. Under some circumstances, the isolated rats would drink twenty times more morphine than their social-living compatriots."

Similar results have been obtained with rats using cocaine and amphetamines rather than morphine. The conclusion seems to be that when natural rewards are available, they are generally preferred to the artificial, drug-induced reward.

"When natural rewards, such as social and sexual contact and pleasant living conditions—also known as alternative reinforcers—are available to healthy animals, they are typically preferred. There is now a plethora of evidence collected in animals and humans showing that the availability of non-drug alternative reinforcers decreases drug use across a range of conditions."

Humans and rats have needs. If those needs are not being met in their existing environment, both are more likely to turn to drugs as a source of satisfaction.

"....when people have appealing alternatives, they usually don’t choose to take drugs in a self-destructive fashion. But it does show that in the absence of social support or other meaningful rewards, cocaine can be very attractive indeed. The bottom line is that we have been repeatedly told that drugs like crack cocaine are so attractive that users will forego everything for them. Nonetheless, overwhelming empirical evidence indicates that this is simply not true."

The importance of social context for drug use cannot be overemphasized. People who live in an environment with social support and other alternative reinforcers are less likely to use drugs in a destructive fashion than those whose environment lacks such reinforcers. One would then expect addiction to be higher in lower-income neighborhoods than in middle-class or wealthier environments.

"....despite years of media-hyped predictions that crack’s expansion across classes was imminent, it never ‘ravaged’ the suburbs or took down significant percentages of middle- or upper-class youth. Though the real proportion who became addicted to crack in the inner city was low, it was definitely higher than it was among the middle classes, just as is true for other addictions, including alcohol."

Given the data Hart has assembled, the path we are following seems to make little sense. Recreational use of drugs seems to be as old as recorded human history. If recreational use is a viable lifestyle choice, why criminalize it? Why not leave controlled users alone and provide support for those in need of help? It is a lot cheaper to provide counseling to a few than to imprison many.

Hart recommends the approach taken by Portugal. In 2001 Portugal decriminalized all illegal drugs. Possession of drugs for recreational use is no longer a crime. Those found in possession are cited and required to appear before a board consisting of a social worker, a medical professional, and a mental health professional. If the person is deemed not to have a health problem due to drug use, they are fined and sent on their way. If a health issue exists, the person is referred to appropriate caregivers. Even then, treatment is not mandatory. Repeat offenders can be issued greater penalties, but are still not considered criminals.

"How has decriminalization been working out for the people of Portugal? Overall they have increased spending on prevention and treatment, and decreased spending for criminal prosecution and imprisonment. The number of drug-induced deaths has dropped, as have overall rates of drug use, especially among young people (15-24 years old)."

Compare that to our system with its severe penalties and immense prisoner population.

"A 3,500 percent increase in spending to fight drugs between 1970 and 2011 had no effect on daily use of marijuana, heroin, or any type of cocaine. And while crack has been seen as a largely black problem, whites are actually more likely to use the drug, according to national statistics."

Wednesday, August 14, 2013

Designer Drugs: An Attempt at Legalization

Everywhere, it seems, countries grow weary of the attempt to prohibit people from getting high. Given that this desire appears to be a universal trait of humanity, and that the practice of inducing such states predates the existence of nations, one might have predicted this outcome.

An article in The Economist describes one country with a unique drug problem that is making an attempt at a unique solution: New Zealand.

New Zealand is a small country in a remote location. Apparently the mass importation of conventional illegal drugs has proved not to be cost-effective for the traditional brand of smuggler. Since such drugs are scarce, people have had to turn to synthetic drugs, generally referred to as designer drugs, for their highs. The problem with stopping this practice is that clever chemists can create new drugs faster than the drugs can be detected and declared illegal.

"An unlikely leader in legal highs is New Zealand. Conventional hard drugs are scarce in the country, because traffickers have little interest in serving 4m people far out in the South Pacific. Kiwis therefore make their own synthetic drugs, which they take in greater quantity than virtually anyone else. The government shuts down more crystal-meth labs there than anywhere bar America and Ukraine. But the business has adapted. First it turned to benzylpiperazine, which a third of young New Zealanders have tried. When that was banned in 2008, dealers found plenty of other chemicals to peddle. Today the most popular highs are synthetic cannabinoids, which pack a harder punch than ordinary cannabis."

The scale of the problem—worldwide—is captured by this bit of data.

"In June the UN reported more than 250 such drugs in circulation."

The government of New Zealand has decided to forgo this endless process and create a path whereby drugs can be tested for their risk to consumers and legally sold, provided they are not found to be "too dangerous."

"Last month a law was passed which offers drug designers the chance of getting official approval for their products. If they can persuade a new ‘Psychoactive Substances Regulatory Authority’ that their pills and powders are low risk, they will be licensed to market them, whether or not they get people high. Drugs will have to undergo clinical trials, which the government expects to take around 18 months—much less than for medicines, because the drugs will be tested only for toxicity, not for efficacy. Drugs that are already banned internationally, such as cocaine and cannabis, are ineligible. Only licensed shops will sell the drugs, without advertising and not to children."

The policy of criminalizing the use of drugs that might become addictive seems to have been based on invalid assumptions about the probability of addiction. Carl Hart tries to inject scientific results into discussions that have traditionally been more political in nature. Hart is a professor at Columbia University whose specialty is the study of conventional illegal drugs and their effects on humans. He argues that using drugs like cocaine for recreational purposes is a sustainable path, although not necessarily a desirable one, for most people. Addiction rates are about 10-20% for such drugs, and the causes of addiction are complex and cannot be attributed simply to an uncontrollable physical craving.

Given Hart’s conclusions, the total prohibition of such drugs and the criminalization of possession and usage make little sense.

The officials of New Zealand have a daunting task ahead of them. Mind-altering drugs that cause a high will inevitably be abused by some fraction of the population, and dealing with the problems that arise must be included in whatever process is put in place. To make the process acceptable to the recreational user, the legalized drugs must provide an "adequate" high, whatever that might be. Consumers must be willing to pay the cost and the imposed tax, and to make their purchases in an open manner. And since drug users seem to come in all ages, how does one define "children" in this context?

One can think of many reasons why this path will not work. Nevertheless, given any chance at all that it will bring greater safety and minimize "crime," it should be tried.

Who knows? Perhaps the pharmaceutical industry will see this as a new market and compete to make the best and safest high.

As the author of the referenced article wisely points out, the decisions New Zealand is trying to make are already being made by the drug dealers. This is perhaps one of those few occasions where everyone would be more comfortable with decisions being placed in the hands of government regulators.

Carl Hart is the author of High Price: A Neuroscientist’s Journey of Self-Discovery That Challenges Everything You Know About Drugs and Society.

Monday, August 12, 2013

The War on Drugs, Civil Forfeiture, and Highway Robbery

Sarah Stillman has provided a revealing—and startling—look at the process known as "civil forfeiture." Her article appeared in The New Yorker under the title Taken. Her lede explains why this is such an important topic.

"Under civil forfeiture, Americans who haven’t been charged with wrongdoing can be stripped of their cash, cars, and even homes."

While we are told that the Constitution protects us from "unreasonable searches and seizures," and that our property cannot be taken from us without "due process," the reality is much less reassuring.

In the 1970s, law enforcement officials began trying new methods to counter organized crime associated with the illegal drug trade. The concern was that those most responsible were also those most difficult to bring to justice through normal criminal justice channels. The intent was to punish, by confiscation of property, those who could not be punished by criminal statute. The definition of property associated with criminal activity was quite broad. One should recognize that this moved law enforcement into murky territory.

"Forfeiture in its modern form began with federal statutes enacted in the nineteen-seventies and aimed not at waitresses and janitors but at organized-crime bosses and drug lords. Law-enforcement officers were empowered to seize money and goods tied to the production of illegal drugs. Later amendments allowed the seizure of anything thought to have been purchased with tainted funds, whether or not it was connected to the commission of a crime."

Such forfeitures were comparatively rare until the passage of the Comprehensive Crime Control Act in 1984 and the establishment of a program called Equitable Sharing. The goal seemed to be the enhancement of law enforcement activities without having to pay for the increase. The federal government was providing local law enforcement agencies a way to earn money by arresting people and confiscating their property. The more people arrested, the more money earned.

"It established a special fund that turned over proceeds from forfeitures to the law-enforcement agencies responsible for them. Local police who provided federal assistance were rewarded with a large percentage of the proceeds, through a program called Equitable Sharing. Soon states were crafting their own forfeiture laws."

"Revenue gains were staggering. At the Justice Department, proceeds from forfeiture soared from twenty-seven million dollars in 1985 to five hundred and fifty-six million in 1993. (Last year, the department took in nearly $4.2 billion in forfeitures, a record.)"

Drug arrests and property forfeitures soared as police officers, law enforcement agencies, and even communities began to depend on this source of income. Not surprisingly, there were abuses of the privilege.

"’Unfortunately, I think I can say that our civil-asset-forfeiture laws are being used in terribly unjust ways,’ Henry Hyde, the Republican chairman of the House Judiciary Committee, declared in 1997, ‘and are depriving innocent citizens of their property with nothing that can be called due process’."

To address this issue, Congress passed the Civil Asset Forfeiture Reform Act in 2000 in an attempt to limit the extent of forfeiture actions and to provide more legal rights to those whose property had been taken.

"....requiring that federal prosecutors prove "a substantial connection between the property and the offense," and allowing people who can prove themselves "innocent owners" to keep their property."

The way the process works is that the government sues the property itself, not the owner of the property. If a court approves the action, the owner must then sue to overturn the ruling. Not many people can afford the cost of pursuing justice in such an instance.

The courts have given law enforcers broad latitude in determining what property is liable to forfeiture. In 2006, in United States v. $124,700 in U.S. Currency, an appeals court overruled a lower court and held that a Latino driver stopped with that sum of money in his car presented circumstances suspicious enough that the police could confiscate the funds as drug-related. No evidence of a crime existed, and none was needed.

The laws, both state and federal, are vague and can be interpreted as needed. The financial incentives continue to exist, and the need for revenue from forfeiture has become more acute as the economic situation has limited other sources of revenue.

"But civil-forfeiture statutes continued to proliferate, and at the state and local level controls have often been lax. Many states, facing fiscal crises, have expanded the reach of their forfeiture statutes, and made it easier for law enforcement to use the revenue however they see fit. In some Texas counties, nearly forty per cent of police budgets comes from forfeiture. (Only one state, North Carolina, bans the practice, requiring a criminal conviction before a person’s property can be seized.)"

Stillman provides several examples of what civil forfeiture has meant in practice. Two of these will serve to illustrate the seriousness of the issue.

Jennifer Boatright, Ron Henderson, and their two children were driving from Houston to the town of Linden. They planned on buying a used car in Linden and were carrying cash to cover the purchase price. On the way they passed through the town of Tenaha and were pulled over by a police car. They were told that they had been stopped because they had been driving in the left lane for more than a mile without passing. The police then asked if they could search the car for drugs. This is what ensued.

"The officers found the couple’s cash and a marbled-glass pipe that Boatright said was a gift for her sister-in-law, and escorted them across town to the police station. In a corner there, two tables were heaped with jewelry, DVD players, cell phones, and the like. According to the police report, Boatright and Henderson fit the profile of drug couriers: they were driving from Houston, ‘a known point for distribution of illegal narcotics,’ to Linden, ‘a known place to receive illegal narcotics.’ The report describes their children as possible decoys, meant to distract police as the couple breezed down the road, smoking marijuana. (None was found in the car...."

After an hour's wait, the county district attorney, Lynda K. Russell, showed up and made them an offer they could not refuse.

"They could face felony charges for "money laundering" and "child endangerment," in which case they would go to jail and their children would be handed over to foster care. Or they could sign over their cash to the city of Tenaha, and get back on the road."

They were told that the police thought they were criminals, but if money was paid, they could be set free. Was there ever a more obvious example of a shakedown?

Lest one think that this is a possibility only in remote rural areas of Texas, consider the case of Mary and Leon Adams, a black couple in their eighties living in their home in West Philadelphia. Also living with them was an adult son, Leon Jr.

Leon Jr. was said to be involved as the seller in a few $20 marijuana transactions with a police informant. The police sent a SWAT team to break down the door (without warning) and arrest the son. A month later:

"....Mary and Leon Adams were finishing breakfast when several vans filled with heavily armed police pulled up to their red brick home. An officer announced, ‘We’ll give you ten minutes to get your things and vacate the property.’ The men surrounding their home had been authorized to enter, seize, and seal the premises, without any prior notice."

There was this justification for the abrupt action.

"....the state was now seeking to take the Adamses’ home and to sell it at a biannual city auction, with the proceeds split between the district attorney’s office and the police department. All of this could occur even if Leon, Jr., was acquitted in criminal court; in fact, the process could be completed even before he stood trial."

Stillman discovered that about 100 properties in the city are seized and auctioned off each year.

The Adamses did receive a bit of a reprieve.

"....when an officer observed Leon’s frail condition, he told them that they could stay in the house while the forfeiture proceedings advanced. This gave them some time to figure out how to fight."

"’We had no money,’ Mary told me, so they couldn’t hire a lawyer. But they learned of a free "Civil Practice" clinic at the University of Pennsylvania Law School, run by Louis Rulli, where students help indigent homeowners challenge civil-forfeiture claims."

In Stillman's conversations with Rulli, he contended that home forfeitures were common and that they were mostly imposed on defenseless minorities.

"The public records I reviewed support Rulli’s assertion that homes in Philadelphia are routinely seized for unproved minor drug crimes, often involving children or grandchildren who don’t own the home. ‘For real-estate forfeitures, it’s overwhelmingly African-Americans and Hispanics,’ Rulli told me. ‘It has a very disparate race and class impact’."

To further make his point, Rulli offered this counterexample.

"He went on to talk about Andy Reid, the former coach of the Philadelphia Eagles, whose two sons were convicted of drug crimes in 2007 while living at the family’s suburban mansion in Villanova. ‘Do you know what the headline read? It said, "The Home Was an "Emporium of Drugs." An emporium of drugs!’ The phrase, Rulli explained, came directly from a local judge. ‘And here’s the question: Do you think they seized it?’"

The Adamses are still in limbo waiting for their case to be resolved, but they at least have legal representation.

Boatright and Henderson fought back and, with numerous other victims, became part of a successful class-action suit. They received no reimbursement for their loss, but did have the satisfaction of seeing Tenaha ordered to stop robbing drivers as they passed through. The Texas state legislature was sufficiently embarrassed that it placed some minor restrictions on what law enforcement agencies could do with seized property.

Stillman finds that a few significant patterns have emerged from her research. She has concluded that it is the financial incentives that drive abuse of the system. In states where there is no local control over seized assets, there is little evident abuse. Where there is financial gain to be had, the temptation is too great to resist.

Another conclusion is that it is not wealthy and powerful criminals who have the most to fear. Rather, it is minorities and others who appear too poor or too frightened to cause any trouble.

"Yet only a small portion of state and local forfeiture cases target powerful entities. ‘There’s this myth that they’re cracking down on drug cartels and kingpins,’ Lee McGrath, of the Institute for Justice, who recently co-wrote a paper on Georgia’s aggressive use of forfeiture, says. ‘In reality, it’s small amounts, where people aren’t entitled to a public defender, and can’t afford a lawyer, and the only rational response is to walk away from your property, because of the infeasibility of getting your money back.’ In 2011, he reports, fifty-eight local, county, and statewide police forces in Georgia brought in $2.76 million in forfeitures; more than half the items taken were worth less than six hundred and fifty dollars. With minimal oversight, police can then spend nearly all those proceeds, often without reporting where the money has gone."

The focus here has been on drug-related incidents, but forfeiture laws are much broader than that.

"Hundreds of state and federal laws authorize forfeiture for cockfighting, drag racing, basement gambling, endangered-fish poaching, securities fraud, and countless other misdeeds."

And remember—one needn’t be guilty—only worthy of "suspicion," or associated with a person worthy of "suspicion."

Thursday, August 8, 2013

Can We Trust Clinical Trials? Size Matters—So Does Integrity

We will discuss two articles that call into question the accuracy of the current approach to determining the efficacy of drugs: clinical testing. The first has the eye-catching title Do Clinical Trials Work? It was written by Clifton Leaf and appeared in The New York Times. The second has an even more provocative title: Lies, Damned Lies, and Medical Science. It was provided by David H. Freedman in The Atlantic.

Let’s begin with Leaf’s article.

Leaf describes a study of the drug Avastin for use with patients having a form of brain cancer.

"Mark R. Gilbert, a professor of neuro-oncology at the University of Texas M. D. Anderson Cancer Center in Houston, presented the results of a clinical trial testing the drug Avastin in patients newly diagnosed with glioblastoma multiforme, an aggressive brain cancer. In two earlier, smaller studies of patients with recurrent brain cancers, tumors shrank and the disease seemed to stall for several months when patients were given the drug, an antibody that targets the blood supply of these fast-growing masses of cancer cells."

"But to the surprise of many, Dr. Gilbert’s study found no difference in survival between those who were given Avastin and those who were given a placebo."

The smaller studies were performed without comparison groups, so any improvement observed could be attributed to the action of the drug whether or not the drug was actually responsible.

Leaf tells us that many physicians believe, from personal experience, that Avastin was helpful for some of their patients. One could discount the anecdotal data as unreliable and, based on the Gilbert results, conclude that the medication was of no value in this case.

Leaf seems to accept the anecdotal claims and draws a different conclusion: the drug works, but only on a small class of patients.

"Some patients did do better on the drug, and indeed, doctors and patients insist that some who take Avastin significantly beat the average. But the trial was unable to discover these ‘responders’ along the way, much less examine what might have accounted for the difference."

No numbers are presented to support Leaf’s conclusion. Nevertheless, let’s see where this leads. Leaf extrapolates that conclusion and applies it more broadly, suggesting that response to drugs is highly specific to the individual patient and that broad studies are not well designed to recognize these differing medical outcomes.

"Researchers are coming to understand just how individualized human physiology and human pathology really are. On a genetic level, the tumors in one person with pancreatic cancer almost surely won’t be identical to those of any other. Even in a more widespread condition like high cholesterol, the variability between individuals can be great, meaning that any two patients may have starkly different reactions to a drug."

"Which brings us to perhaps a more fundamental question, one that few people really want to ask: do clinical trials even work? Or are the diseases of individuals so particular that testing experimental medicines in broad groups is doomed to create more frustration than knowledge?"

Leaf then goes even further with this thought and attributes the withdrawal of a number of medications from the market to a lack of understanding of how they work on individuals.

"That’s one reason that, despite the rigorous monitoring of clinical trials, 16 novel medicines were withdrawn from the market from 2000 through 2010, a figure equal to 6 percent of the total approved during the period. The pharmacogenomics of each of us — the way our genes influence our response to drugs — is unique."

Readers who recall only the drugs removed from the market because they were found to be dangerous, drugs that had been approved on the basis of clinical testing that was often shoddy and purposely misleading, might wonder which medications Leaf is referring to. He provides no information to support the claim.

Taking Leaf’s reasoning to its logical conclusion, a drug that helps 10 people, kills 10 people, and has no effect on 1,000 others is a perfectly good drug. One merely needs to figure out who the 10 people are that it will help and give the drug only to them.
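To see why a broad trial would read such a drug as worthless, it helps to run the averages. Below is a minimal sketch in Python; the patient counts are those of the thought experiment above, while the effect sizes are invented purely for illustration.

```python
# Hypothetical drug from the thought experiment above: it helps 10
# patients, kills 10, and has no effect on 1,000 others. The effect
# sizes (months of survival gained or lost) are invented for
# illustration; only the patient counts come from the text.

helped    = [+24.0] * 10     # dramatic benefit for a few responders
harmed    = [-24.0] * 10     # equally dramatic harm for a few others
no_effect = [0.0] * 1000     # nothing at all for the vast majority

all_patients = helped + harmed + no_effect
average = sum(all_patients) / len(all_patients)

print(f"{len(all_patients)} patients, average effect: {average:+.1f} months")
# -> 1020 patients, average effect: +0.0 months
# The trial-wide average is exactly zero, so a conventional broad
# trial reports "no difference," even though 10 individuals
# benefited dramatically.
```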

Leaf seems to believe that a major fault in current clinical testing arises from using sample sizes too small to accurately observe effects on the scale of interest.

Let’s now look at Leaf’s claims from the point of view of a drug company. If it could prove that a given drug was very effective, but for only 5% of the population, and that it could detect that 5%, it would have a very strong case for having that drug approved for use. If this were the case, then a pharmaceutical industry struggling to maintain profit growth and to develop new products would suddenly have an endless supply of new drugs to pursue. If a drug applies to only 5% of the population for genetic reasons, then there is the potential for many more variations (20?) to address the remaining 95% of the population. Each would, of course, be expensive to develop and come with a hefty price tag, presumably much higher than the price of a single drug that applies to a large population.

Let’s hold that thought for a moment and consider the article by Freedman.

Freedman focuses on the work of Dr. John Ioannidis and his associates. Ioannidis has concluded that the data emerging from clinical trials cannot be assumed to be trustworthy.

"He’s what’s known as a meta-researcher, and he’s become one of the world’s foremost experts on the credibility of medical research. He and his team have shown, again and again, and in many different ways, that much of what biomedical researchers conclude in published studies—conclusions that doctors keep in mind when they prescribe antibiotics or blood-pressure medication, or when they advise us to consume more fiber or less meat, or when they recommend surgery for heart disease or back pain—is misleading, exaggerated, and often flat-out wrong. He charges that as much as 90 percent of the published medical information that doctors rely on is flawed."

That sounds like the opinion of some marginalized crank—but it isn’t.

"His work has been widely accepted by the medical community; it has been published in the field’s top journals, where it is heavily cited; and he is a big draw at conferences. Given this exposure, and the fact that his work broadly targets everyone else’s work in medicine, as well as everything that physicians do and all the health advice we get, Ioannidis may be one of the most influential scientists alive."

He is most famous for an article published in the Journal of the American Medical Association.

"He zoomed in on 49 of the most highly regarded research findings in medicine over the previous 13 years, as judged by the science community’s two standard measures: the papers had appeared in the journals most widely cited in research articles, and the 49 articles themselves were the most widely cited articles in these journals. These were articles that helped lead to the widespread popularity of treatments such as the use of hormone-replacement therapy for menopausal women, vitamin E to reduce the risk of heart disease, coronary stents to ward off heart attacks, and daily low-dose aspirin to control blood pressure and prevent heart attacks and strokes. Ioannidis was putting his contentions to the test not against run-of-the-mill research, or even merely well-accepted research, but against the absolute tip of the research pyramid."

What did he discover?

"Of the 49 articles, 45 claimed to have uncovered effective interventions. Thirty-four of these claims had been retested, and 14 of these, or 41 percent, had been convincingly shown to be wrong or significantly exaggerated. If between a third and a half of the most acclaimed research in medicine was proving untrustworthy, the scope and impact of the problem were undeniable."

Ioannidis has arrived at a rather simple, although startling, explanation for why medical studies are so often wrong or misleading.

"’The studies were biased,’ he says. "’Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there.’ Researchers headed into their studies wanting certain results—and, lo and behold, they were getting them. We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it’s easy to manipulate results, even unintentionally or unconsciously. "’At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded," says Ioannidis. ‘There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded’."

Most of the initial data on drug efficacy is produced by the drug companies themselves. Consequently, the bias that Ioannidis sees being introduced has the drug companies as its major source. Ben Goldacre wrote in detail about how this is accomplished in his book Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients. In his chapter describing how to manipulate clinical trials and mislead the medical community he lists 15 techniques that have been used. Misrepresentation of medical results is easy—and it is common.

Bigger and more expensive testing will not help if the trials are improperly designed and analyzed.

The notion that medications can be tailored to the specific physical responses of the individual patient is exciting. It could be an incredible medical advance or it could prove to be impractical and end up a bust. In either event, to move forward intelligently, and to avoid being swindled, more control must be exerted over a testing process that has misled us so often in the past.

 

Clifton Leaf is the author of "The Truth in Small Doses: Why We’re Losing the War on Cancer — and How to Win It."

Monday, August 5, 2013

Saving Detroit: Infrastructure and Employee Pensions

The state of Michigan has declared the city of Detroit to be bankrupt. A financial manager, Kevyn Orr, has been designated as "emergency financial manager," a position with broad powers that essentially puts him in charge of the city. Where this leads, no one knows.

The various stakeholders will be making arguments before a federal bankruptcy judge. Detroit is said to have a debt of $18 billion, of which about half is said, by someone’s calculation, to be a pension-funding shortfall. A considerable fraction of the remainder is said to be uncovered healthcare commitments to public employees. There are two issues that need to be addressed: how to deal with the present debt, and how to make Detroit a financially viable community once more. Both must be addressed in a coherent fashion.

John Cassidy provides some perspective on Detroit’s problems in a brief article in The New Yorker: Motown Down.

"Contrary to what some commentators have been arguing, however, Detroit’s troubles can’t be traced simply to bloated payrolls and intransigent public-sector unions: decades of deindustrialization are the main culprit. The population peaked in 1950, at 1.85 million. Since then, as the auto industry declined, and almost all the city’s white residents moved to the suburbs, the population has dropped by about sixty per cent. The city’s payroll has fallen even faster. In 1951, Detroit employed nearly thirty thousand people. Today, it employs about ten thousand five hundred people, and their salaries and their benefits are hardly extravagant. Since 2010, through furloughs and other measures, the city has cut its employees’ wages by close to twenty per cent. The average municipal pension is nineteen thousand dollars a year."

Cassidy fears that Detroit will become a battleground where political conservatives can continue their attacks on unions and public-sector pensions.

"As things stand now, the proceeding could degenerate into an exercise in privatization, union busting, and the imposition of further sanctions: Orr proposed that the unfunded portion of the city’s health-care and pension benefits be cut by up to ninety per cent."

The Obama administration has thus far indicated that this is a situation that Michigan, Detroit, and the stakeholders will have to resolve themselves. Cassidy argues that this stance is politically expedient, but inconsistent with federal actions in the cases of other disasters.

"Earlier this month, in the Times, Steven Rattner, who was the Obama Administration’s point man on the auto bailout, noted that people living in Detroit are no more responsible for their woes than are people who live in parts of the country devastated by Hurricane Sandy, areas that were awarded tens of billions of dollars in federal aid."

An even more relevant example might have been New Orleans after Katrina. The cost of rebuilding that city and securing it from the threat of future storms far exceeds anything that might be needed to get Detroit back on track. A slowly building economic disaster is no less a disaster than hurricane-induced flooding.

Cassidy correctly argues that Detroit provides a perfect example of where the federal government should step in and invest in upgrading the city’s infrastructure so that it can once again attract businesses and residents who are looking for the excitement of city living. The auto companies received $80 billion to help them get back on their feet. Shouldn’t the city that shared their hard times also get some help?

Detroit raises the issue of how to rescue a city that has been overcome by events. How does one provide it with a new start? It also provides an opportunity to seriously consider, at a national level, how to deal with accumulating pension liabilities that everyone agrees are underfunded.

An article in The Economist provides some useful background information. Tales are told of public employees who abuse their pension plan’s terms and walk off with exorbitant retirement income. Such things do happen, but they are not the norm.

"Most public-sector workers do not benefit from such boondoggles. The average pension payment in California is around $29,000 a year; in Detroit it is $19,000. In some states, those who receive public-sector benefits do not qualify for Social Security (the federal pension scheme that applies to nearly all workers, public- and private-sector). So their pension may be their only income when they are frail."

Public-sector pensions appear overly generous only because private-pension options are so miserly. The strategy should be to preserve the legitimate benefits available to public employees and to construct a path whereby private-sector employees can attain something similar. The topic of conversation should not be what can be done about public pensions, but rather what can be done about private-sector pensions.

The first step is to protect Detroit’s employees and retirees from predatory political ideologues. The legal status of pension commitments is cloudy.

"....the legal protections granted to employee pensions are a matter of state, not federal, law. Courts have tended to treat pension rights in two ways: either they are deemed to be "property interests" or they are subject to the law of contract. If they are property interests, the courts seem willing to allow rights to be reduced, particularly in the case of a financial crisis."

"Where pension rights are regarded as contracts, however, change is much more difficult. The federal constitution says no state shall pass a law impairing contracts. Some courts have interpreted this to mean that pension benefits are sacrosanct. Some states, including Illinois, have constitutions saying that public-sector pensions cannot be diminished or impaired. What does this mean? Some courts have ruled that while benefits already earned must be protected, future benefits can be reduced. Others have ruled that the pension promises made when an employee starts work must be kept for the rest of his life."

It is not clear what the status of Detroit’s system is, but it must at least be considered an implied contractual agreement.

When AIG was given $182 billion of taxpayers’ money to rescue itself from its own stupidity, it was discovered that 400 AIG employees were targeted to receive $165 million of that money as bonuses for their roles in trying to destroy the world economy. The public was outraged at such arrogance. The powers in charge at the time decided that it would be inappropriate and illegal to force the violation of a contract, so the bonuses were paid.

If the financial rights of the bad actors at AIG could be sheltered by the sanctity of contracts, why not the financial commitments to the innocent bystanders who are the public employees and retirees of Detroit?

The claim has been made that defined-benefit pension plans are too expensive for businesses. The response has been to move to defined-contribution plans (401(k) or 403(b)) in which funds depend on the level of employee contribution, the wisdom of the employee’s investment decisions, and the fees extracted from the earnings by financial institutions.

This approach can be made to work, but many companies provide no such plan, and many that do contribute so little that saving for the future is mainly the employee’s responsibility.

In Moving to a National Retirement Plan we discussed how the equivalent of a 401K plan could be instituted at a national level with moderate contributions required from both employers and employees. In principle, existing defined-benefit plans could be converted to this national plan by funding the participants’ accounts at appropriate levels that would approximate the anticipated earned benefits.
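What might "funding the participants’ accounts at appropriate levels" look like? One simple approach is to compute the present value of the promised payment stream. The sketch below uses Detroit’s $19,000 average pension from the quote above; the discount rate and payout period are assumptions chosen only to make the arithmetic concrete.

```python
# Illustrative only: what account balance would approximate an earned
# defined benefit? One simple answer is the present value of the
# promised payment stream. The $19,000 annual benefit is Detroit's
# average pension quoted above; the 4% discount rate and 25-year
# payout period are assumptions made for this sketch.

annual_benefit = 19_000.0   # dollars per year
rate = 0.04                 # assumed discount rate
years = 25                  # assumed payout period

# Present value of a level annuity: P * (1 - (1 + r)**-n) / r
present_value = annual_benefit * (1 - (1 + rate) ** -years) / rate
print(f"Approximate funding needed per retiree: ${present_value:,.0f}")
# -> roughly $297,000 per retiree under these assumptions
```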

Hopefully, Detroit’s plight will generate not precipitous action but rather a dialogue on how best to prepare all workers for a more stable economic future.

Thursday, August 1, 2013

Raising the Minimum Wage at Walmart and McDonald’s

Fast-food workers have received a lot of publicity for staging one-day "strikes" and demanding a dramatic increase in their wages. Most earn at or near the minimum wage of $7.25 per hour. This article by Matt Nesto on Yahoo Finance is typical of the reporting.

"You could call it the French fry revolution, as fast food workers from seven cities take to the streets this week, demanding to be paid a living wage of $15 an hour from the restaurants that employ them."

The typical commentary usually includes the implication that if you earn that little then you are incapable of understanding the issues involved.

"However reasonable the "living wage" idea may seem on the surface, labor costs are a very real and complex issue for all U.S. businesses, but especially those that employ the most unskilled workers."

In other words, if you require food stamps, Medicaid, free school meals for your children, and an Earned Income Tax Credit to survive, your problems are small compared to those of the people paying you that paltry salary.

"It would also come at a time when fast-food sales are already under pressure in the U.S. and struggling to grow...."

And who do owners think their fast-food customers are—the top 1%? Have they forgotten the Henry Ford principle: we need to make a product that our employees can afford?

And then there is the inevitable claim that the little, naive people will only end up hurting themselves.

"....industry watchers predict a dramatic wage increase would likely lead to shorter shifts or outright lay-offs, further automation (think self-check kiosks), as well as reduced opportunities for unskilled teenagers."

Why do we need an industry that caters to the needs of unskilled teenagers? Unskilled teenagers should be in school on their way to becoming skilled adults. If the fast-food industry could save on labor through automation, it would do so, and as automation becomes cheaper it will. The workers realize that they are being told: "if you work for this unsustainable wage, it will be a little longer before we are able to fire you."

Finally, there is the bizarre claim that fast-food workers actually have it better than they realize.

"Add in the fact that many so-called quick serve restaurant workers are already paid above the minimum wage, as well as the fact that opportunities abound for motivated employees to advance from the bottom rung of the labor ladder, and this effort to move fast food forward, could end up setting a lot of people backwards."

So these minimum-wage workers are going to foolishly put at risk all those opportunities for fame and fortune that producing a hamburger can provide.

Raising the salary of these workers would bring the fast-food industry to its knees and cause economic chaos. Massive unemployment would follow. Right?

It is not that simple.

Fast-food enterprises have been trying for decades to minimize wage expenses as a fraction of revenue. They have become rather good at it. An article by Clare O’Connor at Forbes.com reports on the work of an enterprising student named Arnobio Morelix at the University of Kansas School of Business. He analyzed McDonald’s annual reports and concluded that doubling all wage expenses would require McDonald’s to increase its revenue by 17% to maintain its profit level. That is interpreted as requiring a 17% increase in the price of its products.

"Morelix’s take: If McDonald’s workers were paid the $15 they’re demanding, the cost of a Big Mac would go up 68 cents, from its current price of $3.99 to $4.67."

"A Big Mac meal would cost $6.66 rather than $5.69, and the chain’s famous Dollar Menu would go for $1.17."

Note that the doubling of wage expenses included a doubling of benefits as well, and it covered all employees, including the CEO and his current $8.75 million compensation.

"The research assistant said his math is based on increases in salaries and benefits for every McDonald’s worker, from minimum wage line cooks paid $7.25 an hour to CEO Donald Thompson, who made $8.75 million in 2012."

This approach provides an upper bound on the cost of implementing a $15 minimum wage. In that case, benefits would not be doubled, many workers would see less than a factor-of-two change in wages, and many would see no wage gain at all.
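Morelix’s arithmetic is simple enough to verify. The sketch below applies his uniform 17% markup to the prices cited in the Forbes piece; the uniform-markup assumption is his, not a statement about how McDonald’s would actually reprice its menu.

```python
# Check of the Morelix estimates quoted above. Assumption (his): a
# uniform 17% price increase offsets a doubling of all wage and
# benefit expenses while leaving profits unchanged. Prices are the
# ones cited in the Forbes piece.

PRICE_MARKUP = 0.17  # revenue increase needed to cover doubled wages

current_prices = {
    "Big Mac": 3.99,
    "Big Mac meal": 5.69,
    "Dollar Menu item": 1.00,
}

for item, price in current_prices.items():
    print(f"{item}: ${price:.2f} -> ${price * (1 + PRICE_MARKUP):.2f}")

# Output:
#   Big Mac: $3.99 -> $4.67
#   Big Mac meal: $5.69 -> $6.66
#   Dollar Menu item: $1.00 -> $1.17
```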

For the purposes of argument, let’s assume that a $15 minimum wage requires a blanket 10% price increase to maintain the status quo. And since we are talking about a universally applied minimum wage, all of McDonald’s competitors would face the same situation. Is a 10% price rise going to destroy the fast-food industry? Would a 17% increase destroy it?

Every time a fast-food worker receives food stamps, Medicaid assistance, an Earned Income Tax Credit, free school lunches for his or her children, or a housing subsidy, the taxpayers are contributing money to the industry’s profit line. How can conservative, free-market economists who are so opposed to legislating wage constraints be, instead, in favor of public subsidies to an industry with an unsustainable business model?

If there is an argument to be made against a much larger minimum wage, one will have to look beyond the McDonald’s of the world.

Another segment of the economy that employs large numbers of low-wage workers is the retail industry. It so happens that a detailed study of the impact of a $12 minimum wage on Walmart, the industry leader, has already been performed by the Center for Labor Research and Education at the University of California, Berkeley. This chart was included in the report:



The wage increase applied to Walmart’s US business would add a significant $3.2 billion to the company’s wage expenses. However, that is only 1.1% of its US revenue. In order to maintain its profit level, Walmart would have to raise its prices by a paltry 1.1%. If a $0.98 item were to cost $0.99 instead, would anyone even notice?
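The Berkeley numbers are just as easy to check. In the sketch below, the $3.2 billion cost and the 1.1% revenue share come from the study as quoted; the implied revenue total and the $0.98 item are merely illustrative.

```python
# Back-of-the-envelope check of the Berkeley study's figures. The
# $3.2 billion added wage cost and its 1.1% share of US revenue are
# quoted above; the implied revenue figure and the $0.98 item are
# illustrative.

added_wage_cost = 3.2e9    # added annual wage expense, dollars
revenue_share = 0.011      # that cost as a fraction of US revenue

implied_revenue = added_wage_cost / revenue_share
print(f"Implied US revenue: ${implied_revenue / 1e9:.0f} billion")
# -> Implied US revenue: $291 billion

item_price = 0.98
new_price = item_price * (1 + revenue_share)
print(f"A ${item_price:.2f} item becomes ${new_price:.2f}")
# -> A $0.98 item becomes $0.99
```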

The same minimum wage would apply to all participants in the retail industry. Its application would not be likely to affect the competitive structure of the retail segment of the economy. It is difficult to see how a 1.1% price hike could undermine the economy. As in the case of fast food, retailers who insist on a low-wage business model are receiving public subsidies in order to maintain those business practices.

There is no economic law that determines what a minimum wage should be. That has to be a decision made by citizens based on the kind of society they wish to live in. Other nations have made quite different decisions and have rather healthy economies. This chart of OECD data was provided by The Economist.



Even the other English-speaking nations—often as socially stingy as the US—put us to shame when it comes to putting a floor on wages.