Wednesday, September 30, 2015

Corruption, Lobbying, Judicial Prerogatives, and Term Limits for the Supreme Court

Recent history provides several examples where 200 years of judicial findings have been overturned because a few Supreme Court Justices decided that they knew better.  The Constitution had not changed, but politics and political influence had.  The lesson to be learned is that the Constitution, like the Bible, can be used to justify almost anything.  Words are ambiguous, and become more so over time.  A text intended to convey one meaning can be reinterpreted to produce an entirely different result.  Time, place, and cultural evolution render a written constitution surprisingly indeterminate.

Zephyr Teachout provides a well-documented history of how judicial interpretation evolved over time in her important book Corruption in America: From Benjamin Franklin's Snuff Box to Citizens United.  She makes the case that the writers of the Constitution were determined to avoid the types of corruption that were common in England and other European countries, such as the presentation of gifts to public officials.  The belief was that the best way to avoid a problem with a gift from a foreign power was to make any such gift acceptable only with the consent of Congress.

“The argument of this book is that the gifts rule embodies a particularly demanding notion of corruption that survived through most of American legal history.  This concept of corruption is at the foundation of the architecture of our freedoms.  Corruption, in the American tradition, does not just include blatant bribes and theft from the public till, but encompasses many situations where politicians and public institutions serve private interests at the public’s expense.  This idea of corruption jealously guards the public morality of the interactions between representatives of government and private parties, foreign parties, or other politicians.”

Allowing corporations to contribute their wealth to enhance the electability of a specific lawmaker would have sent our founders into cardiac arrest.  They believed that the best way to avoid corruption was to eliminate the possibility of it occurring.  Teachout refers to these types of rules as structural or prophylactic rather than criminal.  A campaign contribution of any kind from a private entity can be corrupting—therefore it must not be allowed to happen.

For 200 years the Supreme Court issued rulings that were generally consistent with the intentions of our founders.  However, recent rulings have changed things entirely, making what was once considered corruption an integral part of our political and economic lives. 

Teachout is particularly incensed over the Citizens United ruling that allowed unlimited use of wealth to influence election outcomes, a result that left the once-corrupt Europeans laughing at us and our folly. 

Teachout provides an interesting but lesser-known history of the legal status of lobbying that also serves to make her point about the changing perception of corruption.  The issue of lobbying is particularly problematic because it can be beneficial as well as harmful to the public good.

“….lobbying involves the production and communication of information and reason.  When viewed in this light, it should be not only protected but elevated.  On the other hand, the social function of lobbying is to take money and turn it into political power.  Lobbyists are hired as alchemists, to turn money into power through the production of information and the careful use of influence….Where it is effective, lobbying means that the full power of the government shifts itself to serve the social goals of those who can afford lobbyists.  Lobbying, at its worst, enables the extraction of public resources from the public.”

While providing knowledge and advice is a useful task, lobbyists are not required to provide unbiased information.  The effective ones can manipulate data to their advantage and confuse as well as enlighten.  Even worse, they become adept at implying offers of assistance to politicians.  A suggestion that campaign funds may be forthcoming, or perhaps a hint of a remunerative job, or any number of other enticements can be effective at getting what they want without breaking any explicit bribery laws.  Both parties are expert in playing this game.

Originally lobbying was viewed as highly suspect, but writing an explicit law to deal with it was not easy.  An individual pleading his own case to a lawmaker was considered the extent to which lobbying could be tolerated.  Paying someone else to plead his case was considered corrupting.  Early on, paid lobbying was dealt with in the courts as a contract issue.  One could hire a lobbyist, but any associated contract was invalid.  Teachout attributes to one Supreme Court Justice this quote:

“….all agreements for pecuniary considerations to control the….ordinary course of legislation are void as against public policy.”

In Trist v. Child (1874) the Supreme Court issued this statement on the matter of paid lobbying.

“If any of the great corporations of this country were to hire adventurers who make market of themselves in this way, to procure the passage of a general law with a view to the promotion of their private interests, the moral sense of every right-minded man would instinctively denounce the employer and employed as steeped in corruption, and the employment as infamous.”

Such statements express an opinion on lobbying but do not eliminate it.  The state of Georgia was so outraged at the practice—and its effectiveness—that it was moved to declare lobbying a punishable crime in its 1877 constitution.

“The majority of the Georgia convention represented the mainstream view of a different time in American history.  Throughout the country, from the early 1830s through the early 1930s the sale of personal influence was treated as a civic wrong in the eyes of the law.  A citizen did not have a personal right to pay someone else to press his or her legislative agenda.  Nor did anyone have a right to be paid to use personal influence for legislation.  Paid lobbying was looked down upon, criminalized in some cases, and treated as against public policy.”

Teachout tells us that legal and public attitudes gradually changed as time went on.  The sanctity of contracts, lobbying or otherwise, became more important than any odious characteristic of the contract.  Lobbyists, of course, sold themselves as professionals providing a valued service rather than as those who would choose to bribe a legislator.  Indeed, the definition of bribery itself changed.  Only strict quid pro quo acts could be considered criminal.  One would have to be caught exchanging a bag of money to be legally liable for one’s actions.  Exercising undue influence became the norm, perfectly acceptable as a democratic response.  And, perhaps most important of all, the First Amendment gained more importance for the crop of legal specialists active in the middle of the twentieth century.  The freedom of speech of an individual became more important than the equitable workings of legislatures for the benefit of society as a whole.  And, of course, corporations became individuals.

“One might think—reasonably—that a major Supreme Court decision was required to overturn this massive body of law.  But the lobbying cases were never directly overturned; they were gradually shunted aside.  When the Supreme Court in Citizens United mentioned in passing that ‘Congress has no power to ban lobbying itself,’ it could cite no direct reference.”

It is interesting to consider that the people most affected by lobbying and influence peddling, those who must face corruption issues on a daily basis, were the most interested in erecting barriers against private parties exercising too much influence.  The Supreme Court basically announced that it understood what corruption meant better than the senators and congressmen themselves and overruled them.  Teachout rendered this assessment:

“The Court has become populated by academics and appellate court justices, and not by people with experience of power and politics, who understand the ways in which real problems of money and influence manifest themselves.  The lack of experience is compounded by a tendency to decide cases without full factual development.”

The Citizens United decision and the results that followed find no justification in the Constitution, and are definitely at odds with the intentions of the writers of the Constitution.  Yet the Justices who voted in favor advertise themselves as various flavors of “originalists.”

What lessons are to be drawn from this discussion?  Teachout most fears that the Court has paved the way for the wealthy to rule the nation: an oligarchy.  That is of course true. 

It appears true that the Court can do what it wants irrespective of legal precedent and constitutional ambiguity.  It also appears true that the Court can be influenced by public opinion.  A fundamental error may have been made when it was decided to shelter Supreme Court Justices from political influence by providing them with lifetime terms of office.  It is not possible to shield anyone from political influence.  No person ever arrived at the Court without a set of political biases.  In fact, they are generally nominated for the Court because of the way in which they are biased.

If political influences are unavoidable, perhaps it would be more efficacious to ensure that Justices more closely represented current political sentiments of the majority of the voting public, rather than pretend such influences don’t exist.  Justices are nominated by the sitting President, and the presidency is the only office determined by voters all across the nation.  The President comes closest to being a representative for all the people.  Why not provide the sitting president more power to define the makeup of the Court? 

A nine-person Court serving 12-year terms, with three being replaced every four years, might be a more effective way of representing the will of the people.  Added protection from judicial overreach could be attained by requiring six votes to approve any ruling that overturns judicial precedent.


Saturday, September 26, 2015

Increasing Inequality By Purposely Diminishing Government Productivity

In a complex and rapidly changing economy it is easy to conclude that there is an uncontrollable dynamic at work: the world moves according to the flow of technology and no one knows where that will lead.  If the trend is to create a few economic winners and a lot of losers, then that is just the way it has to be.  Anthony B. Atkinson refuses to accept the inevitability of such market solutions.  In his book, Inequality: What Can Be Done?, he continually reminds us that government, as the representative of all the people, can and must play a role in determining where technology might lead us.

Atkinson is concerned with identifying actions that are consistent with economic growth and will assist in limiting the continuing increase in inequality.  He arrives at 15 distinct proposals for actions that could be taken, and suggests several others that require more study.  The very first addresses the influence of the government as an investor in technological progress.  He points out that the research that has provided us with the high-tech gadgets we have so come to love was directed and funded by the federal government.  The government has a definite role to play in developing technology, and it also has a role in defining how technology will be implemented. 

“….when making decisions supporting innovation—whether concerned with financing, licensing, regulating, purchasing, or educating—the government should explicitly consider the distributional implications.”

A number of companies have sprouted in recent years that, under the mantle of economic efficiency, have attempted to replace full-time regulated workers with part-time workers available on demand who receive little in terms of benefits.  Take Uber, for example.  What are the “distributional implications” of this trend?  If the result is that a new category of low-paid day-laborers is being created to compete with full-time workers with benefits, then the government could and probably should intervene.

The last decade has seen many semi-skilled jobs replaced by innovative uses of technology.  The result has been movement of workers into less-skilled, lower-wage positions in service industries.  It is often assumed that service jobs are less amenable to innovation and increases in productivity.  Atkinson believes this is a false assumption, and when it comes to a critical service provider—government—it is a dangerous assumption.  The efficient administration of governmental services is of the utmost importance in a healthy society.

“The achievement of an equitable society depends to a considerable degree on the effectiveness of the public administration and the quality of its dealings with citizens.  Repressive administration may be cheaper, but a fair society needs to ensure that its operations—in the fields of taxation, public spending, regulation, and legislation—are just, transparent, and accepted.  This requires resources.  Moreover, as societies become richer, so too they become more demanding in their standards.”

Those who are experiencing social or economic difficulty are most in need of a public administration designed to facilitate the dispersion of aid.

“Economic inequality is often aligned with differences in access to, use of, or knowledge of information and communication technologies.  For middle-class taxpayers, filing a tax return on-line may be a time-saving operation, but for a person who has just become unemployed, applying for benefits on-line may be a worrisome challenge.”

A few examples from the United States will illustrate Atkinson’s point about the need for efficient public administration if fairness and justice are to be maintained.  Since Atkinson brought up the issue of tax returns, let’s begin there.

For most people, filing a tax return is a matter of gathering together information that is already known to the Internal Revenue Service (IRS), specifying that information in a form and sending it on to the IRS so it can check to see if the data has been properly reported.  It is immediately apparent that there is an unnecessary step here.  Why not have the IRS accumulate the information and convert it into a tax assessment specifying either a bill for any underpayment of taxes or a declaration of an overpayment and eligibility for a refund?  That is exactly what the IRS must do anyway to check a submitted tax form.  The person could then choose to accept the assessment or enter modifications for items that had been missed or were incorrect and resubmit it.  This process is in place in a number of countries and seems to work just fine.

The investigative website ProPublica explains why such a reform cannot happen in our country in an article by Liz Day: How the Maker of TurboTax Fought Free, Simple Tax Filing.

“Imagine filing your income taxes in five minutes — and for free. You'd open up a pre-filled return, see what the government thinks you owe, make any needed changes and be done. The miserable annual IRS shuffle, gone.”

“It's already a reality in Denmark, Sweden and Spain. The government-prepared return would estimate your taxes using information your employer and bank already send it. Advocates say tens of millions of taxpayers could use such a system each year, saving them a collective $2 billion and 225 million hours in prep costs and time, according to one estimate.”

“So why hasn't it become a reality?”

“Well, for one thing, it doesn't help that it's been opposed for years by the company behind the most popular consumer tax software — Intuit, maker of TurboTax. Conservative tax activist Grover Norquist and an influential computer industry group also have fought return-free filing.”

In an ideal world, the government would be striving to make the tax collecting process as simple and transparent as possible, by providing pre-filled returns for as many people as possible.  That is what productivity in the service sector is all about.  In fact, the IRS could provide the exact same service that TurboTax provides—and provide it for free!  How is that for productivity?

Instead, we have commercial companies that exert their influence by throwing money around to protect their parasitic revenue streams.  They are assisted by ideologues like Grover Norquist whose goal is not efficient government, but rather the failure of government.  He and his cronies in the Republican Party wish to starve the IRS of funds and watch it perform so poorly that people will resist paying any federal taxes.

For many, this conflict between government efficiency and corporate or political profit is of little concern.  Unfortunately, it is often those with the least earnings who are forced to get assistance from for-profit agencies in order to file their tax forms.  Financial records become complicated when one makes a lot of money.  They also become complicated when one earns little income and receives public assistance.  If some people have their way, the process will grow even more complicated.

The Bloomberg Editorial Board published a note titled How to Make the Tax Code Even Worse.  Here we are warned that one of the most effective tax features—one that has long had bipartisan support—is being undermined by corporate greed and political opportunism.

“Consider a scheme Congress may soon take up for some individuals applying for the earned income tax credit, once a relatively straightforward process. If a report from the Senate Appropriations Committee is heeded, what was once a one-page form for claiming the credit could expand to four or five pages, padded out by pointless and bewildering new questions.”

“The likely result would be that those eligible would either forgo their credits or hire tax services for help. As it happens, H&R Block, the largest such service, helped write a report recommending the changes. Complexity has its advocates.”

In addition to the corporate profit motive, there are the politicians who believe that any assistance to a low-income person is a disincentive to work.  Any welfare assistance should therefore be small in amount, difficult to obtain, and of short duration.  That explains why some would be in favor of more complex tax reporting for the earned income tax credit—it is better to harm 10 people than to let one person get more than is allowed under the law.

Nowhere is government more inefficient and more purposely harmful than in its administration of the welfare system.  Kaaryn Gustafson, a law professor at the University of Connecticut, published an article in the Journal of Criminal Law and Criminology in 2009 titled The Criminalization of Poverty.

“Lost in these contemporary understandings of welfare is the association of welfare with wellbeing, particularly collective, economic wellbeing. Many of the current welfare policies and practices are far removed from promoting the actual welfare of low-income parents and their children. The public desire to deter and punish welfare cheating has overwhelmed the will to provide economic security to vulnerable members of society. While welfare use has always borne the stigma of poverty, it now also bears the stigma of criminality.”

“Throughout the 1970s, '80s, and '90s, the value of the welfare grant, adjusted for inflation, declined dramatically. The weighted average maximum benefit per three-person family was $854 in 1969 (in 2001 dollars), but plummeted to $456 by 2001. It became increasingly hard for welfare recipients to cover their most basic expenses-food, clothing, and rent-with their welfare grants. Unable to survive on welfare checks and facing barriers to employment, many welfare recipients turned to other sources of income, whether help from kin or participation in underground labor markets, and attempted to hide those sources from the welfare office for fear of losing the small checks they received.”

The government was purposely made less efficient, with the express intent of denying assistance to those unable to keep up with complex reporting requirements.  Social workers who were once directed to provide assistance to those in need became fraud detectors whose goal was to find cheating and withdraw assistance.

“Office caseworkers, hired to replace the social workers, processed the routine paperwork that welfare recipients regularly submitted to the office to document their continuing financial need. In a process known as "churning," the federal government increased the amount of information and paperwork required to determine welfare eligibility, and denied benefits to low-income families who failed to keep up with the paperwork. Income-eligible families were removed from the aid rolls for their failures to provide verification documents in a timely manner.”

Being caught and punished for some violation of the rules (the most common of which seems to be missing a mandatory meeting at the welfare office) is referred to as sanctioning.  States have the discretion of punishing an individual by limiting his or her access to funds, or they can choose to punish an entire family for one member’s sins with a full-family sanction.

“….a study conducted by Yeheskel Hasenfeld found that approximately half of the sanctioned adults surveyed did not know they had been sanctioned.   For these families, the welfare system may seem so complex, arbitrary, and mystifying that they cannot determine why their benefits are fluctuating. This suggests that rather than creating a set of incentives that will ‘make work pay,’ the current welfare system is simply punishing people who cannot figure out how the system works.”

More on our current welfare system can be found in Race and the Criminalization of Welfare.

Atkinson is certainly correct in assuming that government provision of services can be made more efficient.  He is also correct in concluding that those most in need of government assistance have much to gain.  Inefficient government does contribute to inequality.


Sunday, September 20, 2015

Hitler, Jews, Genocide, and Food Security

There are a number of nations, variously labeled as Central or Eastern European, whose territorial boundaries were ill-defined and whose ethnic makeup was complex in the interwar period.  These were lands whose position put them between Germany and Russia both geographically and politically.  World War I destroyed the entities that maintained order in this region as empires disappeared and left control to local populations, often leading to ethnic confrontations.  It was in this region that most of the violence associated with World War II and the Holocaust occurred. 

Timothy Snyder, a professor of history at Yale University, has made the understanding of the history and culture of this region one of his main research topics.  He has produced the book Black Earth: The Holocaust as History and Warning.  In it, Snyder attempts to place the Holocaust, the persecution and murder of Jews, into historical and political perspective.  After almost two millennia of anti-Jewish preaching by the Christian Churches, it was inevitable that violence against Jews would erupt during chaotic war years.  What is less well known and less well understood is the degree of violence that also ensued between other ethnic groups.  Ukraine was an area of especially high chaos as it was invaded and reinvaded during the war years.  Ukrainian nationalists attempted to eliminate the ethnic Poles that lived in the area with a level of brutality that may have exceeded anything faced by Jews.

Snyder wishes to make sure that we realize that the Holocaust was only one of the black marks on humanity left by the events that occurred in this region.  Violent anti-Semitism has been with us for a very long time—and we have a sense of how that arose.  There are, perhaps, deeper and more frightening things to learn about humanity from a more general consideration of the other massacres that occurred.  Does the chaos of war force people to do violent things, or does it provide them with the opportunity to do violent things?  Is violence the innate response—or is it the exception?  What exactly does it take to drive formerly peaceful people to become mass murderers?

The focus here will be on the more limited topic of food security and what the lack of it can cause.

Snyder has recently published two short articles discussing points that he wants to make sure are read by a broader audience than is likely to delve into his book.  The first appeared in The New York Review of Books: Hitler’s World.  Here, Snyder makes clear Hitler’s motives and philosophy, and explains how they led to war and genocide.  The second appeared in the New York Times with the title The Next Genocide.  A Hitler is not likely to be seen again, but the conditions that led him to pursue war and mass murder could recur.

Let us begin by learning about Hitler and his world.  Hitler believed races lived according to a law of the survival of the fittest.  The strong contend against the weak in order to maintain their dominance and acquire the resources they need to survive and thrive. 

“Nothing can be known about the future, thought Hitler, except the limits of our planet: ‘the surface area of a precisely measured space.’ Ecology was scarcity, and existence meant a struggle for land. The immutable structure of life was the division of animals into species, condemned to ‘inner seclusion’ and an endless fight to the death. Human races, Hitler was convinced, were like species. The highest races were still evolving from the lower, which meant that interbreeding was possible but sinful. Races should behave like species, like mating with like and seeking to kill unlike. This for Hitler was a law, the law of racial struggle, as certain as the law of gravity.”


There would be no end to this struggle, and the cognizance of its necessity should motivate racial policies.  Racial policies and political policies were identical.  As Snyder puts it:

“The ceaseless strife of races was not an element of life, but its essence.”

“Hitler entitled his book Mein Kampf—My Struggle. From those two words through two long volumes and two decades of political life, he was endlessly narcissistic, pitilessly consistent, and exuberantly nihilistic where others were not.”

To Hitler, the Jews were not so much an inferior race to be disposed of as not a race at all by his definition.  Having no turf to defend or to extend, they intermingled with many of the races and introduced words and ideas about the universality of mankind, suggesting that all men could learn to live together in peace.  This is the exact opposite of Hitler’s philosophy, and thus the Jews constituted a threat to Hitler.  To the degree that Germans accepted the Jewish view, it weakened Hitler’s hold over them and threatened his goals of conquest.

“Hitler saw the species as divided into races, but denied that the Jews were one. Jews were not a lower or a higher race, but a nonrace, or a counterrace. Races followed nature and fought for land and food, whereas Jews followed the alien logic of ‘un-nature.’ They resisted nature’s basic imperative by refusing to be satisfied by the conquest of a certain habitat, and they persuaded others to behave similarly. They insisted on dominating the entire planet and its peoples, and for this purpose invented general ideas that draw the races away from the natural struggle. The planet had nothing to offer except blood and soil, and yet Jews uncannily generated concepts that allowed the world to be seen less as an ecological trap and more as a human order.”

“Hitler’s basic critique was not the usual one that human beings were good but had been corrupted by an overly Jewish civilization. It was rather that humans were animals and that any exercise of ethical deliberation was in itself a sign of Jewish corruption. The very attempt to set a universal ideal and strain toward it was precisely what was hateful. Heinrich Himmler, Hitler’s most important deputy, did not follow every twist of Hitler’s thinking, but he grasped its conclusion: ethics as such was the error; the only morality was fidelity to race.”

If the master race was to be spared Jewish corruption the only solution was to remove Jews and their influence from society.  There seems to be some disagreement over whether extermination was his initial plan.  He talked of gathering Jews and removing them to some remote location (Siberia?), but extermination was the eventual decision.

Hitler’s plan for protecting the future of the master race depended on a belief that food and resources were limited.  The land could only provide so much and if a race needed more they must take it from another race.  The fundamental driver behind his plans for aggression was his belief that the German people would not long be able to feed themselves without capturing new territory.  Hitler’s gaze was always turned to the east where he saw races easy to defeat and rich agricultural land.  Attacking Poland and Russia was unavoidable.

Hitler’s idea of warfare was not just an attempt to control land; it was also necessary to defeat another race and, if possible, eliminate it.  His plans for mass murder far exceeded the number of dead that so horrifies us to this day.

“A war of simple conquest, no matter how devastatingly triumphant, could never suffice. In addition to starving inferior races and taking their land, Germans needed to simultaneously defeat the Jews, whose global power and insidious universalism would undermine any such healthy racial campaign.”

Hitler believed that technology would never be able to increase food production in Germany to the extent needed.  Food security thus was the justification he used for waging war and mass extermination.  While Hitler and his philosophy may never be duplicated, food security has always been a worry and a source of conflict.  Global climate change will affect food production, likely diminishing it in total and causing major redistributions in agricultural productivity.  This brings us to Snyder’s warning about “the next genocide.”

“The Holocaust may seem a distant horror whose lessons have already been learned. But sadly, the anxieties of our own era could once again give rise to scapegoats and imagined enemies, while contemporary environmental stresses could encourage new variations on Hitler’s ideas, especially in countries anxious about feeding their growing populations or maintaining a rising standard of living.”

Many countries are unable to feed their populations without importing food.  Food scarcity leads to higher prices.  In many of the poorer nations food is a large fraction of a family’s budget, and price increases lead to political unrest.  Some attribute the “Arab Spring” to widespread concerns in the Middle East about food security at the time.

Snyder points out that Africa is the only location where there still exists a significant amount of underutilized arable land.  The fact that it is underutilized does not mean it isn’t providing food for the local populations, but it does mean the land is a tempting target for wealthy countries in need of more food.  Many have already closed deals with African nations to work land currently being used by local Africans to produce food to be shipped back home—a form of agricultural imperialism.

Snyder suggests China as a country that is incapable of feeding its population and could eventually succumb to “ecological panic.”

“The Chinese government must balance a not-so-distant history of starving its own population with today’s promise of ever-increasing prosperity — all while confronting increasingly unfavorable environmental conditions. The danger is not that the Chinese might actually starve to death in the near future, any more than Germans would have during the 1930s. The risk is that a developed country able to project military power could, like Hitler’s Germany, fall into ecological panic, and take drastic steps to protect its existing standard of living.”

“How might such a scenario unfold? China is already leasing a tenth of Ukraine’s arable soil, and buying up food whenever global supplies tighten. During the drought of 2010, Chinese panic buying helped bring bread riots and revolution to the Middle East. The Chinese leadership already regards Africa as a long-term source of food. Although many Africans themselves still go hungry, their continent holds about half of the world’s untilled arable land. Like China, the United Arab Emirates and South Korea are interested in Sudan’s fertile regions — and they have been joined by Japan, Qatar and Saudi Arabia in efforts to buy or lease land throughout Africa.”

“Nations in need of land would likely begin with tactfully negotiated leases or purchases; but under conditions of stress or acute need, such agrarian export zones could become fortified colonies, requiring or attracting violence.”

Snyder leaves us with this warning:

“It is not difficult to imagine ethnic mass murder in Africa, which has already happened; or the triumph of a violent totalitarian strain of Islamism in the parched Middle East; or a Chinese play for resources in Africa or Russia or Eastern Europe that involves removing the people already living there; or a growing global ecological panic if America abandons climate science or the European Union falls apart.”


Tuesday, September 15, 2015

Do We Have More Science and Engineering Graduates Than We Need?

Andrew Hacker seems to have a problem with technical education.  He once famously—or infamously if you prefer—wrote an article criticizing the imposition of mathematics on students who find it difficult.  In a New York Times article, Is Algebra Necessary?, he took educators, parents, and the rest of the world to task for making algebra a required course when it was unnecessary for the vast majority of people and caused numerous people to fail who would have done just fine without it.

“A typical American school day finds some six million high school students and two million college freshmen struggling with algebra. In both high school and college, all too many students are expected to fail. Why do we subject American students to this ordeal? I’ve found myself moving toward the strong view that we shouldn’t.”

“This debate matters. Making mathematics mandatory prevents us from discovering and developing young talent. In the interest of maintaining rigor, we’re actually depleting our pool of brainpower. I say this as a writer and social scientist whose work relies heavily on the use of numbers. My aim is not to spare students from a difficult subject, but to call attention to the real problems we are causing by misdirecting precious resources.”

“The toll mathematics takes begins early. To our nation’s shame, one in four ninth graders fail to finish high school. In South Carolina, 34 percent fell away in 2008-9, according to national data released last year; for Nevada, it was 45 percent. Most of the educators I’ve talked with cite algebra as the major academic reason.”

Earlier, he coauthored a book with Claudia Dreifus, Higher Education?: How Colleges Are Wasting Our Money and Failing Our Kids—and What We Can Do About It.  The book contained a number of noteworthy observations and several interesting recommendations.  However, it also incorporated a rather curious view of what higher education should consist of.

“The very phrase liberal arts induces hushed respect.  In an era when bachelor’s degrees are awarded in sports management and fashion merchandising, hearing of students who are majoring in philosophy and history still evokes our esteem.  The liberal arts may be viewed as a classical education and an intellectual adventure, as learning for its own sake and pursuing the life of the mind.  The most admired purlieus of higher education are its liberal arts colleges, which generally enroll only undergraduates and eschew vocational programs.”

They then take this ancient and highly suspect opinion and couple it with this sage advice.

“We’d even hazard that twenty-one or twenty-two, the usual ages of college graduation, may be too early to decide that dentistry or resource management is the career for you.  The best thing after receiving a bachelor’s degree may be to find a semi-skilled job—Best Buy or Old Navy?—and keep your eyes and mind open about career choices.”

And which are these lesser schools that have refused to “eschew vocational programs”?  They would include places like MIT, which actually stoops to training people to labor as engineers.

“In fact, vocational training has long been entrenched in America’s colleges.  Even now, more students at MIT major in engineering than the sciences.”

So—MIT is a vocational school, and science is an art while engineering is a mere craft.  How quaint.  Hacker would prefer that students who want to learn how to build the new bridges and dams that we need, or the smarter, more efficient electrical grid our society demands, forgo that for four years (plus two years at Best Buy) and spend their time learning why ancient philosophers didn’t know what they were talking about.

Hacker has returned to address technical education again in an article published in The New York Review of Books: The Frenzy About High-Tech Talent.  His concern here is that there is a persistent public clamor demanding that we produce more graduates in the fields of science, technology, engineering, and mathematics (STEM) or else dire consequences will follow.

“Pronouncements like the following have become common currency: ‘The United States is falling behind in a global “race for talent” that will determine the country’s future prosperity, power, and security’.”

Hacker allows that there are a number of indicators that can be used to support such conclusions.

“The United States ranks 27th among developed nations in the proportion of college students receiving undergraduate degrees in science or engineering.”

He then claims that the data on employment of STEM graduates is inconsistent with the presumed high demand.  There are already more graduates being produced than the number of jobs available.

“A 2014 study by the National Science Board found that of 19.5 million holders of degrees in science, technology, engineering, and mathematics, only 5.4 million were working in those fields, and a good question is what they do instead.”

The source of that data can be found here.  This is what the report actually says:

“In 2010, estimates of the size of the U.S. S&E workforce ranged from approximately 5 million to more than 19 million depending on the definition used.”

“The application of S&E knowledge and skills is widespread across the U.S. economy and not just limited to S&E occupations.  The number of college-educated individuals reporting that their jobs require at least a bachelor’s degree level of technical expertise in one or more S&E fields (16.5 million) is significantly higher than the number in occupations with formal S&E titles (5.4 million).”

The latter statement is quite inconsistent with the conclusion drawn by Hacker.

Hacker turned to another source for corroborating data.

“The Center for Economic Policy and Research, tracing graduates from 2010 through 2014, discovered that 28 percent of engineers and 38 percent of computer scientists were either unemployed or holding jobs that did not need their training.”

That data was contained in the report A College Degree is No Guarantee by Janelle Jones and John Schmitt.  These authors were studying the differences in educational outcomes for black degree holders as compared to the population as a whole.  In the process they produced some interesting data on what happens to recent college graduates.  Consider the chart below from which Hacker presumably extracted his numbers.



The period studied was 2010-2012, and the criterion was a job requiring a degree of any kind versus one which did not require a degree.  Engineering is actually at the top of the list in terms of placing graduates in degree-requiring jobs.  If we were to accept the engineering data as indicative of a field producing too many graduates, what should we conclude about Hacker’s beloved liberal arts degree—or any college degree at all?

Hacker also uses Bureau of Labor Statistics (BLS) projections of job demands over the period 2012-2022 to point out that STEM occupations are mostly expected to grow more slowly in number than employment as a whole.  But what exactly does that mean?  BLS defines mechanical engineering, for example, as a field requiring a bachelor of science (BS).  As in most fields as they mature, entry-level skills are being computerized while the requirement of an advanced degree grows.  Fewer, but more highly trained, people are in demand.  The importance of expertise in a field may be growing while the number of jobs is decreasing.  What does one do with that insight?  Can one ensure a required number of doctorates will be produced by discouraging students from entering the field at the BS level?

The number of STEM-designated jobs required may be an ill-defined quantity, but doesn’t the same concern apply to all fields of study?  Couldn’t one argue that there is a glut of liberal arts majors so someone should write an article discouraging that pursuit?  STEM-related degrees appear to be better than most at preparing students for an uncertain future—so quit complaining.


Tuesday, September 8, 2015

Bonobos, Christians, Scandinavians, Atheists, and Government

There is a tenet propagated by many religious believers that religion is what separates us from animals.  It is our religious beliefs that provide the moral rules that have allowed our civilization to develop.  This notion that rules of human behavior have been handed down to us from above has aroused Frans de Waal to come to the defense of both the animals and humans, pointing out that both are better developed morally than they are given credit for.  He is a primatologist, a professor at Emory University, and director of the Living Links Center at the Yerkes National Primate Research Center in Atlanta, Georgia.  His arguments are presented in his book The Bonobo and the Atheist: In Search of Humanism Among the Primates.

The bonobo is, from the human perspective, a lesser-known, less abundant species of chimpanzee.  An anthropologist from another planet would categorize humans, chimpanzees, and bonobos as three species of ape.  It is the view of this alien anthropologist that de Waal wishes us to consider.

From the human perspective, we split off from the chimpanzee line about six million years ago.  The bonobos split off that same line about one million years ago.  The bonobos were geographically isolated from the better-known common chimpanzee and evolved differently.  Their body structure is more like that of a human, and the society they developed is as different from that of the common chimp as human society is.  What de Waal sets out to demonstrate is that these three quite different apes all possess common traits that are necessary for animals to coexist in groups.  These are traits that we would associate with pro-social behavior, as opposed to anti-social behavior.  These pro-social behaviors are equivalent to living according to a moral code.

 The claim then can be made that it is not religion that provides us with the moral precepts necessary for civilization; rather, it is the innate set of traits that millions of years of evolution have provided us that forms the base upon which religion was able to evolve.

The common chimps, being more numerous and more extensively studied, probably provided sufficient data for de Waal to prove his thesis.  The existence of data for another species can only strengthen his case, but one suspects that bonobos are just so damned interesting that he couldn’t resist focusing on them.

 Bonobos differ from the common chimp by having a matriarchal society.  A group will possess an alpha female.  There will also be a dominance hierarchy for males, but their prestige and authority are derived from their mothers.  One observer described the dominant male as a general who gets his orders from the female prime minister.

Bonobos are much more peaceful than common chimps.  This moved an observer to claim that if two unfamiliar bands of bonobos were to wander into each other’s path, it is more likely that an orgy would break out than violence.  One of the more interesting traits of bonobos is the way they use sex in social interactions.  It is a way of relaxing the tension associated with meeting a stranger; it is a way of bonding with another male or female; it is a way of expressing gratitude or sympathy; it is a way to have fun.

From de Waal we are provided this perspective:

“Even if becoming a primate sexologist was never my goal, it was an inevitable consequence.  I have seen them do it in all positions one can imagine, and even in some we find hard to imagine (such as upside down, hanging by their feet).  The most significant point about bonobo sex is how utterly casual it is….We use our hands in greetings, such as when we shake hands or pat each other on the back, while bonobos engage in ‘genital handshakes.’  Their sex is remarkably short, counted in seconds rather than in minutes.  We associate intercourse with reproduction and desire, but in the bonobo it fulfills all sorts of needs.  Gratification is by no means always the goal, and reproduction only one of its functions.  This is why all partner combinations engage in it.”

“The contrast with their fellow species is striking.  Most observed chimp killings take place during territorial disputes, whereas bonobos engage in sex at their boundaries.  They can be unfriendly to neighbors, but as soon as a confrontation has begun, females have been seen rushing to the other side to copulate with males or mount other females.  Since it is hard to have sex and wage war at the same time, the scene rapidly turns into socializing.  It ends with adults from different groups grooming each other while their children play.”

While not all bonobo interactions end with this degree of congeniality, violent interactions are rare.

Bonobos provide an interesting alternative to the popular assumption that human development is characterized by the need to resort to violence in order to attain dominance in intra-group activities as well as in conflicts between groups.  This seems reasonable given the warfare that has existed since history came to be recorded.  However, that period is just a brief instant in terms of human evolution and was driven by enormous changes in how humans lived.  The relatively peaceful bonobo society could just as easily be the better paradigm for early human societies.

“I welcome bonobos precisely because the contrast with chimpanzees enriches our view of human evolution.  They show that our lineage is marked not just by male dominance and xenophobia but also by a love of harmony and sensitivity to others.  Since evolution occurs through both the male and female lineage, there is no reason to measure human progress purely by how many battles our men have won against other hominins.  Attention to the female side of the story would not hurt, nor would attention to sex.  For all we know, we did not conquer other groups, but bred them out of existence through love rather than war.  Modern humans carry Neanderthal DNA, and I wouldn’t be surprised if we carry other hominin genes as well.  Viewed in this light, the bonobo way doesn’t seem so alien.”

Psychologists have been experimenting on humans to discover innate tendencies and have discovered that people are inherently pro-social.  The default option is to be considerate of others and to avoid hurting others unless they themselves have in some way been mistreated and feel free to retaliate.  This type of behavior is what is necessary if people are to exist in societies.  Much of de Waal’s book is devoted to providing evidence that this behavior also exists in chimps, bonobos, and other mammals.  Furthermore, it exists in infants of the various species, so it is an innate attribute, not one acquired culturally.  There is an obvious evolutionary explanation why this behavior would have been selected.  For much of our existence life was very difficult, and survival depended on the protection and sustenance that could be obtained as a member of a cooperative group.  The alternative to living within such a group was often death.

In de Waal’s view, these pro-social tendencies were sufficient in a small group to keep members behaving in accordance with the social rules.  Everyone would be aware of any transgression, and punishment was near certain.  There was no need for any imposition of a moral code.  Thus, early humans who lived in such small groups had no need for a religion to define morality and preserve rules.  Religion would come later as humans developed larger and more complex societies where some explicit mechanism was required to define and demand appropriate behaviors.  As societies evolved, religion evolved with them in order to continue to maintain control.

As an atheist, de Waal does not have to believe any of the claims of religious authenticity in order to view religion as a positive factor in society.  Why, then, should he feel a need to argue about religious authenticity, especially when there is no way to prove who is right or wrong?  He has little sympathy for those aggressive atheists who seem to feel a need to wage war against religion.

“True, if being a self-declared atheist carries a stigma, as it unfortunately does in this nation, frustration is understandable.  Hatred breeds hatred, which is why some atheists rail against religion and talk as if its disappearance will be a huge relief.  Never mind that religion is too deeply ingrained for it to ever be eliminated, and that historical attempts to do so by force have brought nothing but misery.  Perhaps it can be done slowly and gently instead, but that would require us to appreciate and value our religious heritage at least to some degree, even if we regard it as outdated.”

Curiously, he argues that the moral behavior imposed by religion is necessary for society to function, but then proceeds to point to Scandinavia, where there is a religious history but essentially no impact of religion on political life, as a model “being tried out” and as a “grand experiment.”

“Civic institutions have taken over many of the functions originally fulfilled by the churches, such as care for the sick, poor and old.  Despite being largely agnostic or nonpracticing, the citizenry of these countries stands firmly behind this effort.  It is a giant experiment, both economically and morally, that may tell us whether large nation-states can forge a well-functioning moral contract without religion.”

The Scandinavian experiment has been in place for generations and has produced some of the most stable and successful nations on Earth.  In fact, a look at the list of OECD nations, the most successful nations in terms of wealth, reveals only a few that could be considered to have societies whose actions are influenced by representatives of religions.  Most are stable, secular societies.  Many have levels of religious participation that are almost as low as in the Scandinavian countries.  In fact, many countries in Europe once had state-sponsored religions that possessed great political power.  It’s as if those countries have declared that that will never happen again and are aggressively secular.  The inherent pro-social tendencies of which de Waal is so fond, combined with secular enforcement of secular rules and regulations, work just fine in a modern society.

Historically, when groups of humans grew to significant size religion and political power were closely tied together.  Often rulers maintained their power by claims of divinity or by having been anointed by the god or gods.  While religion could be used to impose moral precepts, it could also be used to levy taxes, wage war, discriminate against groups of people (and other religions), and to exact punishments. 

It is unreasonable for de Waal to imply that religion and religious leaders are responsible for the good things in society and not for the bad things humans have done.  The iron rule of religion is that it will try to impose its beliefs on others.  This has led to uncountable religious wars, mass murder, and mass destruction.

The United States, arising late in history, followed a different path.  When it was formed it had four significant religions—four versions of Christianity, with each despising the other three.  There could be no government-related religion under those circumstances.  Because the nation has never experienced the disaster of a theocracy, there are many who wish to see one established based on Christian law.

Consider surveys performed by David E. Campbell and Robert D. Putnam.  In a New York Times article titled Crashing the Tea Party (2011), they provided some insight into what self-proclaimed Tea Party adherents are really after.  The authors performed a study of the political attitudes of 3,000 people in 2006, before the rise of what would come to be known as the Tea Party.  They were able to go back and re-interview many of the same people several years later.  This allowed them to correlate a person’s original political beliefs with the tendency to associate with the Tea Party.

The authors find that those who would be Tea Partiers today are merely the traditionally highly conservative wing of the Republican Party. The Tea Party is not a new movement, and it was not a grass-roots development born of the Great Recession.  These people have always been there.  The only difference is that now they have greater influence in a more conservatively aligned Republican Party.

And what are Tea Partiers most interested in?

“….they were disproportionately social conservatives in 2006 — opposing abortion, for example — and still are today. Next to being a Republican, the strongest predictor of being a Tea Party supporter today was a desire, back in 2006, to see religion play a prominent role in politics. And Tea Partiers continue to hold these views: they seek ‘deeply religious’ elected officials, approve of religious leaders’ engaging in politics and want religion brought into political debates. The Tea Party’s generals may say their overriding concern is a smaller government, but not their rank and file, who are more concerned about putting God in government.”

Those who are against putting God in government must fight back against these people who would impose their belief system on others.  Fortunately, history provides them with plenty of ammunition.  This issue is too important to worry about being polite in dealing with one’s adversaries.

The view of the role of religion in history assumed by de Waal is dreadfully naïve.  One need not look far to note examples of tragedies that follow from mixing religion and politics.  There are very good reasons why atheists might want to trash religion.


Tuesday, September 1, 2015

The Ambiguity of Pain and Its Consequences: Discrimination, Oligoanalgesia, and Addiction

We tend to think of pain as a quantity with a cause that can be identified and a level of intensity that can be characterized.  However, Joanna Bourke, a British historian, tells a different story.  In her book, The Story of Pain: From Prayer to Painkillers, she convinces us that pain is a much more elusive entity.

“….pain describes the way we experience something not what is experienced.  It is a manner of feeling….Crucially, pain is not an intrinsic quality of raw sensation; it is a way of perceiving an experience.”

It is well known that expression of pain is a learned response that varies from one culture to another, from one gender to another, and from one generation to another.  It is also well known that extreme physical injuries can be endured with little or no pain, and that extreme pain can be experienced with no discernible cause.

There is no way of measuring pain directly.  Consequently, a physician must depend on the description of the pain by the patient or draw her own conclusions based on evaluating the signals of pain being expressed by the patient.  Given that high level of uncertainty, it is not surprising that physicians and nurses have treated patients suffering from pain in ways that were affected by biases and baseless assumptions.  Treatment was accorded to patients based on race, ethnicity, gender, and age.  What is surprising is that biases and baseless presumptions have not disappeared as medical knowledge has accumulated; instead, many have persisted into modern times.

Bourke’s study is limited to the English-speaking countries and is thus dominated by Britain and the United States.  Not unexpectedly, when matters of race or ethnicity arose, Anglo-Saxons always seemed to be identified as the ideal “race.”  Since pain was to them such a real and compelling sensation, they assumed that their superior intellectual development rendered them more sensitive to all stimuli.  People the Anglo-Saxons didn’t respect (everyone else except a few Northern Europeans) were generally assumed to be incapable of experiencing pain at the same level because of inferior physical and mental development.  Jews weren’t well liked, but they couldn’t conveniently be classified as mentally inferior.  Instead, it was assumed that Jews were hypersensitive to pain and unable to control their response to it because they were culturally or racially deprived of the fortitude provided by an Anglo-Saxon heritage.

These beliefs had a great deal of influence on the manner in which people were treated over the centuries.

“….slaves, ‘savages’, and dark-skinned people generally were routinely depicted in Anglo-American texts as possessing a limited capacity to truly feel, a biological ‘fact’ that conveniently diminished any culpability amongst their so-called superiors for acts of abuse inflicted upon them.”

These beliefs about the inferiority of the people the British wished to conquer and colonize were extremely convenient.  In the context of medical treatment, this led to less attention being paid to the pain suffered by certain classes than to others.

Gail Collins, in America's Women: 400 Years of Dolls, Drudges, Helpmates, and Heroines, provides a startling example of how physicians put this presumed insensitivity to use.  One horrible tragedy that could befall women in giving birth was referred to as a vesico-vaginal fistula. 

“During childbirth, the wall between their vagina and the bladder or rectum ripped, leaving them unable to control the leakage of urine or feces through the vagina.  The condition had been recognized for centuries, but some historians believe that it increased when doctors began delivering babies [replacing midwives] and inserting their instruments into the womb.”

A surgeon, J. Marion Sims, assumed the task of trying to surgically repair this devastating condition in 1845.

“J. Marion Sims, an Alabama physician, devised an operation that successfully closed the fistulas and let these tormented women resume their lives.  But the discovery came at a horrifying cost...He experimented with surgical techniques while the [slave] women balanced on their knees and elbows, in order to give them a better view of what he was doing....Four years later he finally succeeded in repairing the fistula of a slave named Anarcha....It was Anarcha’s thirtieth operation, all of them performed without anesthetics.....Sims claimed that the women had begged him to keep trying his experiments and it’s possible that was true....But they were still slaves with no real option to say no, and Sims chose to work on them in part because he believed white women could not endure the kind of pain he was inflicting.”

As Bourke makes clear, this assumption about insensitivity to pain by certain classes of people is not something relegated to the deep, dark past.

“….in the words of a gynaecologist in 1928, forceps were rarely needed when ‘colored women’ were giving birth because ‘their lessened sensitivity to pain makes them slower to demand relief than white women’.”

“From the 1980s onwards, surveys showed that minority patients being treated for pain associated with metastatic cancer were twice as likely as non-minority patients to be given inadequate pain management.  Even after major operations, certain patients, Chinese for example, were likely to be given less pain relief than white patients, in part because of assumptions that they had a higher threshold for tolerating pain.  In a study of people treated for long-bone fractures at the UCLA Emergency Medicine Center in Los Angeles in the 1990s, Hispanics were twice as likely as non-Hispanic whites to receive no medication for pain.”

Women were subject to much analysis by the males who created conventional wisdom.  While these men continued to refer to women as “the weaker sex,” they generally acknowledged that women were better able to endure pain than men.  Some of the justifications for this assumption were more interesting than others.

“In one particularly pessimistic account in 1910, women’s resilience was simply ascribed to their ‘long practice in suffering the blows of the male’.”

A more politically correct and more compelling view held that since women were assigned the unavoidable task of delivering children, nature, or God, must have provided them with the wherewithal to deal with the associated pain.

Women’s reward for actually being the stronger sex (regarding pain) was to be continually provided with less pain relief than men might receive in similar circumstances.  Bourke cites studies that indicate this pattern has persisted into the current century.

Animals and, incredibly, infants suffered due to constantly changing biases and assumptions.  As noted at the beginning, physicians cannot measure pain; they can only guess at it based on the claims or actions of the patient.  Animals and newborn infants are unable to provide the signals a physician is capable of recognizing as true pain.  Therefore, they were left to the mercy of guesses and assumptions.

“Indeed, one of the main reasons why some scientists regarded it as legitimate to vivisect animals was because they did not behave as if they felt pain ‘to such an extent as the human species’, as one commentator put it in the 1920s.”

Eventually, as animals were used more and more as surrogates for humans in the testing of medicines and medical procedures, it became necessary to learn more about animal pain, and in fact to quantify it.

The history of infants and pain is discussed in Infants, Fetuses, and the Complicated Issue of Pain.  Towards the end of the nineteenth century physicians began to assume that infants felt little if any pain.  Prior to that, they had assumed that infants were extremely sensitive to pain.  The net result was that in a period when anesthetics and analgesics were readily available, infants were not likely to receive much consideration when it came to pain.

“The author of Modern surgical Technique (1938), for instance, claimed that ‘often no anesthetic is required’, when operating on young infants: indeed ‘a sucker consisting of a sponge dipped in some sugar water will often suffice to calm the baby’.”

Around 1980, the consensus changed again and concluded that infants did in fact experience pain.  However, that does not mean that infants were destined to be coddled.

“A study in 1995 revealed that pre-term infants were subjected to an average of sixty-one painful procedures while in the neonatal intensive care unit.  In another study, published in 1987, neonates were subjected to about three invasive procedures an hour.  In addition, neonates were actually less likely to be given analgesia than older children.”

As we have noted, there were classes of people who systematically received less pain care than others: infants, women, minorities, and poor people.  It is important to realize that this discrimination existed within a system in which inadequate levels of pain relief were being delivered to all.  The medical community has recognized this as a problem and has even created a term to describe the undertreatment of pain: oligoanalgesia.

Bourke suggests several reasons why there might be a reluctance to administer sufficient pain treatment.  One explanation is lack of training: pain management receives little attention in the education of most doctors and nurses.  Another is the development in the medical community of a culture that declares its goal to be pain reduction, not pain elimination.  If the patient is tolerating pain, then what is the problem?  A third explanation rests on a fear of overmedicating patients.  Many pain suppressors are potentially addictive, and both patient and physician can suffer if addiction should develop.

This latter issue puts physicians in a difficult situation: they are being told, at the same time, to be careful not to undermedicate and to be careful not to overmedicate.  Danielle Ofri provided an excellent example of how difficult this can be in practice in a New York Times article, The Pain Medication Conundrum.

It seems that after centuries of medical advancement, doctors and nurses are still reduced to listening to patient complaints, evaluating their expressions of pain, and guessing as to what the level of pain might actually be and how they should respond.
