Sunday, December 26, 2021

Cultural Polarization and the Future of the United States: Schismogenesis

It is a wondrous day when one encounters a book that provides a new and totally refreshing outlook on us human beings and our history.  Such is the impact of The Dawn of Everything: A New History of Humanity by David Graeber and David Wengrow.  The authors argue that the conventional wisdom on human history has been based mostly on assumptions and guesswork, but recent archeological research now allows a more defensible picture of what humans have been up to over the last twenty or thirty thousand years.  As a result, we can congratulate ourselves on having been a much more interesting and clever species than previously thought.  Here we will focus not on the overall findings of the authors’ work, but on one particular type of social interaction between societies that they identify as part of our historical past.

We are introduced to the concept of schismogenesis:

“Back in the 1930s, the anthropologist Gregory Bateson coined the term ‘schismogenesis’ to describe people’s tendency to define themselves against one another.  Imagine two people getting into an argument about some minor political disagreement but, after an hour, ending up taking positions so intransigent that they find themselves on completely opposite sides of some ideological divide—even taking extreme positions they would never embrace under ordinary circumstances, just to show how much they completely reject the other’s points.”

We know this can happen with individuals.  What is of greater interest is the notion that the same sort of response can take place between societies.

“Bateson was interested in psychological processes within societies, but there’s every reason to believe something similar happens between societies as well.  People come to define themselves against their neighbours.  Urbanites thus become more urbane, as barbarians become more barbarous.  If ‘national character’ can really be said to exist, it can only be as a result of such schismogenetic processes: English people trying to become as little as possible like French, French people as little like Germans, and so on.  If nothing else, they will all definitely exaggerate their differences in arguing with one another.”

The historical record suggests that this response is innate.

“…what is it that causes human beings to spend so much effort trying to demonstrate that they are different from their neighbors?  Recall how, after the end of the last Ice Age, the archeological record is increasingly characterized by ‘culture areas’; that is, localized populations with their own characteristic styles of clothing, cooking and architecture; and no doubt also their own stories about the origin of the universe, rules for the marriage of cousins, and so forth.  Ever since Mesolithic times, the broad tendency has been for human beings to further subdivide, coming up with endless new ways to distinguish themselves from their neighbors.” 

It seems our ancestors were quite willing to travel great distances, and capable of doing so.  They would have been aware of what was going on in nearby regions.  Differences between neighboring societies, then, would not be due to isolation; rather, they developed by choice.  Information and technology might propagate, but culture did so much less readily.  In fact, cultural comparison leads to the conclusion that societies are best understood by observing the cultural attributes they reject.  The authors attribute this viewpoint to Marcel Mauss.

“For if everyone was broadly aware of what surrounding people were up to, and knowledge of foreign customs, arts and technologies was widespread, or at least easily available, then the question becomes not why certain culture traits spread, but why other culture traits didn’t.  The answer, Mauss felt, is that this is precisely how cultures define themselves against their neighbors.  Cultures were, effectively, structures of refusal.”

The authors provided much material to support this critical aspect of human history.  For our purposes, the characteristics of the ancient city-states of Sparta and Athens provide the most accessible example.

“Schismogenesis, you’ll recall, describes how societies in contact with each other end up joined within a common system of differences, even as they attempt to distinguish themselves from one another.  Perhaps the classic historical example (in both senses of the term ‘classic’) would be the ancient Greek city-states of Athens and Sparta, in the fifth century BC.  As Marshall Sahlins puts it:

“Dynamically interconnected, they are then reciprocally constituted…Athens was to Sparta as sea to land, cosmopolitan to xenophobic, commercial to autarkic, luxurious to frugal, democratic to oligarchic, urban to villageois, autochthonous to immigrant, logomanic to laconic: one cannot finish enumerating the dichotomies…Athens and Sparta were antitypes.”

“Each society performs a mirror image of the other.  In doing so, it becomes an indispensable alter ego, the necessary and ever-present example of what one should never wish to be.” 

Can one think of a present example of two interacting societies that seem to be defining themselves according to the belief that the other represents an “ever-present example of what one should never wish to be”?  Unfortunately, the example that comes to mind is the polarization that has occurred between red and blue regions in the United States.  The color designation of these regions represents broad differences that go far beyond politics.  In the spirit of the above comparison of Athens and Sparta, here is a series of questions about the United States to which the two sides consistently provide different answers.

Were we created as a secular nation or as a Christian nation? 

Do we have a duty to aid the least among us or are the least among us deserving of their fate?

Is education a fundamental right for all or is it a commodity to be determined by market forces? 

Is healthcare a fundamental right for all or is it a commodity to be determined by market forces? 

Do people of color suffer from discrimination or do whites suffer from discrimination?

Should global warming be addressed, or should it be ignored?

Is abortion a woman’s right or should it be forbidden?

Is our governance based on majority rule, or is it not?

To use the phrasing of Marshall Sahlins, “one cannot finish enumerating the dichotomies.”  Each side views the other as an “ever-present example of what one should never wish to be.”  Schismogenesis seems to be at work as each side, when in power, goads the other by promoting policies the other deems highly offensive.  At least one side is arming for and threatening violence, and floating the term “secession.”  And the divide continues to grow.  Will it reach a point of no return?  If so, what comes next?  We have had one bloody Civil War already.  History provides no encouragement.  Athens and Sparta waged war on each other.  Nations with irreconcilably mixed populations, such as India, subdivided and separated, but only after millions were murdered.  If it is believed there is no path to reconciliation, then each side will strive to become the victor in whatever form of conflict follows.

Somehow the escalating polarization must be moderated.  People have often claimed that a serious threat to our nation would force us to come together and collaborate.  We now have two serious threats: the coronavirus pandemic and climate change.  Neither has helped, because neither is generally viewed as sufficiently serious—yet.  The pandemic could get much worse; climate change will get much worse.  How bad must it get before the pain suppresses the animosity?  Will that happen before the nation comes unglued?

Thursday, December 16, 2021

Health Sciences and the Accuracy Crisis

 A recent issue of the London Review of Books provided an article by John Whitfield titled Replication Crisis.  It reviewed the book Science Fictions: Exposing Fraud, Bias, Negligence, and Hype in Science by Stuart Ritchie.  In this pandemic/climate change era when we are constantly demanding that people “follow the science,” we should be concerned about a book with such a title.

The replication crisis in Whitfield’s title refers to the fact that scientists in critical fields are churning out papers reporting scientific findings that other researchers are unable to reproduce.  If the results are not reproducible, how can they be considered accurate?  And what if public or medical care policy is based on a result later found to be incorrect?

Most of these studies are carried out by academics and funded by government agencies.  “Publish or perish” is the order of the day.  The pressure begins as soon as a graduate student enters a doctoral track.  The student must demonstrate the ability to produce publishable research to obtain a doctoral degree, and his/her professor must demonstrate the ability to produce students capable of such work.  Getting results published in a prestigious journal is how academics gain tenure at a university and is key to obtaining funding to continue performing research.  The new graduate is under continuing pressure as he/she navigates the academic world looking for employment that will hopefully lead to a permanent position somewhere.  Note that professors can produce many doctoral students over their careers, many more than the system can absorb.  The competition is intense and never really ends.

“In Science Fictions, Stuart Ritchie explores the problems with this system. The book is a useful account of ten years or more of debate, mostly in specialist circles, about reproducibility: the principle that one purpose of a scientific paper is to make it possible for others to carry out the same work, and that one test of its reliability is whether they get the same result.”

“In recent decades there have been large-scale efforts at replication in several fields, but if an experiment can’t be repeated, it doesn’t necessarily mean the original work was incompetent. Work at the frontier of a discipline is difficult, and skilled hands are an underacknowledged factor in scientific success… Even so, the findings of these large-scale replication studies have helped to fuel a widespread sense that science is failing on its own terms: in cancer biology, one effort managed to replicate just six out of 53 studies; in psychology about 50 per cent of studies cannot be replicated; in economics, about 40 per cent. In 2016 Nature surveyed researchers across the natural sciences and found that more than half the respondents had been unable to repeat their own work, though less than a third regarded the failure as a sure sign that a study was wrong.”

This system demanding continual proof of accomplishment via journal publications provides dangerously perverse incentives at numerous points.

“At one end of the replication crisis, as it has become known, there are spectacular frauds. In the early 2000s the South Korean biologist Hwang Woo-suk became a national hero for cloning human stem cells; just a few years earlier, the materials scientist Jan Hendrik Schön was being tipped for a Nobel Prize for papers describing molecular-scale electronic components. Both had made up their results. In surveys, about 2 per cent of researchers admit to fabricating data, though many more suspect their colleagues of doing so. But deliberate malpractice probably accounts for only a small portion of unreliable science. The greater concern is that the rush to publish and the pressure to make a splash pushes researchers to take short cuts and dodges: low-level fiddles that stop short of fraud but undermine reliability.”

“The worry is that scientific processes have been undermined by perverse incentives to the point that it’s difficult to know what to believe. The crisis has hit psychology, Ritchie’s own discipline, and biomedicine especially hard. These are crowded, competitive fields, in which research groups around the world are racing one another to publish on the hottest topics. In these circumstances, haste can win out over care. The data in these fields tends to be noisy, leaving room for interpretation and manipulation in presentation and analysis, and psychologists and biologists tend to be less mathematically expert than their colleagues in the physical sciences.”

There are also financial incentives driving the need to publish.  Professors’ salaries are generally merit based with the proof of merit demonstrated by their publication lists.  In addition, there are opportunities for more direct forms of compensation.

“There can be financial incentives too…many Chinese universities were giving cash bonuses for publications, with higher-impact journals securing bigger rewards for researchers. In a survey of Chinese university policy in 2016, the average bonus for the lead author of a paper in Nature or Science was calculated at $44,000, five times the average professorial salary.”

In no medical arena is the production of false results more egregious than in the pharmaceutical industry, where biased research and unethical and illegal activities have become common business practices.  These activities are not new developments.  About ten years ago they were discussed in the articles Lies, Damned Lies, and Medical Science; Medical Science and the Vanishing Truth; and Drug Companies: Even More Corrupt Than Financial Institutions?.  A recent article by Brian Buntz, GSK, Pfizer and J&J among the most-fined drug companies, according to study, indicates that these practices continue.  Rather than sending criminals to jail, regulators merely slap the hands of large, powerful drug companies, assessing fines that are negligible compared to the profits gained, provided the companies promise to behave better in the future.

“GlaxoSmithKline (LON: GSK) paid nearly $10 billion in inflation-adjusted financial penalties between January 2003 and December 2016, the highest tally for any drug company, according to research published in JAMA.”

“Pfizer (NYSE: PFE) was next in line with almost $3 billion in fines.”

“Johnson & Johnson (NYSE: JNJ) came in the third slot with $2.7 billion in penalties.”

“In all, the 26 pharmaceutical companies paid some $33 billion in fines during the 13-year period. The top 11 alone accounted for $28.8 billion, or 88%, of the total.”

“’The pharmaceutical industry is unique in that all large pharmaceutical firms explicitly state that they are focused on promoting patient welfare, yet the majority of large pharmaceutical firms engage in illegal activities that harm patient welfare,’ said Denis G. Arnold, a coauthor of the study and a professor at the University of North Carolina at Charlotte.”

It is interesting, and perhaps a bit scary, that two of the companies on which we have depended for Covid vaccines are among the worst offenders.

Scientific research can produce extraordinary results, but shoddy and inaccurate findings can undermine the general trust in science.  Between Covid and climate change, we are in a position where we must follow the science and the science must be correct.

 

Wednesday, November 24, 2021

Women’s Economic Progress and the Legacy of Patriarchy

The Economist’s “Free Exchange” columnist provided an interesting column on the economic progress women have made: Do “greedy jobs” cause the gender pay gap?  The author discusses the conclusions of a recent book.

“A new book by Claudia Goldin of Harvard University, an expert on women and work, is a study both of American women’s choices and of the context in which they are made. “Career and Family: Women’s Century-Long Journey Toward Equity” traces the history of work and family for college-educated women, and diagnoses what still troubles their careers today.”

Goldin is said to provide a standard description of the evolution of working women over time: from single women working in female-specific jobs, to unmarried women gaining advanced education and access to traditionally male positions until marriage, to women trying to combine careers and family.  All the while, these women were underpaid for their efforts.

“Yet, despite the staggering extent of the change Ms Goldin documents, a clear gender gap still exists for these women, most notably with respect to pay. American women earn on average 20% less per hour worked. For college graduates, the gap is larger, at 26%.”

“It is at this point that the book becomes provocative. Drawing on reams of research Ms Goldin argues that most women no longer suffer much labour-market discrimination in the sense of unequal pay for equal performance, as is often claimed by the left. Nor is the gender pay gap driven primarily by women’s choice of occupation, an explanation sometimes favoured by the right.”

“The most important cause is that women curtail their careers as a part of a rational household response to labour markets, which generously reward anyone, male or female, who is willing to hold down what Ms Goldin calls a “greedy job”. These are roles, such as those in law, accountancy and finance, that demand long and unpredictable hours. Parents need somebody to be on-call at home in case a child falls ill and needs picking up from school, or needs cheering on at a concert or football match. That is incompatible with a greedy job, which requires being available for last-minute demands from a client or boss. No one person can do both. The rational response is for one parent to specialise in lucrative greedy work, and for the other—typically the mother—to prioritise the children. Ms Goldin writes that ‘couple equity has been, and will continue to be, jettisoned for increased family income’.”

This conclusion pertaining to women’s progress requires comment, not because it is incorrect, but because it demonstrates how much still needs to change.  First of all, why do we even have these “greedy” jobs?  The author cites “those in law, accountancy and finance, that demand long and unpredictable hours.”  These occupations do not inherently demand long and unpredictable hours; it is their professional cultures that have developed these conditions.  They all involve interacting with money or people who have money—lots of it.  If money is to flow or be protected, a system has been set up whereby these people must be dealt with.  They are not unlike the gangs of crooks who, in days of old, would take control of a river or highway and demand a fee for passage.  They seem so ashamed of contributing so little while making so much money that they have created a myth that what they do requires such extreme intelligence and hard work that only a few of the “elite” can possibly succeed.  Part of the support for this myth is the claim that a woman raising children and needing well-defined working hours—usually with a husband who is nearly useless—can’t compete.  It is all a scam of epic proportions that needs to collapse.

In Are Men Becoming the Second Sex?, it was noted that women compete quite well with men in education, where success demands “long and unpredictable hours” and can be measured quantitatively.  Women have long out-competed men for college bachelor’s degrees and now earn most of the advanced master’s and doctoral degrees.  Women now dominate the ranks of young medical doctors and have begun to surpass men in law degrees.  Where they are less likely to succeed is in activities controlled by cultural factors, such as politics and the high-income aristocracy, where patriarchal attitudes remain dominant.

There are many men, often with significant power and influence, who do not wish to compete with women.  They are much more comfortable relegating women to household chores, an attitude with thousands of years of cultural and religious support.  Is there a better way to drive women out of the workforce than to deny them any support when they inevitably become pregnant and must stop work to attend to the newborn?  After a few months, the mother and infant have bonded, and the infant can just as well be cared for by the father.  In fact, the infant and the family would be in better shape if the father took the time to form a strong bond with the child as well.  If one believes that men and women have equivalent cognitive capabilities on average, why is the default solution that the mother should drop her career so the father can maximize family income?  Why is that not a decision reached on the basis of each parent’s capabilities and prospects?  The idea that women should focus on childcare is a social construct imposed by tradition.  Tradition can be changed.

The Nordic countries today are little constrained by conservative Christian traditions.  Consequently, they seem to be the firmest believers in equality of opportunity for women.  In Scandinavia and Gender Equality, Sweden was used as an example of the extent to which a society can go to provide gender equality.  The Swedish government’s goal is defined as follows.

“The aim of Sweden’s gender equality policies is to ensure that women and men enjoy the same opportunities, rights and obligations in all areas of life.”

Note the inclusion of the word “obligations.”  Sweden interprets this to mean that women must not bear the entire burden of raising a child.  When a woman has a baby, she is offered not maternity leave but a very generous family leave of over a year.  This leave can be taken as needed over the first several years of the child’s life.  It is called family leave because it is designed to balance the time a mother will spend away from work with the requirement that the father also miss work—a minimum of two months—and spend that time providing childcare.  The policy aims at fairness in sharing obligations and is intended to protect any new mother from having to leave the workforce unless she wishes to do so.

Patriarchy is a tradition that needs to die.



 

Wednesday, November 10, 2021

Pensions and Pension Plans: The Guaranteed Retirement Account

Experts claim that if a retiree can maintain an income that is 70% of preretirement income, then the individual will be able to maintain the preretirement lifestyle.  People who survive but struggle will continue in the same mode; those who were living the good life should be able to continue living the good life.  Sources of retirement income include Social Security payments.  That system was never intended to be a full retirement program; it was intended to keep the elderly out of abject poverty.  Benefits are structured to provide the greatest assistance to low-wage workers.  As income increases, the program’s benefits become a smaller fraction of that desired 70%.  Those with moderate to high incomes therefore need an employer-provided defined benefit plan, an employer-provided defined contribution plan, a healthy personal savings accumulation, or some combination of the three.

A defined benefit pension plan was once common for employees of large companies.  With the ascendancy of neoliberal concepts and the acceptance of the dubious claim that a company’s only responsibility is to provide the greatest profit for its shareholders, defined benefit plans began to disappear, replaced by defined contribution plans such as 401(k)s.  This was a massive transfer of risk from companies to their employees.  As Max Reyes points out in a Bloomberg Businessweek article, Companies Decide the Time Is Right to Offload Pensions to Insurers, a perverse outcome of the pandemic was that the suffering was borne mostly by low-income individuals, while investors were able to dramatically increase their wealth.  Pension plans are major investors, and they have performed quite well.

“The hundred largest corporate pensions were funded at 97.1% in August [2021], according to the consultant Milliman Inc., and they could creep as high as 102% by the end of the year under optimistic projections.  A year ago, pensions were less than 87% funded.”

Being underfunded means the plans did not have the projected resources to meet their projected obligations.  The companies would be at risk of having to default in some manner in the future or extract large sums of money from their businesses.  Now that they are at or near full funding, they have a convenient mechanism for terminating their defined benefit plans entirely and eliminating the potential for risk to return in the future—without having to contribute any of their profits.  The path forward is called a pension risk transfer or PRT.

“…By buying a financial product called an annuity, a company can essentially place the assets of a plan and the responsibility for paying for it into the hands of a life insurance company. The insurer makes money if it can earn more from investing the assets than it has to pay out. (Another risk-transfer option is to offer to pay benefits in a lump sum; in that case, the risk of ensuring the money lasts is taken on by the pensioner rather than another company.) These deals with insurers aren’t new, but record high markets are making them especially attractive to employers.”

Major companies have recently opted to take this path and many more are expected to follow.

“A recent study commissioned by MetLife found that 93% of 250 plan sponsors surveyed intend to divest all of their obligations, up from 76% in 2019. Companies can choose to fully divest the plans or reduce the scope of pensions on their balance sheets without removing them entirely. But plenty of businesses are looking to get out for good, according to Yanela Frias, president of Prudential’s group insurance business. ‘The reality is that an insurance company is much better positioned to manage this liability than a car manufacturer or a telephone company,’ Frias says. ‘We do this for a living’.”

This trend protects the pension rights of current employees and is revenue neutral for retirees, and nearly so for those close to retirement.  However, younger employees and new hires will be pretty much on their own in saving for retirement, managing their savings, and making sure they don’t outlive their savings.

This trend of divesting pensions will hopefully generate serious consideration of what is really needed: a national retirement plan that is not dependent on the whims of employers and that has mandatory participation by both employers and employees.  Teresa Ghilarducci has for years been promoting a Guaranteed Retirement Account (GRA).  She and Tony James provided a detailed account of their proposed GRA in Rescuing Retirement: A Plan to Guarantee Retirement Security for All Americans.  If implemented throughout employees’ working lifetimes, and earning returns consistent with those of existing public employee pension plans, the GRA would cost individuals and companies surprisingly little.

The authors begin with some perspective on why their plan is necessary.

“As recently as 1979, half of all private sector workers with retirement plans had traditional, employer-administered pensions.”

“Today, however, only 15 percent of the U.S. workforce (mostly government workers and public school teachers) has access to a traditional pension.  Beginning in the 1980s, most private employers shifted to ‘defined contribution’ plans such as 401(k)s, which cost companies less and shift funding risk from companies to employees.  Roughly half of private sector workers (53% in 2016) either lack access to any plan or do not participate in one.”

Most employers offer only modest contributions to 401(k)s, usually matching workers’ contributions up to a few percent of wages.  Most workers have endured nearly stagnant wages over the last several decades, leaving little opportunity to lock up precious earnings in a long-term effort to prepare for retirement.  As a result, most workers, particularly those with low wages, will have little other than Social Security available when they hit retirement age.

“The World Economic Forum (WEF) estimates that the United States had a $28 trillion retirement savings gap in 2015—the largest in the world.  By 2050, they project this will grow to $137 trillion.  This is almost a $3 trillion annual increase—five times the annual defense budget.”

The goal of the GRA is to fill the gap between what Social Security provides and the desired 70% of preretirement income.  The intention is to provide that amount for workers earning less than $100,000 per year.  Those with higher incomes will still benefit but should be able to provide some additional savings for retirement on their own.  The basic plan is simplicity itself.  Every worker, from the time of first employment to the full Social Security retirement age of 67, contributes 1.5% of income to a retirement account.  Employers are required to match that amount.  There is also a $600 annual tax credit for individuals, making participation nearly free for the lowest-wage workers.  This 3% annual contribution, placed in a long-term investment plan, is converted at retirement to an annuity that will provide a constant source of income for the remainder of the worker’s life.

At the historical rates of return for these types of pension funds, the GRA is expected to produce about 38% of preretirement income for all participants.  This will augment the amount provided by Social Security (SS).  The authors provide these quantities: for an income of $22,021, SS plus GRA yields 53% + 38% = 91%; for $48,937, 40% + 38% = 78%; for $78,295, 33% + 38% = 71%; for $127,500, 26% + 38% = 64%.
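The combined replacement rates are simple sums: Social Security’s income-dependent rate plus the GRA’s contribution, assumed flat at 38% across incomes.  A minimal Python sketch of the authors’ arithmetic (the income figures and Social Security rates are taken from their examples):

```python
# Combined replacement rates: Social Security's income-dependent rate
# plus the GRA's (assumed flat) 38% of preretirement income.

GRA_RATE = 0.38  # expected GRA payout as a share of preretirement income

# (preretirement income, Social Security replacement rate), per the authors
examples = [
    (22_021, 0.53),
    (48_937, 0.40),
    (78_295, 0.33),
    (127_500, 0.26),
]

for income, ss_rate in examples:
    total = ss_rate + GRA_RATE
    print(f"${income:>7,}: SS {ss_rate:.0%} + GRA {GRA_RATE:.0%} = {total:.0%}")
```

Note how the flat 38% leaves higher earners with a lower overall replacement rate, consistent with the authors’ expectation that those earners can supply additional savings on their own.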

The authors assume the money invested will earn 6-7% over a lifetime.  This is consistent with past performance for defined benefit plans, among which public employee plans seem to perform best.  Historically, defined contribution plans have produced poorer returns on investment.  Whether the economy 40 years from now will be anything like the economy of the last 40 years is an open question.  However, by accomplishing so much with such a small level of investment, this type of GRA provides a platform for modifications as future needs arise.
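The power of compounding is what lets a 3% contribution go so far.  A back-of-the-envelope sketch, under assumptions that are mine rather than the authors’ (a flat salary, a 6.5% annual return, 45 contribution years, and a 5% annuity payout rate at retirement):

```python
# Back-of-the-envelope check of how a 3% annual contribution can yield
# roughly 38% of preretirement income.  Assumptions are illustrative,
# not the authors': flat salary, 6.5% annual return, 45 working years,
# and a 5% annuity payout rate at retirement.

def balance_multiple(contrib_rate=0.03, annual_return=0.065, years=45):
    """Retirement balance expressed as a multiple of (flat) annual salary."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + contrib_rate
    return balance

multiple = balance_multiple()
replacement = multiple * 0.05  # annuitized at the assumed 5% payout rate
print(f"balance: {multiple:.1f}x salary; replacement: {replacement:.0%}")
```

With these inputs the balance comes to roughly 7.4 times final salary, and annuitizing at 5% lands near the 38% replacement figure.  Small changes in the assumed return or career length move the result considerably, which is why the 6-7% assumption matters.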

The plan is designed to work for young people who will contribute all their working careers.  Those who are approaching retirement with little in savings will at least have the opportunity to invest more in the GRA than the 3% to hasten the buildup of benefits, but the main recourse seems to be to work longer to buy more time and to boost Social Security benefits.

The universality of the approach frees the individual from a dependence on the employer for retirement benefits.  Just as a good healthcare plan provided by a company can induce individuals to make unwise economic decisions based on the fear of losing that level of coverage by choosing another job, a good company pension can provide a similar constraint.  The GRA continues wherever the worker goes.  This should make life simpler for both employees and employers.

 

Thursday, November 4, 2021

Rainy Day History: How the South Emerged from the Civil War with More Political Power Than Ever

The red states, mostly in the South, have created much political angst as they impose ever tighter voting conditions in an attempt to limit the number of voters.  It is well understood why they wish to do this.  What is perhaps less clear is how important control of access to polling booths within the electoral system has been throughout the history of the southern states.

When our nation was formed, representation in the House of Representatives was made proportional to the population of each state.  The southern states’ populations were composed largely of slaves, who were not considered citizens eligible to vote, yet these states wanted the total population counted in order to have the political influence they thought they deserved.  A compromise was reached in which each slave was counted as three-fifths of a person in allocating representatives.  As a result of the Civil War and the constitutional amendments that followed, the former slaves had to be treated as full citizens and provided the right to vote.  In effect, the freeing of the slaves changed the three-fifths rule to a five-fifths rule.  In terms of representation, the southern states would receive extra House members—to the losers went the spoils.

This situation would be optimal, in the southerners’ view, if the former slaves were not allowed to vote: more seats, with total control over who filled them.  It took a few years and a friendly Supreme Court, but soon southern blacks were almost totally disenfranchised, and the South would comfortably remain a one-party region.

States with a dominant party gain nothing from encouraging voter turnout as long as the players in the scheme show up to vote.  The gap between the number of voters and their representation in Congress became startlingly large.  Alexander Keyssar recently wrote a book titled Why Do We Still Have the Electoral College?  He points out that throughout our history the method of electing a president has been widely viewed as defective and dangerous.  However, there were always enough states, or a political party, that held an advantage under the electoral system and refused to give up that short-term gain for long-term stability.  Ultimately, the most popular proposed amendment was to replace the Electoral College with direct election of the president by nationwide vote, something the southern states would never allow.

Keyssar provides some data on the imbalance caused by the electoral system.

“In 1904, for example, Delaware had cast roughly the same number of votes for Congress as Georgia had. But Georgia had eleven representatives while Delaware had only one.  Ohio that same year cast as many votes for president as nine southern states together, but those nine states possessed ninety-nine electoral votes in comparison to Ohio’s twenty-three.  (The 1904 election was no anomaly: in every presidential contest from the 1890s into the 1960s, there were many fewer ballots cast per electoral vote in the South than elsewhere.)”

“…the Electoral College reinforced white supremacy in the South and gave southern Democrats unmerited strength in national elections.”

The South would use this unmerited strength to forbid any attempt to make lynching a crime, eliminate most of the black population from the Social Security System, and ensure that the postwar G.I. Bill would effectively become affirmative action for whites.  The southern Democrats would eventually become mainstream Republicans and continue pursuing minority party rule and white dominance.  The culture persists; the methods are little changed.

  

Saturday, October 30, 2021

Ultimate Recycling: Composting Humans

 Lisa Wells provides us with an intriguing new option for dealing with our remains after we pass away.  She wrote the article To Be a Field of Poppies: The elegant science of turning cadavers into compost for Harper’s Magazine to acquaint us with this development.

Wells is someone who has thought a lot about what should be done with her body after she is gone.

“As a child, I desperately wanted a Viking burial, an idea inspired by the 1988 Macaulay Culkin film Rocket Gibraltar, in which a group of kids boost their grandpa’s corpse, load it onto a boat, push it out to sea, and light it on fire with a flaming arrow. If the sky glowed red, the narrator explained, it meant the dead Viking had ‘led a good life’.”

“By my twenties, I had settled on the more realistic option of cremation. I wanted my ashes scattered on the banks of my favorite river, or cast from a cliff into the Pacific Ocean, or fired into the atmosphere from a cannon. (I was in a Hunter S. Thompson phase.) But after a friend’s ashes were lost in the mail, I reconsidered. I explored sky burial, in which a corpse is left out in the open to be fed upon by raptors; and alkaline hydrolysis, a process in which flesh is liquefied in a solution of water and potassium hydroxide. More recently, I planned to follow the example of Nineties heartthrob Luke Perry and purchase an Infinity Burial Suit: a shroud containing fungi that would consume my corpse and bioremediate its toxins.”

She suggests that such focus on our exit from the world is healthy and useful: we owe the world some consideration in how we dispose of ourselves; it spares those who remain any uncertainty about what to do with us; and contemplating our inevitable death should produce a greater appreciation for the life we have left.

“A willingness to face life’s nonnegotiable realities seems to me one mark of psychological maturation. But it comes at a price—the discovery that the world is not as simple as we once believed.”

“The Viking burial, for example, is apocryphal; the Vikings were known to burn their dead in boats, but kept them parked on land. What’s more, their funerals sometimes involved human sacrifice, in which a female slave was raped by the dead man’s clan, then ritually stabbed and strangled. Other, less sinister realities: both sky burial and the firing of heavy artillery are frowned upon in the city of Seattle, where I live. And even if cannons were permitted, cremation releases about 540 pounds of carbon per incinerated corpse. The carbon output from a year’s worth of cremations in the United States is roughly equivalent to that from burning 400 million pounds of coal. Alkaline hydrolysis has less ecological impact, but like cremation, it wastes the body’s energy; instead of going up in smoke, nutrients are flushed down the drain. Even the mushroom suit, according to critics, adds nothing to the decomposition process that soil itself can’t provide.”

In that spirit, our traditional burial six feet under can be considered just a brutal mechanism for injecting a carcinogen into mother earth.

“By the 1950s, embalming had become standard in the United States, but I wonder if this would have been the case had people understood the violence involved. There is no single method, but in a typical scenario, fluid containing formaldehyde is pumped into the carotid artery, which forces blood and other fluids in the corpse out of a tube in the jugular or femoral vein. An aspirating device resembling a meat thermometer is then repeatedly pushed into the abdomen and chest, where it punctures the organs. The organs are then filled with concentrated ‘cavity chemicals.’ No wonder embalming is considered desecration in some traditions, including among Muslims and Jews, who bury their dead in shrouds or simple coffins, sometimes without nails or fasteners, to avoid obstructing the decomposition process.” 

Wells centers her article around Amigo Bob Cantisano and his end-of-life decision.

“Amigo Bob didn’t know what should be done with his body. To bury toxic embalming fluid in the earth was out of the question—he was a lifelong environmentalist.”

“A few months earlier, in May 2020, a Washington State bill legalizing the conversion of human remains into soil, known as natural organic reduction (NOR), had gone into effect. A company called Recompose was due to open the world’s first NOR facility that December in Kent, a city just south of Seattle. They named it the Greenhouse. It seemed perfect for Amigo Bob, who had revolutionized the field of organic agriculture—first as a farmer, then as an advocate and consultant—and spent his life building soil and protecting it from the ‘pesticide mafia’.”

“When he began to accept that the end was near, Amigo Bob called the founder of Recompose, Katrina Spade. He wanted to make sure she knew what she was doing. Compost is the basis of organic farming, so he knew a lot about it—he’d even served as an adviser for a few large composting operations. Katrina explained their process, and he seemed to find her account convincing, but it wasn’t until his final moments that he told Jenifer [his wife] definitively: ‘This is what I want’.”

Amigo Bob would be one of the ten people who would become the first legally composted humans.  Proper composting requires a bit of technology.

“Composting isn’t rocket science, but the process requires a precise amount of sustained heat to eliminate pathogens and quickly convert decaying organic matter to soil.”

“Each of their bodies was placed inside an eight-foot-long steel cylinder called a ‘vessel,’ along with wood chips, alfalfa, and straw. Over the next thirty days, the Recompose staff monitored the moisture, heat, and pH levels inside the vessels, occasionally rotating them, until the bodies transformed into soil. The soil was then transferred to curing bins, where it remained for two weeks before being tested for toxins and cleared for pickup.”

“Half of the NOR soil would wind up in a forest on Bells Mountain, in southwestern Washington, near the Oregon border. A composted body produces approximately one cubic yard of soil, which can fill a truck bed and weigh upwards of 1,500 pounds. For many surviving relatives—apartment dwellers, for example—taking home such a large quantity of soil is unrealistic, so Recompose offers them the option to donate it to the mountain, where it’s used to fertilize trees and repair land degraded by logging.”

“But Amigo Bob was a farmer, so Jenifer rented a U-Haul and brought the whole cubic yard of him home. She turned the trip into a kind of pilgrimage, stopping to visit loved ones and the headwaters of their favorite rivers. Over the next few months, their farmer friends came by and filled small containers with the soil to use on their own land. Jenifer used some to plant a cherry tree.”

It is not clear what the future of human composting will be.  It will likely seem an attractive option for the environmentally concerned.

“Recompose claims that each person who chooses composting over conventional burial or cremation will prevent an average of one metric ton of carbon dioxide from entering the atmosphere. According to the EPA’s calculator, that is a modest carbon payback, equal to the consumption of about ten tanks of gas. On the other hand, this is preferable to adding to the debt.”
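The “ten tanks of gas” comparison can be roughly checked.  The sketch below uses the EPA’s published figure of about 8.9 kg of CO2 per gallon of gasoline burned; the 14-gallon tank size is my own assumption, not a figure from the article.

```python
# Rough check of the "ten tanks of gas" comparison, using the EPA's
# figure of about 8.9 kg of CO2 per gallon of gasoline burned and an
# assumed 14-gallon tank (the tank size is my assumption).
KG_CO2_PER_GALLON = 8.9
TANK_GALLONS = 14

kg_per_tank = KG_CO2_PER_GALLON * TANK_GALLONS  # ~125 kg of CO2 per tank
tanks_per_metric_ton = 1000 / kg_per_tank       # tanks needed to emit one tonne
print(round(tanks_per_metric_ton))  # → 8, in the ballpark of "about ten"
```

With a slightly larger tank or rounding, the figure lands close to the article’s ten tanks.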

“To my mind, it’s the perceptual shift that bears the greatest promise. If we begin to imagine ourselves as beneficial contributors to the earth in death, rather than as agents of sickness and damage, maybe we can start to see that possibility for our lives.”

Wells quotes one of the Recompose participants, Elliot Rasenick.

“’The climate crisis is fundamentally a soil crisis,’ Elliot mused. ‘There is a poetry in the possibility that the death of one generation can make possible the life of the next’.”

Interest in this procedure extends beyond Washington state.

“…similar bills have been introduced in California, Vermont, Delaware, Hawaii, Maine, New York, Oregon, and Colorado. The latter two have already passed.”

The reader who did not find this topic thought-provoking might yet benefit from the time spent by considering who in our political firmament might better serve humanity as a cubic yard of dirt.

 

Tuesday, October 19, 2021

How Societies Fail: Cultures of Honor, Patriarchy, Women’s Rights, and Social Stability

 A recent issue of The Economist provided a fascinating article detailing the costs a country pays when its culture demands the subjugation of females: Societies that treat women badly are poorer and less stable.  The author uses research into the correlation between the fragility of societies and their oppression of women to make the point that when women suffer, the men, and greater society, suffer as well.

The article turns on the work of Valerie Hudson of Texas A&M University and Donna Lee Bowen and Perpetua Lynne Nielsen of Brigham Young University.

“In ‘The First Political Order: How Sex Shapes Governance and National Security Worldwide’, Ms Hudson, Ms Bowen and Ms Nielsen rank 176 countries on a scale of 0 to 16 for what they call the ‘patrilineal/fraternal syndrome’. This is a composite of such things as unequal treatment of women in family law and property rights, early marriage for girls, patrilocal marriage, polygamy, bride price, son preference, violence against women and social attitudes towards it (for example, is rape seen as a property crime against men?).”

The term “fraternal syndrome” arises from the tendency in many countries to form strong clan bonds within an extended family, with the clan as a whole acting to protect the interests of each male member.  “Patrilocal” refers to the bride being the one who must relocate when a marriage agreement is made.  “Patrilineal” refers to the family name and assets passing down through male descendants.  The term “bride price” refers to the common practice of a father charging a fee to a prospective groom in order to close a marriage deal.  These practices reflect the fact that women are treated as economic commodities whose “honor” and market value must be protected.  The authors plot each country’s syndrome score on the y-axis against a measure of the degree of disorder and uncontrolled violence in the nation.

Cultures of honor require individuals to be willing to resort to violence, or the threat of violence, in response to any affront to their honor.  These features tend to develop in societies where there is no external authority to protect property: theft is a constant threat, and only the potential victim can protect himself and his assets.  This situation encourages strong family bonds as a means of protection, extended as far as possible to form a clan—and perhaps a tribe.  But the potential thieves will have formed clans as well, so a conflict between two individuals can draw in large numbers of people and extend over generations.  The author uses Iraq as an example of a nation where clannishness leads to dysfunction.

“The Iraqi police are reluctant to intervene in tribal murders. The culprit is probably armed. If he dies resisting arrest, his male relatives will feel a moral duty to kill the officer who fired the shot or, failing that, one of his colleagues. Few cops want to pick such a fight. It is far easier to let the tribes sort out their own disputes.”

“The upshot is that old codes of honour often trump Iraqi law (and also, whisper it, Islamic scripture, which is usually milder). Cycles of vengeance can spiral out of control.”

Situations in which allegiance to family or clan is stronger than allegiance to the nation predictably produce corruption.

“Clan loyalties can cripple the state. When a clan member gets a job in the health ministry, he may feel a stronger duty to hire his unqualified cousins and steer contracts to his kin than to improve the nation’s health. This helps explain why Iraqi ministries are so corrupt.”

Male dominance in a society leads to adverse trends that produce dysfunction.  Nature, wisely, produces male and female babies in equal numbers.  Humans, foolishly, often tend to favor males over females.

“The obstacles females face begin in the womb. Families that prefer sons may abort daughters. This has been especially common in China, India and the post-Soviet Caucasus region. Thanks to sex-selective abortion and the neglect of girl children, at least 130m girls are missing from the world’s population, by one estimate.”

“That means many men are doomed to remain single; and frustrated single men can be dangerous. Lena Edlund of Columbia University and her co-authors found that in China, for every 1% rise in the ratio of men to women, violent and property crime rose by 3.7%. Parts of India with more surplus men also have more violence against women. The insurgency in Kashmir has political roots, but it cannot help that the state has one of most skewed sex ratios in India.”

There is a more efficient way to limit the number of women available for marriage: polygamy (or, more precisely, polygyny).  The share of women in polygynous marriages worldwide is small, but it is quite large in some of the most violent regions of the world.

“Only about 2% of people live in polygamous households. But in the most unstable places it is rife. In war-racked Mali, Burkina Faso and South Sudan, the figure is more than a third. In the north-east of Nigeria, where the jihadists of Boko Haram control large swathes of territory, 44% of women aged 15-49 are in polygynous unions.”

“If the richest 10% of men have four wives each, the bottom 30% will have none. This gives them a powerful incentive to kill other men and steal their goods. They can either form groups of bandits with their cousins, as in north-western Nigeria, or join rebel armies, as in the Sahel. In Guinea, where soldiers carried out a coup on September 5th, 42% of married women aged 15-49 have co-wives.”
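The arithmetic in the first sentence of that quote can be verified with a quick sketch.  Assuming a 1:1 sex ratio and monogamy for everyone outside the polygynous top tier (the function name and framing are mine, not the article’s), the share of men left without a wife follows directly:

```python
def unmarried_share(top_share: float, wives_each: int) -> float:
    """Fraction of all men left without a wife, assuming a 1:1 sex
    ratio and monogamy outside the polygynous top tier.
    (Illustrative helper; not from the article.)"""
    women_taken = top_share * wives_each    # richest men absorb this share of women
    women_left = 1.0 - women_taken          # women available to everyone else
    men_left = 1.0 - top_share              # men outside the top tier
    return max(men_left - women_left, 0.0)  # men who cannot find a wife

# The article's example: the richest 10% of men take four wives each.
print(round(unmarried_share(0.10, 4), 2))  # → 0.3, i.e. the bottom 30% of men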

“Insurgent groups exploit male frustration to recruit. Islamic State gave its fighters sex slaves. Boko Haram offers its troops the chance to kidnap girls. Some Taliban are reportedly knocking on doors and demanding that families surrender single women to ‘wed’ them.”

The custom of bride price turns girls into marketable commodities.  Where females are in short supply, the price can soar, producing an unstable situation.

“Bride price, a more widespread practice, is also destabilising. In half of countries, marriage commonly entails money or goods changing hands. Most patrilineal cultures insist on it. Usually the resources pass from the groom’s family to the bride’s, though in South Asia it is typically the other way round (known as dowry).”

“The sums involved are often large. In Tororo district in Uganda, a groom is expected to pay his bride’s family five cows, five goats and a bit of cash, which are shared out among her male relatives. As a consequence, “some men will say: ‘you are my property, so I have the right to beat you,’” says Mary Asili, who runs a local branch of Mifumi, a women’s group.”

“Bride price encourages early marriage for girls, and later marriage for men. If a man’s daughters marry at 15 and his sons at 25, he has on average ten years to milk and breed the cows he receives for his daughters before he must pay up for his sons’ nuptials. In Uganda, 34% of women are married before the age of 18 and 7% before the age of 15. Early marriage means girls are more likely to drop out of school, and less able to stand up to an abusive husband.”

If bride price becomes unaffordable, crime can be the only path to marriage.

“Bride price can make marriage unaffordable for men. Mr Manshad in Iraq complains: ‘Many young men can’t get married. It can cost $10,000.’ Asked if his tribe’s recent lethal disputes over sand and vehicles might have been motivated by the desire to raise such a sum, he shrugs: ‘It is a basic necessity in life to get married’.”

The author opens his article with a quote from the just-mentioned Mr. Manshad of Iraq.

“’A woman who drives a car will be killed,’ says Sheikh Hazim Muhammad al-Manshad. He says it matter-of-factly, without raising his voice. The unwritten rules of his tribe, the al-Ghazi of southern Iraq, are clear. A woman who drives a car might meet a man. The very possibility is ‘a violation of her honour’. So her male relatives will kill her, with a knife or a bullet, and bury the body in a sand dune.”

These types of attitudes, and the desire to conceal females from public view by encasing them in clothing from head to foot, are common in Islamic countries, but they do not seem to be driven by the precepts of Islam itself.  Death for driving a car is a choice that particular society has made.  The Quran merely requires both men and women to dress modestly, a not unreasonable demand.  The culture of honor described above is derived not from an attempt to protect the car-driving woman’s honor but that of the men who own her.  That cultural tradition developed well before even the culture of the ancient Hebrews.  It is associated with the development of patriarchy in what we today refer to as the Middle East.

Gerda Lerner was one of the many Jews driven from Europe by Hitler who went on to make the lives of the rest of us more interesting.  She is credited with teaching the first formal university class on women’s history, in 1963, while she was still an undergraduate.  Her efforts were critical in establishing women’s history as a formal topic of academic research.  She is best known for her book The Creation of Patriarchy (1986).  It covers the origins of patriarchy, its effects on society, its incorporation into religion and history, and its effects on women up to the present time.

As humans progressed from a hunter-gatherer existence to a more sedentary agricultural economy, the division of responsibilities between the genders changed.  This period introduced features of economics and capitalism that encouraged the accumulation of wealth by individuals, the specialization of production, and the waging of war for conquest or defense.  All of these favored a division of labor in which men took the lead while women focused on bearing and caring for children.  Writing and the production of historically useful documents date back to about 3000 B.C.  The evidence from that time suggests that women played a substantial, though not equal, role relative to men in society.  Over the next thousand years or so such references disappeared from historical documents, and the dominance of men was expressed in the patriarchal family structure.  At its worst, patriarchy exhibited these characteristics.

“The father had the power of life and death over his children.  He had the power to commit infanticide by exposure or abandonment.  He could give his daughters in marriage in exchange for receiving a bride price even during their childhood, or he could consecrate them to a life of virginity in the temple service.  He could arrange marriages for children of both sexes.  A man could pledge his wife, his concubines and their children as pawns for his debt; if he failed to pay back the debt, these pledges would be turned into debt slaves.”

“The class difference between a wife living under the patriarchal dominance/protection of her husband and a slave living under the dominance/protection of the master was mainly that the wife could own a slave…”

The fate of girls from poor families was worse.

“By the second millennium B.C. in Mesopotamian societies, the daughters of the poor were sold into marriage or prostitution in order to advance the economic interests of their families.”

Women became valuable commodities that could be bought and sold, but men would do the buying and selling.  It became very important to a man of wealth to have some means of demonstrating that the women of his family were not for sale and were safely held under his protection.  From this grew the practice of veiling “honorable” women so they could be distinguished from “dishonored” women, who were forbidden the veil.  It is not difficult to imagine veiling by the wealthy becoming an aspirational goal for all.  And if a face covering contributes to family honor, why would some not try to outdo others by covering ever more of the body?  Consequently, dress requirements for women vary widely, depending on the particular culture.

It would be the Hebrews, in assembling their sacred documents, who would elevate the then-current cultural treatment of women as a subordinate class into something ordained by God’s will.  Christianity and Islam would follow along, evolving from this same starting point.

The Biblical narratives are not kind to women.  The tale of Adam being created by a presumably male God, by a means requiring no birth process, eliminates any female role in creation.  Woman, in the form of Eve, is subsequently created from one of Adam’s ribs, suggesting a lower status relative to Adam and thus to God himself.  Eve then becomes the temptress who causes the fall from grace and the expulsion from Eden.  What clearer message could there be that women and their sexuality are dangerous to men and must be strictly controlled by them?  The basis of the Hebrew religion would be the covenant God was said to have made with Abraham.

“He asks acceptance that He will be the God of Israel, He alone and no other.  And He demands that His people which worship Him will be set apart from other people by a bodily sign, a clearly identifiable token…”

The token will be the required circumcision of males.

“We must take note of the fact that Yahweh makes the covenant with Abraham alone, not including [his wife] Sarah, and that in so doing He gives divine sanction to the leadership of the patriarch over his family and tribe…the covenant relationship is only with males—first with Abraham, then explicitly with Abraham and Sarah’s son, Isaac, who is referred to only as Abraham’s son.  Moreover, the community of the covenant is divinely defined as a male community, as can be seen by the selection of the symbol chosen as ‘token of the covenant’.”

“For females, the Book of Genesis represented their definition as creatures essentially different from males; a redefinition of their sexuality as beneficial and redemptive only within the boundaries of patriarchal dominance; and finally the recognition that they were excluded from directly being able to represent the divine principle.  The weight of the Biblical narrative seemed to decree that by the will of God women were included in His covenant only through the mediation of men.”

The tyranny of the religions men created in order to propagate their dominance held women back for many centuries—and still does in some regions and cultures.

“…All males, whether enslaved or economically or racially oppressed, could still identify with those like them—other males—who represented mastery of the symbol system.  No matter how degraded, each male slave or peasant was like to the master in his relationship to God.  This was not the case for women.  Up to the time of the Protestant Reformation the vast majority of women could not confirm and strengthen their humanity by reference to other females in positions of intellectual authority and religious leadership.”

“Where there is no precedent, one cannot imagine alternatives to existing conditions.  It is this feature of male hegemony which has been the most damaging to women and has ensured their subordinate status for millennia.  The denial to women of their history has reinforced their acceptance of the ideology of patriarchy and has undermined the individual woman’s sense of self-worth.”

No matter how counterproductive they may be, cultural attributes can be propagated for millennia, long after the conditions that generated them have disappeared.  Revolutions, wars, plagues, they come and go, but all too often the culture persists.

 

 

Sunday, October 10, 2021

Cultures of Honor and Their Persistence: The Scots-Irish

 The term “culture of honor” can be misleading to the unfamiliar.  It does not imply a society wherein honorable intentions and actions are the standard.  Let us turn to this source for an appropriate definition provided by psychologists. 

“A culture of honor is a culture in which a person (usually a man) feels obliged to protect his or her reputation by answering insults, affronts, and threats, oftentimes through the use of violence. Cultures of honor have been independently invented many times across the world. Three well-known examples of cultures of honor include cultures of honor in parts of the Middle East, the southern United States, and inner-city neighborhoods (of the United States and elsewhere) that are controlled by gangs.”

“Cultures of honor can vary in many ways. Some stress female chastity to an extreme degree, whereas others do not. Some have strong norms for hospitality and politeness toward strangers, whereas others actively encourage aggression against outsiders. What all cultures of honor share, however, is the central importance placed on insult and threat and the necessity of responding to them with violence or the threat of violence.” 

Cultures develop over time.  The sociology of inner-city gangs seems more a business model than a culture.  The cultural attributes referred to in the Middle East developed over millennia and have persisted to the present day.  Referring to a ‘southern’ culture of honor is not incorrect, but the term can be misleading and provides little help in understanding that culture.

It is widely understood that the roots of this southern behavior pattern derive from the immigration of the people referred to as the Scots-Irish (alternately, the Scotch-Irish).  We will recognize that the term “scotch” is more appropriately associated with a delightful alcoholic beverage.  Malcolm Gladwell included a chapter on the Scots-Irish in his book Outliers: The Story of Success.  He attributes the best historical account of the southern culture to David Hackett Fischer.

“David Hackett Fischer’s book Albion’s Seed: Four British Folkways in America is the most definitive and convincing treatment of the idea that cultural legacies cast a long historical shadow…In Albion’s Seed, Fischer argues that there were four distinct British migrations to America in its first 150 years: first the Puritans, in the 1630s, who came from East Anglia to Massachusetts; then the Cavaliers and indentured servants, who came from southern England to Virginia in the mid-seventeenth century; then the Quakers from the North Midlands to the Delaware Valley between the late seventeenth and early eighteenth centuries; and finally the people of the borderlands to the Appalachian interior in the eighteenth century.  Fischer argues brilliantly that these four cultures—each profoundly different—characterize those four regions of the United States even to this day.”

These people of the borderlands are the Scots-Irish.  The borderlands are the inhospitable and long-contested regions where England and Scotland collide.  These people acquired the “Irish” label because they moved to Northern Ireland to escape the English, only to find more mistreatment by the English in Ulster, before finally seeking relief in America.

“The so-called American backcountry states—from the Pennsylvania border south and west through Virginia and West Virginia, Kentucky and Tennessee, North Carolina and South Carolina, and the northern end of Alabama and Georgia—were settled overwhelmingly by immigrants from one of the world’s most ferocious cultures of honor.  They were the ‘Scotch-Irish’—that is, from the lowlands of Scotland, the northern counties of England, and Ulster in Northern Ireland.”

Former Senator from Virginia Jim Webb, a man proud of his Scots-Irish heritage, heaped praise on these people (and a little criticism) in his book Born Fighting: How the Scots-Irish Shaped America.  He sees much to be admired in the characteristics they imported into the US.  We will follow his book for a moment.

His description begins in Roman times when Celtic peoples were driven to what is now Scotland and made a stand in defense of their freedom.  Their social development, or lack thereof, was determined by the centuries of warfare, by the inhospitable environment they found in Scotland, and by the version of Christianity they adopted.

“...while Scotland’s rough topography made it difficult to conquer, it made it equally difficult to rule.....Not unlike Appalachia, Scotland is a land of difficult water barriers, sharp mountains and deep hollows, soggy moors and rough pastures, and of thin, uncultivable soil that lies like a blanket over wide reaches of granite....the settlements of ancient Scotland grew haphazardly and emphasized a rugged form of survival that had links neither to commerce nor to the developing world.  Again we find a cultural evolution and a fundamental lifestyle very much like those that would emerge later in the Appalachian Mountains.”

This is a theme Webb returns to several times.  These people did not pass through the stages of cultural and political development common to most nations.  The Scots of interest to us left before Scotland became a stable political entity with renowned universities.  By the time they migrated to Northern Ireland they had endured many generations of political turmoil at the national level.  That, together with their isolated and harsh environment, formed character traits and social responses that Webb argues persist to this day.

“Such turbulence at the center of national government not only empowered the local clan leaders, it also demanded that they be strong, both for their own survival and also for the well-being of their extended families.  And again a familiar pattern reinforced itself in what would become the Scots-Irish character: the mistrust of central authority, the reliance on strong tribal rather than national leaders, and the willingness to take the law into one’s own hands rather than waiting for a solution to come down from above.”

Of great significance in understanding these people is the development of their religious beliefs.  The Scots were the beneficiaries of what Webb describes as the most corrupt version of the Catholic Church to be found anywhere.  It seems only natural that they would respond by embracing the harshest and most demanding form of Protestantism, based on the teachings of John Calvin.

“But Scotland ‘developed the Calvinistic doctrine that civil government, though regarded as a necessity, was to be recognized only when it was conducted according to the word of God.’  This meant not only that the Kirk would have the power to organize religious power at the local level, but also that Scots had reserved the right to judge their central government according to the standards they themselves would set from below.” 

Note that at this point you have a people who have never accepted the notion of allegiance to a central government and have lived with the belief that loyalty is to be extended as far as their local clans and churches.  You have a culture whose highest educational goal is to be able to read the Bible.

More on Webb’s book can be found in Born Fighting by Jim Webb.

The Scots-Irish culture would spread from its Appalachian roots and become dominant in the southern states.  It would also undergo one last migration.  That migration and its effects are described by James N. Gregory, a history professor, in his book The Southern Diaspora: How the Great Migrations of Black and White Southerners Transformed America.  The migration of Blacks from the South to the cities of the North and West has been referred to as “The Great Migration.”  The migration of whites from the South over the same period was much larger, but much less studied.  Gregory provides this summary of what his investigations demonstrated.

“This book is about what may be the most momentous internal population movement of the twentieth century, the relocation of black and white Americans from the farms and towns of the South to the cities and suburbs of the North and West.  In the decades before the South became the Sun Belt, 20 million southerners left the region.  In doing so, they changed America.  They transformed American religion, spreading Baptist and Pentecostal churches and reinvigorating evangelical Protestantism, both black and white versions.  They transformed American popular culture, especially music.  The development of blues, jazz, gospel, R&B, and hillbilly and country music all depended on the southern migrants.  The Southern Diaspora transformed American racial hierarchies, as black migrants in the great cities of the North and West developed institutions and political practices that enabled the modern civil rights movement.  The Southern Diaspora also helped reshape American conservatism, contributing to new forms of white working-class and suburban politics.  Indeed, most of the great political realignments of the second half of the twentieth century had something to do with the population movements out of the South.” 

More on Gregory’s work can be found in The Southern Diaspora and the Southernization of America.

The notion that cultures persist across generations and centuries has been advanced several times.  We return to Gladwell, who attempts to provide proof.  He begins with an assessment of backwoods honor culture at work, describing a dispute between the Howard and Turner families in nineteenth-century Harlan County, Kentucky.  An argument began between one member of each family over a poker game.  Before it was resolved, many people had died.

“The first critical fact about Harlan is that at the same time that the Howards and Turners were killing one another, there were almost identical clashes in other small towns up and down the Appalachians.  In the famous Hatfield-McCoy feud on the West Virginia-Kentucky border not far from Harlan, several dozen people were killed in a cycle of violence that stretched over twenty years.  In the French-Eversole feud in Perry County, Kentucky, twelve died…The Martin-Tolliver feud, in Rowan County, Kentucky, in the mid-1880s featured three gunfights, three ambushes, and two house attacks, and ended in a two-hour gun battle involving one hundred armed men.  The Baker-Howard feud in Clay County, Kentucky, began in 1806, with an elk-hunting party gone bad, and didn’t end until the 1930s, when a couple of Howards killed three Bakers in an ambush…And these were just the well-known feuds.” 

“The triumph of a culture of honor explains why the pattern of criminality in the American South has always been so distinctive.  Murder rates are higher there than in the rest of the country.  But crimes of property and ‘stranger’ crimes—like muggings—are lower.  As the sociologist John Shelton Reed has written, ‘The homicides in which the South seems to specialize are those in which someone is being killed by someone he (or often she) knows, for reasons both killer and victim understand.’  Reed adds: ‘The statistics show that the Southerner who can avoid arguments and adultery is as safe as any other American, and probably safer.’  In the backcountry, violence was not for economic gain.  It was personal.  You fought over your honor.”

Gladwell describes experiments by the psychologists Dov Cohen and Richard Nisbett of the University of Michigan.  They set up a situation in which a group of male students were asked to fill out a form and then drop it off at the end of a long hallway.  For half the students, that was all.  For the other half, it was arranged for the student to be bumped into by a research associate and called an “asshole.”  To assess the effect of having that word directed at them, the researchers visually gauged each student’s level of anger, measured the strength of his handshake, and took saliva samples both before and after to measure cortisol and testosterone levels.

“The results were unequivocal.  There were clear differences in how the young men responded to being called a bad name.  For some, the insult changed their behavior.  For some it didn’t.  The deciding factor in how they reacted wasn’t how emotionally secure they were, or whether they were intellectuals or jocks, or whether they were physically imposing or not.  What mattered—and I think you can guess where this is headed—was where they were from.  Most of the men from the northern part of the United States treated the incident with amusement.  They laughed it off.  Their handshakes were unchanged.  Their levels of cortisol actually went down as if they were unconsciously trying to defuse their own anger.”

“But the southerners?  Oh, my.  They were angry.  Their cortisol and testosterone jumped.  Their handshakes got firm.”

“This study is strange, isn’t it?  It is one thing to conclude that groups of people living in circumstances pretty similar to their ancestors’ act a lot like their ancestors.  But those southerners in the hallway study weren’t living in circumstances similar to their British ancestors.  They didn’t necessarily have British ancestors.  They just happened to have grown up in the South….They were living in the late twentieth century not the late nineteenth century.  They were students at the University of Michigan, in one of the northernmost states in America, which meant they were sufficiently cosmopolitan to travel hundreds of miles from the South to go to college.  And none of that mattered.  They still acted like they were living in nineteenth-century Harlan, Kentucky.”

“’Your median student in those studies comes from a family making over a hundred thousand dollars, and that’s in nineteen ninety dollars,’ Cohen says.  ‘The southerners we see this effect with aren’t kids who come from the hills of Appalachia.  They are more likely to be the sons of upper-middle management Coca-Cola executives in Atlanta.’”

No matter how counterproductive they may be, cultural attributes can be propagated for centuries, even millennia, long after the conditions that generated them have disappeared.  And cultural attributes matter.  Here the focus was on violent tendencies, but Webb’s description touches on other cultural factors that are perhaps even more important in our era.  They certainly matter in politics.  We cannot deal with political adversaries if we don’t understand why they think and act as they do.  We should also try to consider what cultural legacies are driving us to believe and act as we do.