Thursday, November 30, 2017

T. R. Reid: When Tax Reform in the United States Worked: 1986

T. R. Reid has had a long career as a journalist and author.  One of his best works was a volume titled The Healing of America: A Global Quest for Better, Cheaper, and Fairer Health Care.  The book was published around the time the program now known as Obamacare was being assembled and passed into law, and it provided an excellent comparison between the healthcare systems of other countries and the one operating in the United States.  Reid’s message was that we in the US could do a lot better if only we were willing to learn from others.  He has recently produced another timely book, this time focused on taxation: A Fine Mess: A Global Quest for a Simpler, Fairer, and More Efficient Tax System.  Reid again delivers a short, clear, highly readable comparison between the US approach to taxation and those in use around the world.  Again, the message is that we could learn a lot from others.

Reid states that while the federal income tax was first enacted in 1913, it was not until 1922 that the law settled into a well-established form.  Over time, conditions change and lawmakers feel compelled to “tweak” the tax system, which grows more complicated and more unwieldy.  By the Eisenhower administration it was clear that the legislation needed a major rewrite to render it more efficient.  That new version was produced in 1954.  As before, it was subjected to continual modification as legislators accommodated various special interests.  By the 1980s, a sprawling and confusing tax code had generated enough disgust to make another rewrite necessary.  This took place under Ronald Reagan in 1986.  Reid notes that major rewrites have occurred every 32 years, putting the next one on schedule for 2018.  How much more timely could an author be?

The path to tax reform in 1986 was not simple, but after several years of partisan back-and-forth, a very simple approach came to be viewed favorably by both sides, and the new tax code won strong bipartisan support.  The lessons of that period should inform the current deliberations on tax modification, but the decision to forgo bipartisan support this time probably means we will not see the needed rewrite on Reid’s 32-year schedule.  Nevertheless, Reid’s perspective is informative.

Reid makes the dubious claim that politics in the Reagan era “was as fractious as it is today.”  At the time of the relevant taxation deliberations we had Republicans in control of the presidency and the Senate while Democrats controlled the House of Representatives.  Reid attributes credit for the needed political momentum to two people.

“These two were strange bedfellows indeed.  One of them, a conservative Republican, was a Wall Street tycoon; the other, a liberal Democrat, was a basketball star.”

The Republican was Donald Regan, who accepted the post of secretary of the Treasury under President Reagan.  He was not an ideologue and thought our messy system could be improved by studying what other countries were doing in the taxation arena.  He was particularly interested in New Zealand, a country already moving in a direction that seemed appropriate for the US as well.

Essentially all nonpartisan tax experts advise countries to adopt an approach tersely summarized as “broad base, low rates,” or even more simply: BBLR.  The notion is that high tax rates distort economic decisions, leading to inefficient economic outcomes.  It is better that decisions be made on their intrinsic merits rather than on tax considerations.  The goal, then, is to raise the funds needed to run government at the lowest possible rates of taxation.  The way to do this is to broaden the tax base as much as possible.  Every special deduction or allowance narrows the base and thereby requires higher rates.  New Zealand possessed a complex mess of a tax code and decided to apply the BBLR principle to fix it.

“It relied primarily for revenue on a personal income tax, and the tax was riddled with exemptions, credits, and giveaways for particular groups and companies.  In short, it was the opposite of BBLR.  All those preferences made for a fairly narrow tax base, which meant that tax rates had to be high to raise the required revenue.  By the early 1980s, the top marginal income tax rate was 66%.”

The tax specialists were tasked with cutting income tax rates in half.  They accomplished this by eliminating essentially all of the special provisions that had been inserted into the previous code and by establishing a national sales tax that has remained in the 10-15% range.  The tax was labeled a goods and services tax (GST), but it is equivalent to the value-added taxes (VATs) in place in most nations of the world.

“With that addition, the income tax rates could be cut in half for every taxpayer in the nation—with no loss of revenue to the government.”
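To make the BBLR arithmetic concrete, here is a minimal sketch in Python.  All of the figures, and the required_rate helper itself, are hypothetical and chosen purely for illustration; none of them come from Reid.  The point is simply that the revenue a tax raises is roughly the rate times the base, so every exemption that narrows the base pushes up the rate needed to hit the same revenue target.

    # Minimal sketch of the "broad base, low rates" arithmetic.
    # All numbers are hypothetical and chosen only to illustrate the mechanism:
    # revenue is roughly rate * base, so exemptions that narrow the base
    # force the rate up to meet the same revenue target.

    def required_rate(revenue_target: float, taxable_base: float) -> float:
        """Tax rate needed to raise revenue_target from a given taxable base."""
        return revenue_target / taxable_base

    total_income = 1000.0   # hypothetical total income, arbitrary units
    revenue_target = 200.0  # revenue the government must raise

    # Narrow base: half of all income escapes tax through exemptions and credits.
    narrow_base = 0.50 * total_income
    print(f"narrow-base rate: {required_rate(revenue_target, narrow_base):.0%}")  # 40%

    # Broad base: exemptions removed, so nearly all income is taxable.
    broad_base = 0.95 * total_income
    print(f"broad-base rate:  {required_rate(revenue_target, broad_base):.0%}")   # 21%

New Zealand’s GST fits the same logic: taxing consumption adds a second broad base, which is what allowed income tax rates to be cut in half with no loss of revenue.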

Reid estimates that a US couple with the median income pays, on average, about 35% of earnings in income taxes and healthcare insurance.  That figure includes federal, state, local, and payroll taxes (Social Security and Medicare).

“In New Zealand, in contrast, the median wage earner pays about 17.5% in income tax.  But that one payment also covers his old-age pension (there’s no separate tax for Social Security), plus free healthcare for life (there’s no separate tax for healthcare), plus free education through college graduation (there’s no separate tax for schools).  So the average New Zealander’s wages are taxed at less than half the rate of the average American’s.  And yet New Zealand provides more government services than the United States—with half the tax rate.  That’s the beauty of BBLR.”

One must also remember that New Zealand’s consumption tax (a value-added tax) is much more efficient at collecting revenue than the array of state and local sales taxes that are so easy to cheat on, and that we in the US waste much more money than anyone else on healthcare.  But yes, there are things to be learned from other countries.

The driver for tax reform from the liberal side was Bill Bradley, a basketball All-American at Princeton and a Rhodes Scholar who went on to a career as a star player for the New York Knicks of the NBA.  Bradley’s interest in federal taxes is said to have arisen when he learned that, as a player, he was considered a “depreciable asset” by his employers.  He was elected to the Senate from New Jersey in 1978 and began focusing on the issue of taxation.  His goal was to apply the principles of BBLR in order to lower tax rates.

“‘The trade-off between loophole elimination and a lower top rate became obvious,’ Bradley wrote later; ‘the lower the rate, the more loopholes had to be closed to pay for it.’  Bradley stuck to the mantra of ‘broad base, low rates’ for years, telling anybody who would listen that a significant cut in tax rates would win the votes needed to broaden the base.  ‘The key to reform was to focus on the attractiveness of low rates, not on the pain of limiting deductions.’”

When Regan, his successor at Treasury, James Baker, and Bradley brought President Reagan a credible proposal in 1984 to significantly cut tax rates, he bought into the idea.  Legislation would eventually pass in 1986 after much political haggling.

The tax legislation passed in 1986 is startling in the changes that were approved.  The fact that it could actually be passed by a bipartisan vote is astonishing given today’s environment.

“The 1986 law, generally recognized as ‘the most significant reform in the history of the income tax,’ reduced the top marginal rate for individual taxpayers from 50% to 28%—the biggest reduction of any tax bill before or since.  It did that by eliminating a broad range of ‘tax shelter’ breaks available only to the rich.  It cut back the deduction for mortgage interest and completely eliminated the deduction for interest on consumer loans, like auto loans and credit cards.  It eliminated the deduction for state and local sales taxes.  It limited deductions for charitable contributions, IRA deposits, medical bills, and other personal expenses.  It set the tax rate on capital gains—that is profit on stocks, real estate deals, and so on—at the same level as the top income tax rate, so that financiers could no longer cut their tax bill by defining all their pay as capital gains.”

“The new law produced significant tax cuts for low-income and median-income Americans and provided tax savings for the rich as well.  But somebody had to pay for all that lost revenue, and the burden was shifted largely to corporations.  Although the bill cut the basic corporate tax rate, from 48% to 34%, it took away so many of industry’s cherished credits, deductions, and depletion allowances that corporate taxes increased by some $120 billion over five years.”

It may be difficult to believe now, but at one time the United States was the shining example of how to produce tax policy.

“This stunning and unexpected tax reform, particularly coming out of a politically polarized Washington, D.C., drew attention, and prompted action, around the world.  When little New Zealand transformed its tax code, the other wealthy nations found it interesting; when the mighty United States did the same thing, the rest of the world found it imperative, on political and fiscal grounds, to do the same.  In short order, Britain, Ireland, Canada, the Netherlands, and other democracies dramatically lowered their tax rates by broadening the tax base.  The OECD called this wave of tax reduction a ‘global revolution,’ and the United States lit the spark.”

But politicians do what politicians do, and “money doesn’t talk, it swears,” so over the years even New Zealand’s tax code became cluttered with exceptions and had to be rewritten in 2010.  Our politicians have also been busy.  Corporate taxes produce ever less in relative revenue, and the demand is to cut that revenue even further.  Mechanisms for tax avoidance grow inexorably, forcing the government either to raise rates or to borrow money to keep the ship of state afloat.

“After proudly patting itself on the back because of the 1986 reform, Congress in the next three decades made more than thirty thousand changes to the 1986 code.  Most of them ran counter to the ethos of BBLR.  Virtually all of them made the tax code more complicated—including that bizarre ‘anti-complexity clause,’ Section 7803(c)(2)(B)(ii)(IX).”

It is hard to believe that only three decades ago Congress was capable of acting on such a major bill.  Reid may see his 32-year prophecy come to pass, but it is unlikely that the result will be viewed as a worthwhile accomplishment by the American people—let alone by the rest of the world.


Tuesday, November 14, 2017

The Persistence of Implicit Racial Bias

Although many of the world’s most powerful nations have participated in slavery, the United States was unique in the extent to which it based its political structure and its economy on the institution of slavery.  In doing so, it created a situation it has struggled to deal with for the last 150 years.  The inevitable end of enslavement for millions of African Americans left a white-dominated society needing to incorporate all of these new citizens.

People who enslave others must produce a moral justification for themselves.  If one chooses to make slaves of members of a race, then that race must consist of people who are deserving of subjugation.  If whites are humans, then blacks must be subhuman in some way or another.  It was as simple as that.  And that belief, imprinted over centuries, did not disappear with the end of the institution of slavery.  In fact, it persisted openly throughout much of the twentieth century.  It seems to continue to propagate within our culture even though overt forms of discrimination have become illegal.

Keith Payne addresses the implicit form of racial bias that has continued in the US in his book The Broken Ladder: How Inequality Affects the Way We Think, Live, and Die.  Payne devotes a chapter to the ties between racial bias and inequality.  He concludes that discrimination persists, although it has become more covert.  He also demonstrates that while explicit bias is much less apparent, implicit bias is quite common.  Most troubling, but perhaps most enlightening, he provides examples indicating that racial bias is often subconscious, the result of a lifetime of conditioning.  Payne provides this perspective.

“When the Civil Rights Act of 1964 outlawed overt racial discrimination, and the Voting Rights Act of 1965 ended explicitly discriminatory voting practices, society did not change overnight in response.  Following that 350-year period of perfectly legal subjugation, a mere half century—less than a single lifetime—separates us from whites-only lunch counters, water fountains and schools.  How much have things changed since then?  It depends whom you ask.”

“If you look at polls, the proportion of Americans favoring overtly racist ideas like segregated schools and hiring discrimination has declined from clear majorities in the 1960s to single digits today.  These trends have been regarded as an encouraging sign, but perhaps we have drawn too much encouragement from them.”

There are a number of studies that indicate racial discrimination, while more subtle, is still alive and well.  One of the more famous studies, by sociologist Devah Pager, consisted of sending out equal numbers of young black and white men with résumés crafted to be equivalent.  They were also given equivalent narratives to introduce themselves to prospective employers.

“The white applicant was called back twice as often as the equally qualified black applicant.  Similar studies have been repeated with the same results in New York, Chicago, Atlanta, and other cities.  They have also been replicated in areas other than employment.  Black renters are much more likely than equally qualified white renters to be told there are no vacant apartments.  Black shoppers are offered less favorable deals on cars and higher interest rates on mortgages than equally qualified whites.  Antiblack bias is alive and well in twenty-first-century America.”

People acquire their racial attitudes from family, from friends, and from what they watch and read.  Since de facto segregation is still common, much of what is learned about other races is absorbed from the media.  The sum of all those inputs programs an individual to respond to other races in certain ways.  The examples of bias listed above derive from conscious decisions to discriminate.  What Payne wants us to realize is that racial bias can also arise from subconscious mechanisms, leading to what he refers to as implicit bias: one can be biased without actually realizing it.

Payne describes his own enlightenment as he sought a tool to measure implicit bias in a rather important context.  He was interested in learning the probability that a person would mistakenly assume a harmless object was a gun when it was associated with a black person.  His objects were mostly tools like wrenches and pliers that were chosen to be metal and similar in size to a handgun.  The idea was to quickly flash a picture of a white or black person, followed by an image of an object.  The subjects participating in the measurement were given only a brief instant to decide whether the object was or was not a gun.  Payne tried the program out on himself as he verified that it was working properly.  He was startled by what he discovered.

“When I looked at my data I got about 80 percent correct.  That was not a bad result, but the pattern of my errors was disturbing: I was much more likely to mistake harmless objects for guns when a black face had been flashed initially.”

“Sitting there in my lab, trying to beat my own bias test and failing, I felt for the first time the discomforting gap between my good intentions and my biased behavior, known as implicit bias.”

His explanation for his own behavior, and that of the average person who would take such a test, is that when we are faced with ambiguous data, our subconscious has been programmed to make a choice, and that choice will be the one we subconsciously expect to be the case.

“One of the best-established findings in all of psychology is that people make sense of uncertain or ambiguous circumstances by relying on their expectations.  The less time there is to think carefully, the more they depend on them.”

Payne’s experience certainly is relevant to the rash of police shootings of black people that have occurred in recent years.  His data indicates that not everyone exhibits this bias, but enough do to allow him to assert that the average person will be biased with respect to race.  One might expect that this knowledge can be used to eliminate this tendency, but that is not the case.

“In some versions of the experiment, we even warned the subjects that the race of the face would bias them and urged them to resist that prejudice.  But cautioning didn’t help, and in fact it made the bias even worse, because then the topic of race was more prominent in subjects’ minds.  Good intentions don’t protect us from unintended biases.”

Payne’s results have been duplicated by other researchers in their own laboratories.  This is a significant finding.  One has to wonder how this mental programming was accomplished.  How did this association of black people with guns occur?  Payne perhaps provides us with a clue in a discussion of another bias against blacks.

According to Payne, we have been programmed to think of poor people who are deserving of assistance as whites, and those who are on “welfare” as undeserving blacks.

“Not only does income inequality heighten racial bias, but prejudice can also perpetuate income inequality.  Decades of studies have found a strong correlation between dislike of black people and opposition to social welfare policies aimed at helping the poor.”

“’Welfare’ simply refers to the suite of race-neutral government programs aimed at helping the poor, so these results don’t make much sense on their surface.”

“But it turns out that when Americans talk about ‘the poor,’ they mean something very different from when they talk about ‘welfare recipients.’  The best predictor of wanting to slash funding for welfare recipients is racial prejudice.  People who believe that black Americans are lazy and undeserving are the most likely to oppose welfare spending.”

It seems the traditional media has played a role in establishing biases.  It is scary to consider what social media will contribute to interracial strife.

“While it may not be surprising that the average person views welfare in racially tinged terms, the truth is that welfare recipients are about evenly divided among white, black, and Hispanic recipients.  But when [political scientist Martin] Gilens analyzed depictions of welfare recipients in television and newsmagazines since the 1960s, he found a clear racial bias: When welfare recipients were depicted as the ‘deserving poor,’ they were mostly white, but when they were portrayed as lazy and dishonest, they were overwhelmingly black.”

Insidious cultural messaging from parents and peers might be expected to damp out over a few generations.  Messaging that is deeply embedded in our mass media, on the other hand, could in principle be corrected much more quickly.  Whether or not change occurs depends on people like Payne getting the message out.

Payne leaves us with this final thought on the matter of implicit racial bias.

“Understanding implicit bias requires taking a more nuanced approach to the individuals we are easily tempted to label as ‘racist’ or ‘not racist.’  If you consider whether you yourself are biased, and why, you will likely focus on your conscious thoughts and beliefs, your values and good intentions.  Having reflected on what a fundamentally good person you are, you will conclude that implicit bias is other people’s problem.  Although we would all like to believe ourselves to be members of the ‘not racist’ club, we are all steeped in a culture whose history and present is built on massive racial inequality.  Research has shown that a majority of even well-meaning people—and their children—show signs of implicit bias when tested.”


Wednesday, November 8, 2017

Early States, Capitalism, and the Domestication of Humans

There is a tendency to assume that human evolution has been characterized by an inexorable improvement in humanity’s capabilities and in the societies it develops.  When humans were at the mercy of the elements and could exercise little control over their environments, natural selection favored characteristics that improved survivability in whatever environment existed at the time.  That produces change, but it does not necessarily introduce what one might consider, in retrospect, progress.  As time went on, humans became more adept at influencing their own environments, creating new selection pressures.  Around 10,000 BCE humans began to experience, and to try to manage, a number of changing conditions, driven mostly by increases in population and an ever-changing climate.  A species that had spent most of its existence as hunter-gatherers gradually transitioned into farmers, herders, and craftspeople.  Small groups were replaced by larger communities that would ultimately evolve into states organized around an agricultural economy.  This is often viewed as a period of great progress on the part of humanity.  Humans filled the earth and to a great extent molded it to suit their needs.  However, in changing the earth they also changed the factors operative in natural selection.  By changing their environment, humans also changed themselves.

It was certainly a period of great change, but can it all be viewed as progress? 

James C. Scott is a political scientist at Yale University with an interest in the characteristics of the earliest states.  Impressed by the amount of new information produced by archaeological and anthropological studies, he was moved to present his interpretation of this fresh data in his book Against the Grain: A Deep History of the Earliest States.  His focus is on events in a region roughly equivalent to modern Iraq.  What Scott’s analysis makes clear is that the precursors of the modern state were entities driven by elites whose goals had little to do with any universal benefit to humanity.  Rather, these early political constructs seemed more akin to modern corporations—but ones with horrible human resource policies.

Scott introduces his final chapter with this warning:

“The history of the peasants is written by the townsmen
The history of the nomads is written by the settled
The history of the hunter-gatherers is written by the farmers
The history of the nonstate peoples is written by the court scribes
All may be found in the archives catalogued under ‘Barbarian Histories’”

The immediately accessible record of the distant past is generally self-serving documentation produced by a small element of the population.  In effect, any person who was not controlled by a state was considered a “barbarian.”  The term had nothing to do with the quality of life or the viability of the society in which these nonstate peoples lived.  In fact, Scott claims one could argue that it was only around 1600 CE that the majority of humans transitioned from “barbarism” to state domination.  That last statement carries a scent of cynicism.  That type of attitude is difficult to avoid after reading Scott, who seems to enjoy pointing out the conflicts of interest that exist between state rulers and state subjects.  For example, the most important task of the early states was to prevent the accumulated laborers under state control from escaping and regaining the safer and more comfortable life available under “barbarism.”

Scott provides a brief chronology of the period of interest.

“Homo sapiens appeared as a subspecies about 200,000 years ago and is found outside of Africa and the Levant no more than 60,000 years ago.  The first evidence of cultivated plants and of sedentary communities appears roughly 12,000 years ago.  Until then—that is to say for ninety-five percent of the human experience on earth—we lived in small, mobile, dispersed, relatively egalitarian, hunting-and-gathering bands.  Still more remarkable, for those interested in the state form, is the fact that the first small, stratified, tax-collecting, walled states pop up in the Tigris and Euphrates Valley only around 3,100 BCE, more than four millennia after the first crop domestications and sedentism.  This massive lag is a problem for those theorists who would naturalize the state form and assume that once crops and sedentism, the technological and demographic requirements, respectively, for state formation were established, states/empires would immediately arise as the logical and most efficient units of public order.”

Here is his summary of our conventional wisdom as to our history.

“Historical humankind has been mesmerized by the narrative of progress and civilization as codified by the first great agrarian kingdoms….In its essentials, it was an ‘ascent of man’ story.  Agriculture, it held, replaced the savage, wild, primitive, lawless, and violent world of hunter-gatherers and nomads.  Fixed-field crops, on the other hand, were the origin and the guarantor of the settled life, of formal religion, of society, and of government by laws.  Those who refused to take up agriculture did so out of ignorance or a refusal to adapt.  In virtually all early agricultural settings the superiority of farming was underwritten by an elaborate mythology recounting how a powerful god or goddess entrusted the sacred grain to a chosen people.”

“No one, once shown the techniques of agriculture, would dream of remaining a nomad or forager.  Each step is presumed to represent an epoch-making leap in mankind’s well-being: more leisure, better nutrition, longer life expectancy, and, at long last, a settled life that promoted the household arts and the development of civilization.”

What actually happened was that these barbarians had already developed all the technology needed to implement an agricultural economy based on a few dominant crops and animals, yet they decided against it.  They had very good reasons for not following that path, resisting such a move for over 4,000 years.  The region in which they lived, the Tigris and Euphrates Valley, was at the time rather lush, with many wetland areas providing an abundant and diverse assortment of food sources.  Acquiring one’s daily nutrition was a part-time occupation, and if conditions changed it was relatively simple to move on to a new location.

“Having already domesticated some cereals and legumes, as well as goats and sheep, the people of the Mesopotamian alluvium were already agriculturalists and pastoralists as well as hunter-gatherers.  It’s just that so long as there were abundant stands of wild foods they could gather and annual migrations of waterfowl and gazelles they could hunt, there was no earthly reason why they would risk relying mainly, let alone exclusively, on labor-intensive farming and livestock rearing.”

What drove the development of the agricultural economy based on state control was not the desire to advance civilization but the desire to earn a profit for the few from the labor of many.  Many sources of food were available, but grain was chosen as the main crop because it provided critical industrial and fiscal advantages.  It was important because it could be traded, making it the equivalent of money.  Producing that wealth required a great amount of labor, much more than hunting and gathering, so a degree of coercion was needed to obtain laborers: either environmental conditions had to make hunting and gathering no longer competitive, or physical coercion was required.  What records remain of these early states indicate great concern about maintaining the workforce by preventing escape and by replacing those who escaped through raids on other sites and the enslavement of captives.  So much for advances in civilization.

“The key to the nexus between grains and states lies, I believe, in the fact that only the cereal grains can serve as a basis for taxation: visible, divisible, assessable, storable, transportable, and ‘rationable.’  Other crops—legumes, tubers, and starch plants—have some of these desirable state-adapted qualities, but none has all of these advantages.”

“The fact that cereal grains grow above the ground and ripen at roughly the same time makes the job of any would-be taxman that much easier.  If the army or tax officials arrive at the right time, they can cut, thresh, and confiscate the entire harvest in one operation.”

These early states were to be kingdoms, not democracies.  The people, other than a class of elites, were subjects, not citizens.  Wild animals were domesticated by controlling reproduction to produce desired characteristics.  It would take only a few generations of controlled breeding to produce a more docile species better acclimated to life in the agricultural economy.  Something similar must have also occurred with humans as they were extracted from the more intellectually challenging and nutritionally superior life of the hunter-gatherer and subjected to generations of simple but strenuous labor.

“’Domiciled’ sheep, for example, are generally smaller than their wild ancestors; they bear telltale signs of domesticate life: bone pathologies typical of crowding and a narrow diet with distinctive deficiencies.  The bones of ‘domiciled’ Homo sapiens compared with those of hunter-gatherers are also distinctive: they are smaller; the bones and teeth often bear the signature of nutritional distress, in particular, an iron-deficiency anemia marked above all in women of reproductive age whose diets consist increasingly of grains.”

“Evidence for the relative restriction and impoverishment of early farmers’ diets comes largely from comparisons of skeletal remains of farmers with those of hunter-gatherers living nearby at the same time.  The hunter-gatherers were several inches taller on average.  This presumably reflected their more varied and abundant diet.”

Animal species that have been domesticated all undergo physiological changes and suffer a loss of brain mass relative to their wild counterparts.  It is not clear exactly what that loss can be attributed to, but it seems foolish to assume that humans could not have been similarly affected.  In fact, physical anthropologists tell us that human brain size has been decreasing for the past 20,000 years.  Could it be that civilization places fewer demands on us and allows smaller brains to prove adequate?

“It is no exaggeration to say that hunting and foraging are, in terms of complexity, as different from cereal grain farming as cereal grain farming is, in turn, removed from repetitive work on a modern assembly line.”

The enshrinement of the agricultural economy, and the increase in population density of both humans and other animals, contributed yet another new feature to civilization: the creation of modern infectious diseases.  A virus or microbe that can thrive within an animal host requires a mechanism for transfer to another host and the availability of another host if it is to survive.  There will then be a minimum population size required for a disease agent to propagate and thrive.  That value will differ according to the characteristics of the given agent.  The critical point is that larger, higher-density populations invite new disease agents to move in and take hold.

“The importance of sedentism and the crowding it allowed can hardly be overestimated.  It means that virtually all the infectious diseases due to microorganisms specifically adapted to Homo sapiens came into existence only in the past ten thousand years, many of them perhaps only in the past five thousand.  They were, in the strong sense, a ‘civilizational effect.’  These historically novel diseases—cholera, smallpox, mumps, measles, influenza, chicken pox, and perhaps malaria—arose only as a result of the beginnings of urbanism and, as we shall see, agriculture.  Until very recently they collectively represented the major overall cause of human mortality.”

Agriculture’s specific contribution to the misery of humankind is the close association that ensued between herds of humans and herds of other animals.  This gave infectious agents carried by humans the opportunity to attack other species (anthroponosis) and, more important to us, gave infectious agents carried by animals the opportunity to infect humans (zoonosis).

“Estimates vary, but of the fourteen hundred known human pathogenic organisms, between eight hundred and nine hundred are zoonotic diseases, originating in nonhuman hosts.”

“In an outdated list, now surely even longer, we humans share twenty-six diseases with poultry, thirty-two with rats and mice, thirty-five with horses, forty-two with pigs, forty-six with sheep and goats, fifty with cattle, and sixty-five with our much studied and oldest domesticate, the dog.”

The zoonosis process continues.  HIV and Ebola are more recent zoonotic diseases.  As humans continue to increase in population and wander into unexplored ecosystems where new sources of pathogens exist, some have suggested that the rate of zoonosis is increasing.

Over the millennia, these new diseases had devastating effects on human populations, contributing to the rise and fall of states and severely limiting population growth—at least for a while.  Since survivors of a given disease acquire immunity, populations would eventually stabilize while allowing the disease to lurk in the background and survive—if there was a sufficient flux of new potential hosts.  The disease became endemic within that population, ready to leap out and devastate any group of humans without acquired immunity that it might encounter.  One legacy of the human civilization project is the burden of living under the continuous threat of new and even more dangerous epidemics.

We know, of course, that the hunter-gatherer lifestyle lost out in the long run as formalized, hierarchical states became dominant.  It seems some combination of population pressure and climate change probably forced people to accept the less desirable role of state-controlled laborers.  Can this be considered progress?  Many view this transition as the source of the tale of humans being driven out of the Garden of Eden and forced to live a life of suffering and toil.

Tales of paradise lost are probably extreme, but it is clear that this transition was quite painful and not without unforeseen consequences.  Scott is cautious in hypothesizing about what effect the state-dominated regime, with its economic demands, might have had on humans.  He believes not enough generations have passed for genetic consequences to be readily apparent.  He may be wrong.

Consider that we began with a picture of humans living in small bands where wealth was not accumulated and individuals were more or less equal in status.  The dynamic of a small band demands that members look out for each other and share their individual bounties when appropriate.  Would anyone use those same words to describe current societies?  What the “expulsion from Eden” did was create a hierarchical society where class mattered, and there were always elites who accumulated wealth through their control of society.  We went from living in a world of relative equality to one of rampant inequality.  We went from a society in which cooperation was demanded to one in which competition is required.  We went from a place where wealth was barely even a concept to one in which it is worth breaking all societal rules in order to acquire it.

Living in such an environment for a few hundred generations is plenty of time for natural selection to have carried us off in some different direction.  We are not who we were, and we do not know who we will become.

There are at least two rapidly approaching crises with which humans will have to deal.  One is climate change; the other is the growth of automation and artificial intelligence.  It is likely that both will require a considerable retreat from the inequality and individual competitiveness we have grown accustomed to if we are to survive intact.  The question is: Can we discard the lessons of a hundred generations worth of natural selection in the next one or two generations?

I fear not.

