Wednesday, February 26, 2020

Climate Gentrification, Climate Apartheid


The Earth is warming.  Even climate deniers recognize that; they just refuse to accept responsibility.  Sea levels are rising, and storms are becoming more ferocious and perhaps more frequent.  Living on the seacoast has become more hazardous, and it will only get worse.  One might expect the prudent to retreat in an orderly fashion to higher ground, but what is actually happening is far from prudent.  Gilbert M. Gaul provides a view of what has been occurring along our eastern and gulf coasts over recent decades in his book The Geography of Risk: Epic Storms, Rising Seas, and the Cost of America's Coasts.

Our coasts, except for the cities and towns long established at strategic locations, were relatively undeveloped until the era of postwar affluence, when even the middle class could envisage a small summer home along the ocean for escape and recreation.  This surge of building led to incorporation into municipalities that bred politicians and real estate lobbyists.  The result was pressure to maintain and protect whatever had been built, even if the homes were periodically flooded or destroyed.  As it became clear that the costs of residing on fragile coasts exceeded local and regional means, it was the federal government that assumed responsibility for supporting this infrastructure.

“Over the next few decades, a surge of second homes filled beach towns and communities along the ocean, gulf, and bays.  That wave of development has only accelerated since, backed by an array of federal and state programs that provide inexpensive financing and tax breaks; offer heavily subsidized flood insurance; underwrite roads, bridges, and utilities; and distribute billions more in disaster aid to help beach towns rebuild after hurricanes and floods—setting the stage for a seemingly endless loop of government payouts.  It isn’t an exaggeration to say that without the federal government, the coast as we know it simply wouldn’t exist.”

“The result, this book argues, is one of the most costly and damaging planning failures in American history, with at least $3 trillion worth of property now at risk of flooding and catastrophic storms, and the U.S. Department of Treasury serving as the insurer of last resort.”

The combination of climate change and continuing real estate development has dramatically increased the cost of maintaining this system.

“Hurricanes and coastal storms now account for sixteen of the twenty most expensive disasters in U.S. history, with well over half a trillion dollars in damage in the last two decades alone—far more than the damage caused by earthquakes, wildfires, and tornadoes combined.  Worryingly, the pace of destruction has accelerated, with seventeen of the most destructive hurricanes in history occurring since 2000.  In 2017, Harvey, Irma, and Maria resulted in over $300 billion in losses, the most in a single year.  As I write, Florence is bearing down on an area of North Carolina that has repeatedly been pummeled by storms, yet is rapidly adding houses and people.  No doubt the damage will cost taxpayers billions.  Meanwhile, federally subsidized flood claims at the coast have increased twentyfold in the last two decades and account for three-quarters of all flood losses nationwide, effectively leaving the government’s insurance program insolvent.  Now with rising seas, warmer oceans, and more property than ever before at risk, even more calamitous storms—and government payouts—are inevitable.”

Gaul devotes much of his book to the history of Long Beach Island, which stretches along the New Jersey coast.  Barrier islands such as this are common along our shores.  They are essentially mounds of sand that have accumulated over time.  The sands naturally shift and move under the action of currents and waves, and major storms can cause significant erosion.  Early builders of houses seemed to recognize that their investments were risky.  They built modest structures that could quickly be replaced if destroyed.  Once houses are built in significant numbers, however, pressure builds to protect that accumulated investment.  Local politicians and real estate interests clamor for ever more assistance in rebuilding or in replacing continually eroding beaches with dredged-up sand.  One ends up with a never-ending cycle of rebuilding and maintenance.

What is curious about this process is that it has led to ever greater property values—at least thus far.  The initial, more middle-class homeowners have been replaced by a wealthier class better able to deal with the frequent damage and the paperwork involved in filing federal claims.  The process is akin to the gentrification often identified with urban areas.

“By the mid-1980s, the rapidly rising cost of land at the coast motivated homeowners to build bigger houses to maximize their investments.  Real estate records highlight the trend: in the 1960s, the average size of a Long Beach Island bungalow was about 600 square feet.  The following decade it grew to 1,000 square feet.  In the 1980s, it crept up to about 1,500 square feet and then 2,000 square feet.  After Hurricane Sandy wrecked the island, the average size of a new house swelled to over 3,000 square feet, which is about 500 square feet larger than the typical American house in the suburbs.”

The value of the island’s property would rise from about $100 million in the 1960s to over $15 billion today.  The island’s communities would undergo fundamental changes as the homeowners they fought to protect became infrequent visitors.

“Tellingly, as prices, taxes, and other costs soared, Long Beach Island began emptying out.  In 1980, when Jim Mancini was battling the state over the right to build along the oceanfront, he told regulators the local population would double by the year 2000, to 28,000, as baby boomers reached retirement age and moved to the shore year-round.  In fact, the year-round population of Long Beach Island (and many other shore communities) has plummeted to about 7,000 as families sell or are priced out and wealthy investors gobble up more of the real estate.”

“In winter, Long Beach Island takes on an abandoned, ghostly feeling, with row after row of blackened houses.  There are fewer year-round workers, fewer families, and far fewer schoolchildren.  The island once sponsored four Little League teams.  Now it doesn’t have enough kids for one.  A few years ago, Long Beach Township tore up its field and replaced it with pickleball courts favored by the elderly.”

One of the reasons the costs of coastal storms have begun to skyrocket is that the residents have become wealthier and have built more expensive structures.  Government policies put in place to protect the vulnerable from devastating financial disaster are now being used to maintain the lifestyles and playgrounds of the wealthy.

The barrier islands represent one form of evolution along our coasts.  Because of the wealth concentrated there they receive political and financial consideration from the powers that be.  However, there is another class of places equally threatened by rising seas and climate change.  There the residents are much poorer and often of darker skin.  In a practice that has been referred to as climate apartheid, they are generally left to fend for themselves.

Gaul tells of a tour he took of small towns along the shores of eastern North Carolina with a coastal geologist named Stan Riggs.  Many of these towns sat at an elevation of less than a foot and were encircled by a complex array of waterways whose threat to the population was growing rapidly as the ocean rose.

“The towns were tiny and poor.  The dwindling populations ranged from the hundreds to a few thousand.  Some had been formed by freed slaves who’d worked for decades on nearby farms.  Others were fishing villages now trying to reinvent themselves as ecotourism destinations.  They were filleted by black-water creeks and rivers.  Water abounded in bogs, swamps, agricultural dikes, rivers, lakes, and sounds.  Trees stripped of foliage and color haunted the landscape where saltwater had invaded fresh.  ‘Ghost forests,’ Riggs said.  A wharf that was once three feet above sea level was now two feet below…All were choking on tidal water rising up from the nearby estuaries and rivers.”

“Riggs was angry because no one was paying attention.  Most of the focus of the rising water was on the barrier islands, which was where all the property and money was.  ‘The issue isn’t the Outer Banks,’ Riggs stressed.  ‘They can take care of themselves until the water gets them.  The politics ignores all of these small, poor places on the Inner Banks’.”

Karen Plough, an associate of Riggs, put the town of Columbia in perspective for Gaul.

“…some residents of Columbia didn’t want to hear about sea-level rise.  ‘They enjoy living here for the simple life and values and don’t want to leave,’ Plough said.  ‘Fishermen will tell you that they see it.  But people fifty and younger don’t believe it.  I think part of it is that people are afraid of science.  They don’t understand it, or they don’t trust it’.”

“In North Carolina, it is practically a blood sport to attack scientists.  A few years earlier, Riggs had quit a state panel on sea-level rise when state lawmakers rewrote the panel’s report after being pressured by real estate and development interests.”

“Until then, North Carolina had enjoyed a reputation as a national leader in coastal management, even embracing rules limiting oceanfront development.  But after conservative Republicans took control of the state capital, the politics turned ugly, and they began rolling back rules.  In 2012, they criticized the committee’s findings on sea-level rise as unfriendly to business and rejected the science as unproven estimates…”

Columbia is luckier than many of its neighboring towns.  There is still time to save the town, if the will to do so exists.

“Columbia has some time, Riggs said, but the residents need to act now.  ‘If the sea level rises two feet, Columbia will have to move—period.  If it rises three feet, it’s gone’.”

That conclusion illustrates what Gaul has been trying to demonstrate.  One can hold back the waters for a long time if there are only a few places to protect and one is willing to spend enormous amounts of money to do so.  But there are many places where the people will have no choice but to pick up and leave at some point.  The ideal outcome would be to recreate such a town farther inland at a higher elevation, if possible.  That would take years of planning and considerable resources.  The worst result would follow from ignoring the problem until the town essentially dissolved, dispersing refugees throughout the region.

Retreat will be necessary.  We might as well recognize that fact and plan to do it efficiently.


Tuesday, February 18, 2020

Money: Economists, Banks, and Government


David Graeber is a New-York-born anthropologist with a bent toward economic history.  His views seemed too iconoclastic for staid old Yale University, and a seemingly successful professorship there failed—controversially—to lead to tenure.  He subsequently settled in at the London School of Economics as a professor of anthropology.  There is an advantage in viewing economics from the perspective of anthropology because that discipline actually requires one to look at economic actors as real human beings, not as the automatons constructed by economists to fit economic theories.  Graeber’s most prominent work was his book Debt: The First 5,000 Years.  In it he makes clear his opinion that economists know nothing about human beings and economic history, and understand very little about their own discipline of economics.  Economics: Money, Markets, Debt, and the Barter Myth discusses many of his findings.  Graeber has returned to the topic of economics with a review of the book Money and Government: The Past and Future of Economics by Robert Skidelsky, an economist of whom he approves.  The article, titled Against Economics, appeared in The New York Review of Books.

Graeber begins with this perspective.

“Mainstream economists nowadays might not be particularly good at predicting financial crashes, facilitating general prosperity, or coming up with models for preventing climate change, but when it comes to establishing themselves in positions of intellectual authority, unaffected by such failings, their success is unparalleled. One would have to look at the history of religions to find anything like it.”

“To this day, economics continues to be taught not as a story of arguments—not, like any other social science, as a welter of often warring theoretical perspectives—but rather as something more like physics, the gradual realization of universal, unimpeachable mathematical truths.”

However, a real science arrives at unimpeachable truths by constant testing and validation of those truths.

“This is precisely what economists did not do. Instead, they discovered that, if one encased those models in mathematical formulae completely impenetrable to the noninitiate, it would be possible to create a universe in which those premises could never be refuted. (‘All actors are engaged in the maximization of utility. What is utility? Whatever it is that an actor appears to be maximizing.’) The mathematical equations allowed economists to plausibly claim theirs was the only branch of social theory that had advanced to anything like a predictive science (even if most of their successful predictions were of the behavior of people who had themselves been trained in economic theory).”

The validity of these economic models has not been demonstrated, but it has been sold to the media and to the general public as if it had been.  Beliefs are so firmly entrenched that a complete retelling of economic history is required in order to shake them.  This is what Skidelsky sets out to do.

“Ostensibly an attempt to answer the question of why mainstream economics rendered itself so useless in the years immediately before and after the crisis of 2008, it is really an attempt to retell the history of the economic discipline through a consideration of the two things—money and government—that most economists least like to talk about.”

Astonishingly, economists can’t seem to come to an understanding of that fundamental economic quantity called money.  Pundits and politicians love to proclaim that one worthy project or another cannot be pursued because there is no money available.  In Theresa May’s words, “There is no magic money tree.”

“There are plenty of magic money trees in Britain, as there are in any developed economy. They are called ‘banks.’ Since modern money is simply credit, banks can and do create money literally out of nothing, simply by making loans. Almost all of the money circulating in Britain at the moment is bank-created in this way. Not only is the public largely unaware of this, but a recent survey by the British research group Positive Money discovered that an astounding 85 percent of members of Parliament had no idea where money really came from (most appeared to be under the impression that it was produced by the Royal Mint).”

Most economists insist that the money banks can lend remains tied to their reserve holdings, but not all agree.

“Only a minority—mostly heterodox economists, post-Keynesians, and modern money theorists—uphold what is called the ‘credit creation theory of banking’: that bankers simply wave a magic wand and make the money appear, secure in the confidence that even if they hand a client a credit for $1 million, ultimately the recipient will put it back in the bank again, so that, across the system as a whole, credits and debts will cancel out. Rather than loans being based in deposits, in this view, deposits themselves were the result of loans.”

What this means is that economists can’t say whether the supply of money places a constraint on economic activity, or whether the supply of money is determined by the level of economic activity.  Those “heterodox” economists seem to be gaining the upper hand—at least in principle.

“Before long, the Bank of England (the British equivalent of the Federal Reserve, whose economists are most free to speak their minds since they are not formally part of the government) rolled out an elaborate official report called ‘Money Creation in the Modern Economy,’ replete with videos and animations, making the same point: existing economics textbooks, and particularly the reigning monetarist orthodoxy, are wrong. The heterodox economists are right. Private banks create money. Central banks like the Bank of England create money as well, but monetarists are entirely wrong to insist that their proper function is to control the money supply. In fact, central banks do not in any sense control the money supply; their main function is to set the interest rate—to determine how much private banks can charge for the money they create. Almost all public debate on these subjects is therefore based on false premises. For example, if what the Bank of England was saying were true, government borrowing didn’t divert funds from the private sector; it created entirely new money that had not existed before.”
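
To make the “loans create deposits” point concrete, here is a minimal toy sketch of my own (not drawn from Graeber or the Bank of England report): when a bank books a loan, it simply credits the borrower’s deposit account, so both sides of its balance sheet expand at once and new money comes into existence without any prior deposit being handed over.

```python
# Toy illustration (not a model of any real bank): under the credit
# creation view, a loan is booked by crediting the borrower's deposit
# account, so the loan itself creates a new deposit -- new money.

class ToyBank:
    def __init__(self):
        self.loans = {}     # assets: claims on borrowers
        self.deposits = {}  # liabilities: money owed to depositors

    def make_loan(self, borrower, amount):
        # Both sides of the balance sheet grow simultaneously;
        # no pre-existing deposit is "lent out."
        self.loans[borrower] = self.loans.get(borrower, 0) + amount
        self.deposits[borrower] = self.deposits.get(borrower, 0) + amount

    def broad_money(self):
        # In this toy world, the money supply is just the stock of deposits.
        return sum(self.deposits.values())

bank = ToyBank()
bank.make_loan("client", 1_000_000)
print(bank.broad_money())  # 1000000 -- created by the act of lending
```

The point of the sketch is only the bookkeeping: repaying the loan later destroys the deposit again, which is why, across the system as a whole, credits and debts cancel out.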

What emerges from Skidelsky’s history is an endless sequence of conflicts between supporters of QTM (the quantity theory of money), who conclude that inflation is generally a matter of monetary supply and demand, and those who view the money supply as a result of economic activity: money is created to meet a demand for it and is therefore not directly related to inflation.  In this second view, it is the demand for goods and labor that drives inflation.

“To put it bluntly: QTM is obviously wrong. Doubling the amount of gold in a country will have no effect on the price of cheese if you give all the gold to rich people and they just bury it in their yards, or use it to make gold-plated submarines (this is, incidentally, why quantitative easing, the strategy of buying long-term government bonds to put money into circulation, did not work either). What actually matters is spending.”
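
For readers who want the textbook formula behind this dispute (my addition, not something quoted by Graeber or Skidelsky), the quantity theory rests on the equation of exchange:

```latex
% Equation of exchange: M = money supply, V = velocity of circulation,
% P = price level, Q = real output.
\[
  M V = P Q
\]
% QTM assumes V and Q are roughly stable, so a larger M must show up in P.
% Graeber's buried-gold example is the case where V collapses instead:
% if doubling M halves V, nominal spending (MV), and hence P, is unchanged.
```

Seen this way, the argument is really over whether velocity and output are stable, and whether the money supply is even under the authorities’ control in the first place.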

Nevertheless, the winners of these disputes would repeatedly be the monetarists, who believed inflation could be tamed by contracting the supply of money.

“According to Skidelsky, the pattern was to repeat itself again and again, in 1797, the 1840s, the 1890s, and, ultimately, the late 1970s and early 1980s, with Thatcher and Reagan’s (in each case brief) adoption of monetarism. Always we see the same sequence of events:

(1) The government adopts hard-money policies as a matter of principle.

(2) Disaster ensues.

(3) The government quietly abandons hard-money policies.

(4) The economy recovers.

(5) Hard-money philosophy nonetheless becomes, or is reinforced as, simple universal common sense.”

How is it possible to continue believing in theories that are proved incorrect over and over?  Perhaps it began with Adam Smith.  Graeber provides this perspective from his book on debt.

“Recall here what [Adam] Smith was trying to do when he wrote The Wealth of Nations.  Above all, the book was an attempt to establish the newfound discipline of economics as a science.  This meant not only did economics have its own particular domain of study—what we now call ‘the economy,’ though the idea that there even was something called an ‘economy’ was very new in Smith’s day—but that this economy acted according to laws of much the same sort as Sir Isaac Newton had so recently identified as governing the physical world.  Newton had represented God as a cosmic watchmaker who had created the physical machinery of the universe in such a way that it would operate for the ultimate benefit of humans, and then let it run on its own.  Smith was trying to make a similar, Newtonian argument.  God—or Divine Providence, as he put it—had arranged matters in such a way that our pursuit of self-interest would, nonetheless, given an unfettered market, be guided ‘as if by an invisible hand’ to promote the general welfare.  Smith’s famous invisible hand was, as he says in his Theory of Moral Sentiments, the agent of Divine Providence.  It was literally the hand of God.”

Smith’s corollary to this assumption was that man or government should not mess with what God had created.  Economists would ditch the religious overtones but would embrace the concept of a self-regulating market as dogma and assume that any intervention in market processes could only make matters worse.

“There is a logical flaw to any such theory: there’s no possible way to disprove it. The premise that markets will always right themselves in the end can only be tested if one has a commonly agreed definition of when the ‘end’ is; but for economists, that definition turns out to be ‘however long it takes to reach a point where I can say the economy has returned to equilibrium.’ (In the same way, statements like ‘the barbarians always win in the end’ or ‘truth always prevails’ cannot be proved wrong, since in practice they just mean ‘whenever barbarians win, or truth prevails, I shall declare the story over.’)”

The almost theological belief in markets has consequences for investors.  Quoting from Skidelsky:

“The efficient market hypothesis (EMH), made popular by Eugene Fama…is the application of rational expectations to financial markets. The rational expectations hypothesis (REH) says that agents optimally utilize all available information about the economy and policy instantly to adjust their expectations….”

“Thus, in the words of Fama,…’In an efficient market, competition among the many intelligent participants leads to a situation where…the actual price of a security will be a good estimate of its intrinsic value.’ [Skidelsky’s italics]”

“There is a paradox here. On the one hand, the theory says that there is no point in trying to profit from speculation, because shares are always correctly priced and their movements cannot be predicted. But on the other hand, if investors did not try to profit, the market would not be efficient because there would be no self-correcting mechanism….”

“Secondly, if shares are always correctly priced, bubbles and crises cannot be generated by the market….”

“This attitude leached into policy: ‘government officials, starting with [Federal Reserve Chairman] Alan Greenspan, were unwilling to burst the bubble precisely because they were unwilling to even judge that it was a bubble.’ The EMH made the identification of bubbles impossible because it ruled them out a priori.”

If one believes the amount of money in the system is based on the demand for money, what is to prevent investors from borrowing or diverting ever more money to drive market prices ever higher until valuations become so absurd that some begin to cash out and the market collapses?  “Intrinsic value” becomes meaningless.  Isn’t this a fatal flaw of markets, one that demands some sort of intervention?

Could such a catastrophic event as the Great Recession be capable of shaking some of these economic beliefs?  Apparently not.

“After such a catastrophic embarrassment, orthodox economists fell back on their strong suit—academic politics and institutional power.”

“Breaking through neoclassical economics’ lock on major institutions, and its near-theological hold over the media—not to mention all the subtle ways it has come to define our conceptions of human motivations and the horizons of human possibility—is a daunting prospect. Presumably, some kind of shock would be required. What might it take? Another 2008-style collapse? Some radical political shift in a major world government? A global youth rebellion? However it will come about, books like this [Skidelsky’s]—and quite possibly this book—will play a crucial part.”


Monday, February 10, 2020

Financialization and Boeing: Making Airbus Great Again


Financialization refers to the increasing role that financial institutions such as markets and banks play in our economy.  Early economists began their analyses by considering the “real” economy to consist of making or growing things and selling them.  In order to do this, borrowing and lending must take place.  Finance was always necessary, but it was thought of as a cost of doing business, not something that contributed to economic value.  Over time, and particularly recently, financial activity has become a growing component of the economy.  But does this growth represent a contribution to economic activity, or should it still be thought of as a “cost” to the economy, one that actually extracts wealth from it?

The publicly traded corporation was one of the great creations of capitalism.  Governments created these as legal entities and provided appropriate protections and restrictions on the premise that they would be beneficial to society.  It was the accepted view that these corporations had stakeholders that included society at large, employees, consumers, and those who funded them by buying shares through a stock market or lending them money.  In their endless quest to mess things up, economists sold the notion that shareholders were not only investors but also owners.  Shareholders therefore became the predominant stakeholders, and the goal of a corporation must then be to maximize shareholder value (MSV).  One way to do this is to pay a significant share of profits to shareholders in the form of cash dividends, which are taxable as normal income.  But since most shareholders are not interested in consuming cash dividends, they are better served when the company takes steps to increase the value of their shares on the stock market.  It is at that point that financial maneuvers can become costly to the economy and to the individual corporation.  Mariana Mazzucato considers these issues in her book The Value of Everything: Making and Taking in the Global Economy.

In the 1930s, Keynes observed that it was in the nature of markets and investors for them to focus on short-term performance rather than the long-term prospects of an individual corporation.

“A successful speculator himself, Keynes knew what he was talking about.  He warned that the stock market would become ‘a battle of wits to anticipate the basis of conventional valuation a few months hence, rather than the prospective yield of an investment over a long term of years’.  He would be proved right.”

The result has been that shares are being held for ever shorter periods of time.  Share prices can rise and fall quickly based on the slightest of news.

“Increasing turnover is a sign that institutional investors’ sights are trained on the short-term movement of stock prices rather on the intrinsic, long-term value of the corporation.  High turnover can be more profitable for institutional investors than passive, long-term holding of shares…The result has been a corporate fixation on quarterly performance, which encourages consistent earnings growth to generate acceptable share price performance.”

Tying corporate executives’ compensation to share price behavior, combined with MSV, provides a “perfect storm” of incentives for corporate irresponsibility.  Traditional approaches to generating increased earnings include cutting costs and investing in new production capabilities and new products.  These tend to be longer-term moves that may or may not make sense for a particular company, and they may not be sufficiently promising to satisfy investors.  Companies have discovered that the surest path to investor satisfaction is to use their profits not for long-term investment but for immediate share buybacks.  These purchases drive up the share price without any improvement in company performance.
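
A purely illustrative calculation (my numbers, not Mazzucato’s) shows why the mechanism is so attractive: retire some shares and earnings per share rises even though earnings have not changed at all.

```python
# Illustrative arithmetic with hypothetical numbers: a buyback raises
# earnings per share (EPS) without any change in the company's earnings.

earnings = 1_000_000_000        # annual profit, identical before and after
shares_before = 500_000_000     # shares outstanding before the buyback
buyback_spend = 2_000_000_000   # cash used to repurchase shares (often borrowed)
share_price = 50.0              # assumed price paid per repurchased share

shares_retired = buyback_spend / share_price
shares_after = shares_before - shares_retired

print(f"EPS before buyback: {earnings / shares_before:.2f}")  # 2.00
print(f"EPS after buyback:  {earnings / shares_after:.2f}")   # ~2.17
```

If executive pay is keyed to EPS, or to the share price that tracks it, the buyback rewards management directly, while the cash that might have gone into R&D or new capacity is gone.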

“MSV, then sets off a vicious circle.  Short-term decisions such as share buy-backs reduce long-term investment in real capital goods and innovation such as R&D.  In the long run, this will hold back productivity, the scope for higher wages will be limited, thus lowering domestic demand and the propensity to invest in the economy as a whole.  The spread of financialization deep into corporate decision-making therefore goes well beyond the immediate benefits it brings to shareholders and managers.”

Dan Catchpole provided a perfect example of how the financialization of a corporation can lead to long-term harm in an article for Fortune magazine titled The forces behind Boeing’s long descent.  He began with this lede.

“A shareholder-first culture fueled the 737 Max crisis. Now it may keep the aerospace giant from recovering.”

Catchpole claims the 737 Max episode is the latest result of a change in corporate culture that occurred when Boeing merged with McDonnell Douglas in 1997.  Prior to that time, Boeing focused on producing well-engineered commercial aircraft.  McDonnell Douglas, and its executives, had a different view based on acceptance of MSV as its fundamental strategy.  The result was a Boeing that placed cost-cutting and returns to shareholders before investments in product development.

“In the years prior to the merger, Boeing had largely avoided share repurchases; McDonnell’s board, led by its CEO Harry Stonecipher, had pursued them enthusiastically. Within a year of the merger, buybacks became a cornerstone of Boeing’s strategy. As a Boeing executive and later CEO, Stonecipher also advocated aggressive cost-cutting, pushing the company to deliver an after-tax profit margin of 7%—a mark Boeing had not hit since the 1970s. His successor, Jim McNerney, continued to put profit margins first. ‘When people say I changed the culture of Boeing, that was the intent, so that it’s run like a business rather than a great engineering firm,’ Stonecipher told the Chicago Tribune in 2004. ‘It is a great engineering firm, but people invest in a company because they want to make money’.”

“For all of Boeing’s business coups and innovation, one stark statistic has come to symbolize the company’s priorities: Over the past six years, Boeing spent $43.4 billion on stock buybacks, compared with $15.7 billion on research and development for commercial airplanes. The board even approved an additional $20 billion buyback in December 2018, less than two months after the first 737 Max crash, though it subsequently shelved that plan.”

The first indication that the MSV philosophy was hurting product development came with the disastrous development of the 787 airplane.

“The downsides of cost-cutting soon appeared in Boeing’s 787 Dreamliner program, which began in 2003. Management pushed the company to save money by outsourcing development of critical components to suppliers, many of which proved not up to the task, leading to repeated breakdowns and delays. When the jet finally flew in 2011, it was three years late and $25 billion over budget. In 2013, after the plane was in service, electrical fires in batteries on two 787s prompted regulators to ground the airplane for nearly a month.”

Boeing has a viable competitor in Airbus, which has been gradually approaching parity in new aircraft orders.  The next big market will involve planes that can carry 200-plus passengers over longer distances, and with better fuel efficiency, than the established 737 class of planes.  Such planes would open up the possibility of direct flights between many more cities in the international market.  Boeing delayed the development of such new aircraft, presumably fearing the costs would hurt its profit margins.

“By the middle of the past decade, Boeing was confronting its lack of a new mid-market airplane (known in-house as the NMA). This category of jetliner carries around 250 passengers over distances of 4,000 to 5,000 miles. Mid-market is the only segment of the commercial-jet business expected to see strong demand in the near future, making the category critical to Boeing and rival Airbus. ‘Boeing likely needs two clean-sheet airplanes this decade,’ says Richard Aboulafia of consulting firm Teal Group.”

“In 2016, however, then-CEO Dennis Muilenburg pledged to double Boeing’s profit margins to the mid-teens, a goal that made plane development that much more challenging. Punting on the decision to begin designing an NMA became an annual tradition for Boeing leadership.”

Boeing decided instead to take a half step in that direction, adding new engines and making minor changes to its 737 and calling the result the 737 Max.  Meanwhile, Airbus has been cranking out new planes for that market.

“Boeing’s indecision has given Airbus room to dominate the market. Given its problems with the Max, analysts and consultants agree that the earliest Boeing can start an NMA project is 2021. Even on that timetable, Boeing ‘will have lost market share to the A321XLR and maybe the A330neo,’ two new Airbus models, says Ron Epstein, an aerospace analyst for Bank of America Merrill Lynch. ‘Some of it is fait accompli,’ he adds. ‘The XLR is here—that’s [lost] market share’ for Boeing.”  

Analysts seem to agree that Boeing needs to return to its roots and again become a competent developer of aircraft, but it isn’t clear that the company will take that advice.

“But the fallout from the Max crisis may well push Boeing in the opposite direction. Costs related to the Max have topped $9 billion and could easily double. To shore up its balance sheet Boeing is reportedly considering borrowing money to pay shareholder dividends—and cutting R&D spending.”


Tuesday, February 4, 2020

Do We Still Need Affirmative Action?


What has become known as “affirmative action” is one of the more contentious aspects of our society.  In order to prevent discrimination in the future, one must pass a law to prohibit it.  However, such a law then precludes any means by which a minority harmed by discrimination could receive special treatment to compensate for that prior experience.  Courts, schools, businesses, and politicians have tried to deal with this paradox for generations now.  Louis Menand provides an interesting discussion of this issue in an article for The New Yorker: The Changing Meaning of Affirmative Action (Integration by Parts in the paper version of the magazine).  Menand leans heavily on Melvin I. Urofsky’s book, The Affirmative Action Puzzle: A Living History from Reconstruction to Today, for information and data.

The term “affirmative action” was coined by Hobart Taylor Jr. while he was assisting the Kennedy administration in forming its Committee on Equal Employment Opportunity.

“Taylor needed a flexible phrase because Kennedy’s committee was a bureaucratic entity with a vague mandate meant to signal the Administration’s commitment to fairness in employment. Its purview, like the purview of committees dating back to the Administration of Franklin Roosevelt, was the awarding of federal contracts, and its mandate was to see that companies the federal government did business with did not discriminate on the basis of race. The committee had no real enforcement mechanism, though, so ‘affirmative action’ was intended to communicate to firms that needed to integrate their workforce something like ‘Don’t just stand there. Do something.’ What they were supposed to do, aside from not discriminating, was unspecified.”

“’Do something’ is still one of the meanings of ‘affirmative action’ today.”

The trick is to provide a distribution of societal benefits that more closely matches the demographic distribution without violating the civil rights laws and the Fourteenth Amendment.  The chosen means of accomplishing this is by identifying diversity as a social good.

“Many firms and educational institutions have affirmative-action or diversity officers. Their job is to insure not only that hiring and promotion are handled in a color-blind manner but that good-faith efforts are made to include racial minorities (and sometimes individuals in other categories, such as women or veterans or disabled persons) in the hiring pool, and, if they are qualified, to attempt to recruit them. In this context, ‘affirmative’ means: demonstrate that you did your best to find and promote members of underrepresented groups. You do not have to give them preferential treatment.”

Such an approach is subject to complaints about fairness and presents judges with difficult decisions.  The result is frequent relitigation and judicial rulings that leave many dissatisfied.  College admissions have provided much of the judicial battling.  The Bakke case is a good example of the issues that have arisen and of the relevant legal findings.

“The Supreme Court case that admissions offices rely on today is Regents of the University of California v. Bakke. It was decided in 1978, and, despite several attempts to relitigate it, it is still the law of the land. Bakke is a good example of the jurisprudential confusion around affirmative action: the Court managed to produce six opinions in that case. The plurality opinion, by Lewis Powell, struck down an admissions program at the University of California at Davis School of Medicine, from which Allan Bakke, a white man, had been twice rejected, but it upheld the right of schools to use race-conscious admissions programs.”

UC Davis had used a separate admissions process to select minority applicants outside of the normal process, with a specified number of slots reserved for them.  That reservation was judged to establish a “quota” for minorities and thus to introduce a clear bias.  However, Powell invoked the First Amendment as a means of preserving the option of racial considerations.  Basically, UC Davis should have the right to determine the learning environment provided at the university, and if it decided that a diverse student body produced a better environment, it had the right to proceed accordingly.

“Powell argued, however, that another right was in play: the First Amendment; specifically, the right of academic freedom. There is no constitutional right of academic freedom, but Powell cited a 1957 case, Sweezy v. New Hampshire, in which Felix Frankfurter, in a concurring opinion, quoted South African jurists to the effect that the principle of academic freedom allows a university to determine who will teach its classes and who will sit in its classrooms.”

“Powell concluded that, since Davis could reasonably decide that a diverse class provides a better learning environment, considerations of an applicant’s race—as one factor among others—can fall within the exercise of a constitutionally protected right. (Under the Court’s ruling, Bakke was admitted to Davis and he became a doctor; Urofsky says that he went on to work at the Mayo Clinic, where one of his patients was Lewis Powell.)”

Given that background, there are two questions about affirmative action that need addressing: Did it work (was it fair and successful)?  And is it still needed?  Urofsky provides the answer as to whether it worked.

“There is a whole library on racial inequality and efforts to address it, and “The Affirmative Action Puzzle” does not offer many novelties. But the book, just by the accumulation of sixty years’ worth of evidence, allows us to reach some useful conclusions, and the most important of these is that affirmative action worked. The federal government, with the backing of the courts, weaponized the 1964 Civil Rights Act and its legislative progeny—notably the Education Amendments of 1972, home to the notorious (in the R.B.G. sense) Title IX, banning sex discrimination in federally assisted educational institutions—and forced businesses to hire women and racial minorities.”

“And they did. Study after study suggests that it is just not the case that ‘it would have happened anyway.’ In 1981, for example, as Urofsky tells us, the Reagan Labor Department commissioned a report on gains in hiring among African-Americans and women. It found that between 1974 and 1980 the rate of minority employment in businesses that contracted with the federal government, and were therefore susceptible to being squeezed, rose by twenty per cent, and the rate of employment of women rose by 15.2 per cent. In companies that did not contract with the government, the rates were twelve per cent and 2.2 per cent, respectively.”

“This was so contrary to everything that Reagan had been saying about affirmative action that the Labor Department hired an outside consulting firm to vet its own report. When the firm returned with the news that the methods and the conclusions were valid, the Administration did the only thing it could do. It refused to release the report, thus allowing politicians to go on telling the public that affirmative action didn’t work.”

Menand then produced this surprising declaration.

“…the biggest defenders of affirmative action are not the N.A.A.C.P. and the Democratic National Committee. The biggest defenders are corporations and the military.”

By not having a diverse set of employees to deal with a diverse marketplace, a business can place itself at a disadvantage with respect to minority customers.  A white-male-only hiring policy would also eliminate about two-thirds of the population as potential employees, which makes no sense in terms of talent.  And the nation cannot have a stable, dependable military force heavy in minority enlisted soldiers if it is led by a white-male officer corps.  In 2003, another admissions case involving the University of Michigan reached the Supreme Court.

“The extent of the corporate buy-in was put on dramatic display in 2003, when the Supreme Court heard Grutter v. Bollinger, another admissions case, this one involving the University of Michigan Law School. The Court received sixty-nine amicus briefs (a lot) arguing in favor of Michigan’s affirmative-action admissions program, and among the amici were General Motors, Dow Chemical, and Intel, along with the largest federation of unions in the United States, the A.F.L.-C.I.O. They supported affirmative-action admissions because they wanted universities to produce educated people for a diversified workforce.”

“The Court also received, in Grutter, what became known as ‘the military brief.’ This was an amicus brief signed by big-name generals like Norman Schwarzkopf, Wesley Clark, and John Shalikashvili; by a former Defense Secretary, William Cohen; and by former superintendents of the service academies, all of which, of course, are government agencies. ‘At present,’ they told the Court, ‘the military cannot achieve an officer corps that is both highly qualified and racially diverse unless the service academies and the ROTC use limited race-conscious recruiting and admissions policies.’ They were saying that if the Court ruled against Michigan it would be upending efforts, up to that point highly successful, to maintain a diverse officer corps.”

Affirmative action produced benefits for the recipients, for schools, for businesses, and for the military, but was it unfair in the sense that worthy people were excluded from opportunities they deserved?  It should be recognized that in school admissions, the number of less-qualified whites admitted in place of any litigant arguing against affirmative action was probably much larger than the number of minority applicants in question.

Menand provides this perspective.

“Did white men suffer as a result of affirmative action? That turns out to be a difficult question to answer. ‘There is very little hard evidence to prove that a minority hire almost always took place at the expense of a better-qualified white person,’ Urofsky says. He also tells us that there are ‘no reliable data’ on whether men were shut out of jobs that were offered to women.”

“Urofsky’s view is that, over all, white men did not go without jobs or the chance to attend college. Turned down by one place, they went someplace else. The number who were ‘victimized’ by affirmative action, he says, is ‘minuscule.’ Certainly this is true in the case of college admissions. Most colleges accept almost everyone who applies, so when we talk about race-conscious admissions we are talking about policies that affect a relatively small number of people. Urofsky borrows from Thomas Kane, of the Brookings Institution, an analogy to handicapped parking spaces: a driver looking to park who does not have a permit might feel ‘excluded’ driving past an empty handicapped spot, but he or she usually finds a place to park.”

Finally, is affirmative action still needed?  The process covers all groups that might be discriminated against, but it was put in place in an attempt to ameliorate the centuries of harm done to black people by slavery, Jim Crow, and institutionalized racial bias.  Numerous inequalities persist, indicating that blacks remain a heavily disadvantaged minority.  There is much work yet to be done.

The Grutter admissions case, even with all the support for affirmative action coming from so many directions, was decided by a 5-4 vote.  The Court has become increasingly conservative and has shown a willingness to eliminate long-standing protections on the argument that they are no longer needed, as with the voting rights protections in Shelby v. Holder.  Menand fears that such a future awaits affirmative action.

“The Court’s decision in Shelby v. Holder vacating a central provision of the Voting Rights Act has backfired. It turns out that, when you remove enforcement mechanisms and remedial oversight, things tend to revert to the status quo ante.”

“The whole history of affirmative action shows, as Urofsky somewhat reluctantly admits, that when the programs are shut down minority representation drops. Diversity, however we define it, is politically constructed and politically maintained. It doesn’t just happen. It’s a choice we make as a society.”

