Tuesday, December 31, 2013

Are Plants Intelligent?

Michael Pollan provides a fascinating look into the ways plants are capable of responding to their environments in an article in The New Yorker: The Intelligent Plant.

There seems to be considerable controversy over whether the clever things plants are capable of should be associated with the concept of intelligence. Some members of the scientific community view intelligence as an attribute that requires a brain and a very human-like consciousness. Others apply a broader definition that describes intelligence as the ability to respond effectively to changes in environmental circumstances; this approach gives plants a good shot at being deemed intelligent. One could argue that the latter rendition of the term provides a quite comprehensive description of what humans do as well.

Pollan tells us that two significant differences between plants and animals make it difficult for humans to appreciate the complexity developed by the earth’s flora. The first is the timescale involved in plant responses. Plants may be accomplishing wondrous things, but they do so on a timescale so long that to human observers they usually appear inert.

"It is only human arrogance, and the fact that the lives of plants unfold in what amounts to a much slower dimension of time, that keep us from appreciating their intelligence and consequent success."

Plant capabilities must be evaluated within an entirely different context. Evolutionary strategies for an organism that is firmly tied to a particular location are quite different from those of a mobile organism. An animal can develop an efficient brain and central nervous system because it can use its mobility as a defense mechanism to protect its vulnerable components from injury. An immobile plant has to assume that it will be chewed upon or otherwise injured and that much of its structure will be lost. A localized brain would make little sense in that case; a more distributed form of sensing and logic is more appropriate.

"More likely, in the scientists’ view, intelligence in plants resembles that exhibited in insect colonies, where it is thought to be an emergent property of a great many mindless individuals organized in a network. Much of the research on plant intelligence has been inspired by the new science of networks, distributed computing, and swarm behavior, which has demonstrated some of the ways in which remarkably brainy behavior can emerge in the absence of actual brains."

Those who would argue that the label "intelligent" should be applied to plants have been accumulating some compelling information.

"The ‘sessile life style,’ as plant biologists term it, calls for an extensive and nuanced understanding of one’s immediate environment, since the plant has to find everything it needs, and has to defend itself, while remaining fixed in place. A highly developed sensory apparatus is required to locate food and identify threats. Plants have evolved between fifteen and twenty distinct senses, including analogues of our five: smell and taste (they sense and respond to chemicals in the air or on their bodies); sight (they react differently to various wavelengths of light as well as to shadow); touch (a vine or a root ‘knows’ when it encounters a solid object); and, it has been discovered, sound. In a recent experiment, Heidi Appel, a chemical ecologist at the University of Missouri, found that, when she played a recording of a caterpillar chomping a leaf for a plant that hadn’t been touched, the sound primed the plant’s genetic machinery to produce defense chemicals."

Much of the cleverness of plants is hidden from our view as roots search out the path that best secures nourishment.

"Scientists have since found that the tips of plant roots, in addition to sensing gravity, moisture, light, pressure, and hardness, can also sense volume, nitrogen, phosphorus, salt, various toxins, microbes, and chemical signals from neighboring plants. Roots about to encounter an impenetrable obstacle or a toxic substance change course before they make contact with it. Roots can tell whether nearby roots are self or other and, if other, kin or stranger."

One of the lesser-known attributes of plants is their ability to communicate and even to engage in collective behavior. The mechanism for communication is usually via the emission of an assortment of chemicals.

"One of the most productive areas of plant research in recent years has been plant signalling. Since the early nineteen-eighties, it has been known that when a plant’s leaves are infected or chewed by insects they emit volatile chemicals that signal other leaves to mount a defense. Sometimes this warning signal contains information about the identity of the insect, gleaned from the taste of its saliva. Depending on the plant and the attacker, the defense might involve altering the leaf’s flavor or texture, or producing toxins or other compounds that render the plant’s flesh less digestible to herbivores."

"Perhaps the cleverest instance of plant signalling involves two insect species, the first in the role of pest and the second as its exterminator. Several species, including corn and lima beans, emit a chemical distress call when attacked by caterpillars. Parasitic wasps some distance away lock in on that scent, follow it to the afflicted plant, and proceed to slowly destroy the caterpillars."

Pollan cites the work of the forest ecologist Suzanne Simard and her colleagues on how trees in a forest organize themselves:

"….trees in a forest organize themselves into far-flung networks, using the underground web of mycorrhizal fungi which connects their roots to exchange information and even goods. This "wood-wide web," as the title of one paper put it, allows scores of trees in a forest to convey warnings of insect attacks, and also to deliver carbon, nitrogen, and water to trees in need."

"The pattern of nutrient traffic showed how ‘mother trees’ were using the network to nourish shaded seedlings, including their offspring—which the trees can apparently recognize as kin—until they’re tall enough to reach the light. And, in a striking example of interspecies cooperation, Simard found that fir trees were using the fungal web to trade nutrients with paper-bark birch trees over the course of the season. The evergreen species will tide over the deciduous one when it has sugars to spare, and then call in the debt later in the season."

Pollan describes several experiments designed to elicit behavior implying that plants can perform tasks normally assumed to require intelligence. An experiment with pole-climbing beans is illustrative.

A bean plant was observed with time-lapse photography as it grew a few feet away from a pole mounted on a dolly. Pollan, like most of us, had always assumed that bean plants grow in arbitrary directions until they bump into something to cling to. Not so. The video indicated that the plant "figured out" where the pole was and wasted no effort growing in other directions.

"The bean plant wastes no time or energy ‘looking’—that is, growing—anywhere but in the direction of the pole. And it is striving (there is no other word for it) to get there: reaching, stretching, throwing itself over and over like a fly rod, extending itself a few more inches with every cast, as it attempts to wrap its curling tip around the pole."

Pollan finds that time-lapse photography brings plant activity into a timescale that is evocative of conscious intent.

"….a dimension of time in which these formerly inert beings come astonishingly to life, seemingly conscious individuals with intentions."

Whether one is a scientist or a layperson, the willingness to label plant behaviors "intelligent" is a matter of personal choice. However, one has to be amazed at how complex these seemingly simple organisms are turning out to be. Undoubtedly there is more yet to be revealed as research continues.

Humans tend to see themselves as the unassailable peak of evolution—as rulers of the earth. Pollan puts us in perspective:

"Plants dominate every terrestrial environment, composing ninety-nine per cent of the biomass on earth. By comparison, humans and all the other animals are, in the words of one plant neurobiologist, ‘just traces’."

Wednesday, December 18, 2013

Costco vs. Walmart

A few months ago an article by Rick Ungar appeared in Forbes: Walmart Pays Workers Poorly And Sinks While Costco Pays Workers Well And Sails - Proof That You Get What You Pay For. To someone who loves to shop at Costco and refuses to enter a Walmart, this is a must-read.

Ungar points out that Costco is thriving while Walmart has been suffering over the past few years. He suggests that Costco may be outperforming Walmart because it pays its employees a much, much higher wage than is available at Walmart, and because better-paid employees are more productive than poorly paid, demoralized ones.

"Costco’s most recent quarterly earnings report reveals a fairly healthy eight percent rate of growth in year-on-year sales—including a five percent rise in same store sales. What’s more, with membership fees rising from $459 million in the same quarter last year to $528 million this year, it’s pretty clear that a significant number of customers are moving over to the retailer to do their discount shopping."

"Meanwhile, Costco’s primary competitor, Walmart, saw an anemic 1.2 percent rise in sales, while other competitors such as J.C. Penny and Target experienced even greater disasters in their sales results."

"In an identical economy, how do we explain Costco’s growth vis-à-vis the failures over at Walmart?"

"Here’s a crazy thought—might it have something to do with the fact that Costco pays nearly all of its employees a decent living (well in excess of the minimum wage) while Wal-Mart continues to pay its workers as if their employees don’t actually need to eat more than once a week, live in an enclosed space and, on occasion, take their kids to see a doctor?"


Perhaps happy workers create a better environment for shoppers than unhappy workers.

"Walmart service now pretty much sucks—and customers don’t like it."

"Without enough employees to get the basic work of a retail operation done—and with those on site being paid a wage so low that it is difficult to expect much in the way of pride or motivation—Wal-Mart merchandise remains stacked on pallets in the warehouse rather than making it to the floor where customers can find the products they want. At the same time, check-out lines are painfully long and annoying as the overall shopping experience continues to deteriorate."

Brad Stone provided a profile of Costco CEO Craig Jelinek in Bloomberg Businessweek: Costco CEO Craig Jelinek Leads the Cheapest, Happiest Company in the World. Stone quotes Jelinek on Costco’s philosophy with respect to employees:

"I just think people need to make a living wage with health benefits," says Jelinek. "It also puts more money back into the economy and creates a healthier country. It’s really that simple."

"In February, Jelinek set Costco’s convictions in ink, writing a public letter at the behest of [Ralph] Nader, urging Congress to increase the federal minimum wage for the first time since 2009. ‘We know it’s a lot more profitable in the long term to minimize employee turnover and maximize employee productivity, commitment and loyalty’…."

Stone provides this background on wages and benefits.

"Despite the sagging economy and challenges to the industry, Costco pays its hourly workers an average of $20.89 an hour, not including overtime (vs. the minimum wage of $7.25 an hour). By comparison, Walmart said its average wage for full-time employees in the U.S. is $12.67 an hour, according to a letter it sent in April to activist Ralph Nader. Eighty-eight percent of Costco employees have company-sponsored health insurance; Walmart says that "more than half" of its do. Costco workers with coverage pay premiums that amount to less than 10 percent of the overall cost of their plans. It treats its employees well in the belief that a happier work environment will result in a more profitable company."

When the economy stumbled, other retailers cut back on hours and jobs. Costco responded by giving its workers a raise.

"As the economic downturn worsened in the fall of 2009, Costco, like every other retailer, started seeing declines in same-store sales. Macy’s, Best Buy, Home Depot, and Office Depot, were resorting to layoffs and wage cuts, but Sinegal [former CEO] approved a $1.50-an-hour wage increase for hourly employees, spread out over three years."
Stone provides a profile of one Costco employee to illustrate how the company’s policies affect its workers.

"Joe Carcello has a great job. The 59-year-old has an annual salary of $52,700, gets five weeks of vacation a year, and is looking forward to retiring on the sizable nest egg in his 401(k), which his employer augments with matching funds. After 26 years at his company, he’s not worried about layoffs. In 2009, as the recession deepened, his bosses handed out raises. "I’m just grateful to come here to work every day," he says.

Nelson Lichtenstein provides insight into the workforce Walmart strives for and what working conditions are like in his book The Retail Revolution: How Wal-Mart Created a Brave New World of Business.

"Wal-Mart itself admitted, or rather boasted, that only 7 percent of the company’s hourly associates try to support a family with children on a single Wal-Mart income....To staff its stores, the company purposely seeks school-age youth, retirees, people who want a second job, and those willing or forced to work part-time.."

A very small fraction of workers make it into a management track. For the rest, it is to Walmart’s advantage that they leave as soon as possible. Experience confers no advantage in Walmart jobs; they are constructed so that any healthy individual can perform the tasks. Walmart focuses on the types of people described above because it wants them to work for a while and then move on. There will always be someone else to hire at entry-level wages with no benefits.

Walmart’s treatment of its employees has straddled a fine line between mean-spiritedness and criminality. Any number of lawsuits have demonstrated that Walmart encourages its managers to cheat employees by modifying time cards or by instructing them to punch out and then continue working. This is referred to as working "off-the-clock." Workers foolish enough to complain can easily be rewarded with shortened hours or enough variable shifts to induce them to quit.

"Although both K-Mart and Target proscribe overtime pay and squeeze their managers and workers to keep labor costs in line, neither has been the subject of a class-action ‘off-the-clock’ suit. Wal-Mart, on the other hand, was defending itself against seventy-five such suits as of December 2008 when the company announced a settlement, perhaps costing as much as $640 million, in sixty-three of those wage and hour lawsuits covering hundreds of thousands of workers in some forty-two states."

Walmart does not have to treat its employees this way. It does what it does because that is the way it believes workers should be treated. A study of Walmart’s wage structure by the Center for Labor Research and Education at the University of California at Berkeley concluded that Walmart could impose a $12 an hour minimum wage for its employees and cover the increased wages with a 1.1% increase in prices. Would mighty Walmart come tumbling down if prices went up 1.1%? Would anyone even notice if a 98 cent object suddenly cost 99 cents?
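
For those who like to check the arithmetic, here is a minimal sketch in Python, using only the 1.1% figure from the Berkeley study and the 98-cent example cited above:

```python
# Back-of-the-envelope check of the pass-through cited above:
# a 1.1% across-the-board price increase covering a $12/hour wage floor.
item_price = 0.98          # the 98-cent item from the text
price_increase = 0.011     # 1.1%, per the Berkeley study as cited

new_price = item_price * (1 + price_increase)
print(f"98-cent item after a 1.1% increase: ${new_price:.4f}")
# -> about $0.9908, i.e. roughly a penny on the posted price
```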

Economists like to argue that lower prices are always beneficial. Walmart arrives at low prices by creating a workforce that, in order to survive, requires taxpayer support in the form of Medicaid, food stamps, the Earned Income Tax Credit, and free school lunches. That is a strange way to run an economy. It is a strange way to allow anyone to run a public company. Costco has demonstrated that such draconian employment practices are not necessary, and are probably counterproductive.

As the economy slowly improves, Costco is outpacing it while Walmart is underperforming.

Perhaps there is some justice in the world—and some hope for the future.

Tuesday, December 17, 2013

Urban Power vs. State Power: A Living Wage

We observe the daily dramas that emerge from our capital and are dismayed because nothing ever seems to get accomplished. A recent article by R. Shep Melnick in The Wilson Quarterly titled The Gridlock Illusion suggests that while we should be concerned by the politics in Washington, we may be missing a lot of progress being made at other levels of government.

"The stalemate/gridlock argument is misleading not only because it ignores so many accomplishments, but also because it focuses so intently on just one small part of domestic policy, namely passage of major pieces of legislation at the national level. Lost in this picture are the daily decisions of administrators, judges, and state and local officials, as well as members of Congress engaged in the quotidian business of passing appropriations, reauthorizations, and budget reconciliation bills. Taken individually, these decisions might seem like small potatoes, but collectively they can produce significant policy change."

Melnick provides a few examples of what he means.

"How did affirmative action—highly unpopular with the American public—become embedded in so many federal programs? Slowly, subtly, and at times surreptitiously, a long series of court decisions, agency rules, and complex legislative provisions injected the presumption of proportional representation into federal civil rights programs. How did the federal government come to set national standards for state mental institutions, schools for the developmentally disabled, nursing homes, and prisons? Largely through litigation and consent decrees negotiated by the Department of Justice."

"To take another example, how did Congress manage to pass controversial legislation guaranteeing every disabled student a ‘free appropriate public education,’ complete with an ‘individualized education plan,’ provision of ‘related services,’ and a promise that each student would be placed in the ‘least restrictive environment’? The answer is that the courts acted first, suggesting (rather obliquely) that students with disabilities might have a constitutional right to an adequate education. This forced state governments to spend much more on special education, which led them to demand that the federal government provide the money needed to comply with this federal mandate, which led Congress to provide both more money and more federal regulation, which led to more litigation and more federal requirements, which led to state demands for even more money, and so on. This is a vivid illustration of how separation of powers and federalism can produce not gridlock, but a game of institutional leapfrog that results in a steady expansion of government programs."

The lesson Melnick seems to be imparting is that if you wish to accomplish something, you need not start at the top. What happens at the top often only occurs when actions taken at lower levels of government force the higher levels to respond.

The Republicans have certainly understood this concept, as they have focused on using control of state legislatures to implement laws and regulations that never could have passed at the national level. Unfortunately, they have used this opportunity to restrict voting rights, attack unions, and erode workers’ rights in general.

Gordon Lafer provides a summary of these actions in an article provided by the Economic Policy Institute: The Legislative Attack on American Wages and Labor Standards, 2011–2012.

"Over the past two years, state legislators across the country have launched an unprecedented series of initiatives aimed at lowering labor standards, weakening unions, and eroding workplace protections for both union and non-union workers. This policy agenda undercuts the ability of low- and middle-wage workers, both union and non-union, to earn a decent wage."

In addition to the well-publicized assaults on public-sector unions and the passage of right-to-work laws, a number of other actions have been taken that limit wage and labor standards.

"Four states passed laws restricting the minimum wage, four lifted restrictions on child labor, and 16 imposed new limits on benefits for the unemployed."

"States also passed laws stripping workers of overtime rights, repealing or restricting rights to sick leave, undermining workplace safety protections, and making it harder to sue one’s employer for race or sex discrimination."

"Legislation has been pursued making it harder for employees to recover unpaid wages (i.e., wage theft) and banning local cities and counties from establishing minimum wages or rights to sick leave."

"For the 93 percent of private-sector employees who have no union contract, laws on matters such as wages and sick time define employment standards and rights on the job. Thus, this agenda to undermine wages and working conditions is aimed primarily at non-union, private-sector employees."

Is there a progressive counter to this Republican aggression? Clearly, state legislatures controlled by Democrats can be more active in pursuing a progressive agenda. However, an article in Bloomberg Businessweek by Mark Niquette suggests another possible approach. Niquette points out that the dozen largest cities have all elected Democrats as mayors. Even in deep-red Texas, Austin, Dallas, Houston, and San Antonio are all led by Democrats.

Over the years urban areas have become more strongly Democratic, while rural areas have moved in the opposite direction. Are cities and their surrounding areas taking advantage of the political cohesion they possess—and their large number of voters—to further some agenda that could then be parlayed into a state or national movement? There is little evidence of coherent initiatives. Local programs exist, but there seems to be no master plan.

What might such an agenda include? One obvious component would be a living wage requirement. A living wage is a minimum wage pegged to the income required to meet a given standard of living at the regional cost of living. Universal application of a living wage may be the only feasible means of combating the ever-increasing income inequality associated with free-market capitalism.
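
As a rough illustration of how a regionally pegged wage would differ from a single national figure, here is a minimal sketch; the dollar amounts are invented placeholders, not data from any of the articles cited:

```python
# Hypothetical illustration of a living wage pegged to regional living costs.
# The dollar figures below are invented placeholders, not data from the article.
def living_wage(annual_cost_of_living, hours_per_week=40, weeks_per_year=52):
    """Hourly wage needed for full-time work to cover a region's basic annual costs."""
    return annual_cost_of_living / (hours_per_week * weeks_per_year)

regions = {"high-cost city": 31_000, "low-cost rural area": 19_000}
for name, annual_cost in regions.items():
    print(f"{name}: ${living_wage(annual_cost):.2f}/hour")
# A single national minimum would overshoot in one region and fall short in the other.
```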

Much has been said about the demonstrations by fast-food workers demanding a $15 wage in order to make enough to live on. The usual conclusion is that the minimum wage should be raised to that level. The current national minimum wage is too low and should be raised, but it does not make sense to raise it to a single level when the cost of living varies so much across the country. It might require $15 to survive in New York City, but does that mean $15 is appropriate for a fast-food worker at a truck stop in Wyoming? The living wage approach is more flexible and fairer.

Harold Meyerson has written a long and reflective article for The American Prospect: If Labor Dies, What’s Next? The labor movement and the union movement have long been identical, but that need not be the case. Meyerson suggests that a viable alternative to unionization might be nothing more than aggressive local politics. In the same way that Republicans have campaigned to bend state legislatures to their will, progressives could play the same game at the local level.

As an example he describes the Los Angeles Alliance for a New Economy (LAANE). This organization created programmatic platforms, recruited candidates, and waged successful campaigns for local political offices. The goal was to use the leverage provided by government assistance to businesses to force those businesses to be more supportive of workers and of the community in general.

"LAANE’s first major victory was persuading the city council to require cleaning companies with which it had contracts to pay their workers a living wage—a sum several dollars higher than California’s minimum wage....Over the years, the scope of such ordinances was broadened to encompass card-check unionization at hotels and sports arenas that received redevelopment funds; local hiring requirements for developers of major projects; and clean air standards for trucks at the Port of Los Angeles. Some of these ordinances have served as models for living-wage and other laws in more than 140 other cities."

The little town of SeaTac in Washington State recently passed a $15 minimum wage that applies to certain classes of workers. 

A little local activism can make a big difference. It would be encouraging if this initiative were part of a coherent national plan, but most likely it isn’t.

Friday, December 13, 2013

1945: Violence and Vengeance

Humans are clearly capable of violence. Murder and mayhem have been common from the beginning of recorded history. It has been of recurring interest to contemplate the source of our murderous tendencies. Are we reluctantly violent when we believe it necessary, or are we violent because we are predisposed towards violence?

It is tempting to look back at what we know of our history and draw conclusions. Given nearly nonstop wars and atrocities it is easy to draw a bleak picture of human nature. Those trying to extrapolate back further in our evolution have often resorted to analogies with animal behavior where conflict between individuals and between groups of individuals over territories and females provides a convenient segue into violent human traditions. Can the frequent ethnic conflicts among humans be explained by a million or so years of evolution that favored violent response to encroachment on territory or resources by other humans?

Some argue that the animal analogy is overdrawn. It is suggested that a more fruitful approach to understanding prehistory is to explore the behavior of peoples whose current or recent societies closely match the type of existence we are presumed to have experienced for many thousands of generations as hunter-gatherers. Observers of hunter-gatherer societies frequently report behavior towards other bands that might best be described as "sociable." One observes elaborate gift-giving rituals designed to form bonds between different families and different groups. Since groups faced good times as well as bad in terms of necessary resources, evolution could just as easily have favored sharing between groups as a survival mechanism. Violence in this case would be rare and would require some sort of provocation.

World War II, with associated fatalities estimated as high as 85 million, would seem to support the former hypothesis. Most of the casualties were civilian, and much fighting and killing was orthogonal to the major war campaigns, taking place between ethnic groups with a history of enmity.

In 1945, the war was winding down. Previously occupied countries were being liberated, groups that had been effectively in a state of civil war since before the war were free to resume killing each other, and the Russian army was furiously racing westward bent on revenge. The violence and killing would continue for some time, only coming to an end when the Allied forces separated the various ethnic groups and herded them back within national boundaries: Germans would return to Germany; the Poles finally made Poland Polish; Soviet citizens were forcibly sent home; and Jews gradually, but inexorably, tried to make their way to Palestine to form their own homeland. Is this evidence of some primitive drive to repel "the other"—anyone who is not one of us—that was exposed by the breakdown of the veneer of civilization?

Ian Buruma, in Year Zero: A History of 1945, provides some insight into what was taking place in that year when the official fighting ended but the violence continued. Buruma does not delve into theories of genetically wired behavior. He finds plenty of familiar social and cultural means by which violence could be encouraged.

Hate and the desire for revenge are familiar emotions to us all. But what happens when you are given permission, or are even ordered, to kill the offending individuals? That is essentially what happened in the fighting in Eastern Europe and Russia.

The German master plan was to quickly defeat Russia, control the food supply, and let as many people as possible starve to death. Some of those who survived would be kept as slave labor, and the rest, including all the Jews of Europe, would be transported to Siberia, where they would presumably starve or freeze and die. German soldiers were instructed before the invasion to kill anyone in a governmental position or anyone who might be a threat to the German mission. Captured soldiers were to be killed by starvation. Given those marching orders, and the behavior that followed, is it any wonder that the Soviet forces were in a nasty mood as they began pushing the Germans back towards their homeland?

"The figures are hard to imagine. More than 8 million Soviet soldiers died, of whom 3.3 million were deliberately starved to death, left to rot in open air camps, in midsummer heat or wintry frost."

"Murder and starvation went together with constant degradation and humiliation. Russians, like other Slavs, were less than fully human in Nazi German eyes. Untermenschen, whose only role would be to work as slaves for their German masters. And those unfit to work as slaves did not deserve to be fed. Indeed, Nazi Germany had a policy, called the Hunger Plan, of starving Soviet peoples to provide Germans with more living space….and food. If fully carried out, this monstrous economic plan would have killed tens of millions."

Soviet leaders made it quite clear that soldiers were expected to have their revenge on the Germans.

"….once the Germans were forced to retreat from the Soviet Union, the Red Army troops were explicitly told to do their worst as soon as they entered German lands. Road signs on the border said in Russian: ‘Soldier, you are in Germany: take revenge on the Hitlerites.’ The words of propagandists, such as Ilya Ehrenburg, were drummed daily into their heads: ‘If you have not killed at least one German a day, you have wasted that day….If you kill one German, kill another—there is nothing funnier to us than a pile of German corpses.’ Marshal Georgy Zhukov stated in his orders of January 1945: ‘Woe to the land of the murderers. We will get our terrible revenge for everything’."

There is a long tradition of victorious soldiers humiliating the defeated by raping their women while the men watch helplessly. Soviet propagandists primed their troops by describing Nazi women as being just as bad as the men, and as having grown wealthy and fat off of stolen Soviet treasures. To these soldiers, the Germans they encountered did appear incredibly wealthy.

"What shocked the soldiers, some of whom had barely seen functioning electricity, let alone such luxury items as wristwatches, was the relative opulence of German civilian life, even in the miserable conditions of bombed cities and wartime shortages. Greed, ethnic rage, class envy, political propaganda, fresh memories of German atrocities, all this served to quicken the thirst for vengeance."

"Raping German women, especially those who appeared to have unlimited wealth, and especially in front of the emasculated ex-warriors of the ‘master race,’ made the despised Untermenschen feel like men again."

Masculine humiliation was a potent force in generating violence even for those who had physically suffered little in the course of the war. Buruma cites the conditions in France after liberation as an example.

"During the so-called wild purge….in France, which took place in 1944, before the war was even over, about six thousand people were killed as German collaborators and traitors by various armed bands with links to the resistance, often communists. Double that number of women were paraded around, stripped naked, their heads shaven, swastikas daubed on various parts of their anatomy. They were jeered at, spat on, and otherwise tormented. Some were locked up in improvised jails, and raped by their jailers. More than two thousand women were killed. Similar scenes, though not nearly on the same scale, took place in Belgium, the Netherlands, Norway, and other countries liberated from German occupation."

"….popular wrath fell disproportionately, and most publicly, on women accused of ‘horizontal collaboration.’….The submission of France by superior German force was often described in sexual terms. The rampant German army, representing a powerful, virile nation, had forced weak, decadent, effeminate France to submit to its will. Horizontal collaboration….was the most painful symbol of this submission. And so it was the women who had to be punished with maximum disgrace."

Buruma then adds another factor that can contribute to violence: guilt.

"The most enthusiastic prosecutors of filles de Boches were usually not people who had distinguished themselves in acts of courage during the war. Once liberation came to formerly occupied countries, all kinds of men managed to present themselves as members of resistance groups, strutting around with newly acquired arm bands and Sten guns, disporting themselves as heroes as they hunted for traitors and bad women. Vengeance is one way of covering up a guilty conscience for not standing up when it was dangerous. This too appears to be a universal phenomenon, of all times."

The most troubling tale provided by Buruma involves the Poles and the Jews. In 1945, many of the Jews who had survived the war and tried to return to their homes in Poland were set upon by the Poles and had to flee—incredibly—to Germany for safety. Greed, guilt, and official connivance can explain much, but some of what transpired in Poland can only be explained by long-standing ethnic hatred.

There were a number of instances where Poles attacked Jews who had just the day before been living peaceably as neighbors. Some were quite willing to do the work of the Nazis.

"In July 1941, the Jews in Radzilaw were locked up in a barn and burned alive while their fellow citizens ran around filling their bags with loot. An eyewitness remembers: ‘When the Poles started rounding up and chasing Jews, the plundering of Jewish houses began instantly….They went mad, they were breaking into houses….they’d just load up their sacks, run home and come back with an empty sack again."

Greed was certainly a powerful motive. Buruma is careful to point out that many Poles acted differently and helped their Jewish neighbors survive the war. However, that was not an act they could safely take pride in.

"People who had protected Jews from being murdered were well advised not to talk about it. Not only because of God’s wrath for helping ‘the killers of Christ,’ but because of the suspected loot. Since Jews were assumed to have money, and their saviors were expected to have been richly compensated, anyone who had admitted to have hidden Jews was vulnerable to plunder."

When the few Jews who were left tried to return to their homes, they discovered that everything they possessed had been confiscated by someone who resented their reappearance.

"A contemporary Polish weekly paper, Odrodzenie, put it succinctly in September 1945: ‘We knew in the country an entire social stratum—the newborn Polish bourgeoisie—which took the place of murdered Jews, often literally, and perhaps because it smelled blood on its hands, it hated Jews more strongly than ever."

Buruma’s recounting of the horrors that occurred in that tumultuous year is, in fact, comforting. There is nothing in his description to suggest that we are genetically wired in such a way that we are inherently susceptible to violent behavior. The murder and bloodshed that occurred on such a massive scale can be explained by rather common factors: following orders, greed, guilt, masculine pride, and culturally created prejudices.

But, let us hope that the constraints of civilization remain in place forevermore.

Monday, December 9, 2013

SeaTac Approves a $15 Minimum Wage: Is Seattle Next?

The subject of the minimum wage has been much in the news of late. Fast-food workers have been organizing demonstrations to demand a $15 minimum wage in order to be able to support themselves. Walmart has been under pressure for some time because of its low wage policies. President Obama asked earlier this year for a rise in the national minimum wage from the current $7.25 to $9, and has recently been talking about a goal of $10.

Obama has also been talking seriously about the threat income inequality poses for our nation. It has become quite clear over recent decades that the inevitable evolution of an unconstrained free-market economy will lead to ever-growing income inequality as more earnings accrue to the owners of capital and less to laborers. This divergence has led to stagnant or falling real wages for most workers in the nation.

The one possible solution that involves little political upheaval and directly addresses the issue at hand is to raise the floor on wages to counter the decline that is occurring. Much has been claimed about what effects might follow from a significant increase. Clearly, some people would lose their jobs, but, just as certainly, other jobs would be created by the increased spending that would take place. Economists are coming around to the view that such increases in the minimum wage are likely to have a small effect on the total number of jobs.

Politically, some would view this as a redistribution of income from the wealthy to the poor and oppose it for that reason. It is probably more accurate to view such a move as a redistribution of profit between various industries. Some would see costs go up and profits decrease, but others would benefit from the increased demand created by the higher wages.

An increase in the minimum wage might have a greater effect on income inequality than most suspect. Employees tend to view their compensation with respect to the premium they are paid over the minimum wage. If fast-food workers were to attain a $15 wage, do you think the auto companies could pay starting auto workers $14 an hour? Raising the minimum would likely send ripples well up into the higher wage brackets.

A large boost in the minimum wage is an experiment waiting to happen. And now it looks like someone has taken that first step.

Steve Coll has an interesting note in The New Yorker where he discusses the approval of a $15 minimum wage for some workers in the city of SeaTac. If that name sounds familiar it is because that town is where the Seattle-Tacoma airport resides.

"Sea-Tac, the airport serving the Seattle-Tacoma area, lies within SeaTac, a city flecked by poverty. Its population of twenty-seven thousand includes Latino, Somali, and South Asian immigrants. Earlier this year, residents, aided by outside labor organizers, put forward a ballot initiative, Proposition 1, to raise the local minimum wage for some airport and hotel workers, including baggage handlers. The reformers did not aim incrementally: they proposed fifteen dollars an hour, which would be the highest minimum wage in the country, by almost fifty per cent."

Needless to say, this bold move aroused considerable local activity.

"Business groups and labor activists spent almost two million dollars on television ads, mailings, and door knocking—about three hundred dollars per eventual voter."

The election was held on November 5, and the proposition passed by 77 votes out of a total of about 6,000. Coll suggests that a recount might possibly overturn the results. However, an editorial from the Seattle Times dated December 8, 2013 contains this comment:

"BACKERS of a $15 minimum wage who are celebrating victory in SeaTac now aim to seize the day in Seattle. They do have the political momentum. What they don’t have is a sense of responsibility or of any information on the actual effects of the law they favor."

There is no mention of a recount threat, but there is a comment about the possibility that Seattle voters might be interested in a similar move.

"The Seattle City Council has approved $100,000 for a study of the issue. The study should find real information on jobs gained and lost, consumer spending and business investment in SeaTac before it reaches a conclusion. Let’s find out what happens in that small city before making a decision for a population 23 times larger."

SeaTac is an ideal place to begin a minimum-wage movement. Because it contains the regional airport, it captures a number of employers with very deep pockets, including major airlines and hotel chains. These businesses would have a difficult time claiming that they cannot afford to pay more to a handful of baggage handlers or maids. However, SeaTac is a flawed economic demonstration project: the approved proposition does not cover all workers. It would take a place as big and diverse as Seattle to perform the appropriate experiment.

The idea of a significant rise in the minimum wage is not politically impossible. As Coll points out:

"Twenty-one states and more than a hundred counties and cities have enacted laws that set minimums above the federal one. Before SeaTac’s vote, an Indian reservation in California had the highest local minimum in the country, of ten dollars and sixty cents. San Francisco’s is just a nickel less. But political support for higher wages extends well beyond Left Coast enclaves. According to a Gallup poll taken earlier this year, a majority of Republicans favor a minimum wage of nine dollars. That reflects a truth beyond ideology: life on fifteen thousand a year is barely plausible anymore, even in the low-cost rural areas of the Deep South and the Midwest."

All great movements start with a single step. Stay tuned!

Friday, December 6, 2013

Hospital Admission Is the Third Largest Cause of Death!

While writing an essay on Medicare spending, we encountered this quote from an article entitled The Hospital Is No Place for the Elderly:

"Hospitals are fine for people who need acute treatments like heart surgery. But they are very often a terrible place for the frail elderly. ‘Hospitals are hugely dangerous and inappropriately used,’ says George Taler, a professor of geriatric medicine at Georgetown University and the director of long-term care at MedStar Washington Hospital Center. ‘They are a great place to be if you have no choice but to risk your life to get better’."

The statement seemed a bit harsh at the time—most people would not consider a hospital stay as dangerous in itself—but it was re-quoted as is. The very next day an article by Tina Rosenberg was published in the New York Times: To Make Hospitals Less Deadly, a Dose of Data. She presented results from a recent study indicating that adverse events encountered by patients in hospitals were far more common than previous estimates had suggested.

"Until very recently, health care experts believed that preventable hospital error caused some 98,000 deaths a year in the United States — a figure based on 1984 data. But a new report from the Journal of Patient Safety using updated data holds such error responsible for many more deaths — probably around some 440,000 per year. That’s one-sixth of all deaths nationally, making preventable hospital error the third leading cause of death in the United States. And 10 to 20 times that many people suffer nonlethal but serious harm as a result of hospital mistakes."

These numbers are astonishing! This government source (2010) lists annual deaths from heart disease at 597,689 and cancer at 574,743. The next category, chronic lower respiratory diseases, comes in at 138,080. Accidental death is fifth on the list at 120,859. One begins to wonder why we worry so much about deaths from drunk driving when the carnage is so much greater in our hospitals than on our highways.

Another way to view these numbers is to compare them to deaths from airplane crashes. The rate works out to about 1,205 deaths per day, the equivalent of roughly four large commercial aircraft crashing every day. Can you imagine what would happen if four aircraft crashed in a single day? The entire industry would be shut down indefinitely.
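
The arithmetic behind that comparison is simple enough to check; here is a minimal sketch using the 440,000 estimate quoted above and an assumed round figure of 300 passengers per aircraft:

```python
# Arithmetic behind the aircraft-crash comparison.
annual_deaths = 440_000               # estimate quoted above from the Journal of Patient Safety report
deaths_per_day = annual_deaths / 365  # ~1,205 per day

passengers_per_jet = 300              # assumed round figure for a large commercial aircraft
jets_per_day = deaths_per_day / passengers_per_jet

print(f"{deaths_per_day:.0f} deaths/day, equivalent to {jets_per_day:.1f} full jets lost every day")
# -> about 1205 deaths per day, roughly 4 crashes per day
```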

The report Rosenberg referred to can be found here. She chose to pursue the issue of lax data keeping that makes accumulating such numbers so difficult and so uncertain. Here we will be more interested in extracting information from that report, and learning more about the credibility of the reported estimates.

The report provides this background information:

"Based on 1984 data developed from reviews of medical records of patients treated in New York hospitals, the Institute of Medicine estimated that up to 98,000 Americans die each year from medical errors. The basis of this estimate is nearly 3 decades old; herein, an updated estimate is developed from modern studies published from 2008 to 2011."

The report uses the term "preventable adverse event" (PAE) to describe any harmful effect on a patient due to caregiver error. The observable effects from such mistreatment are not necessarily immediately recognizable. Some may take years to appear. The report lists some illustrative examples that also serve to indicate how difficult it is to arrive at an accurate estimate of the totality of PAEs.

"The harmful outcomes may be realized immediately, delayed for days or months, or even delayed many years. An example of immediate harm is excess bleeding because of an overdose of an anticoagulant drug such as that which occurred to the twins born to Dennis Quaid and his wife. An example of harm that is not apparent for weeks or months is infection with Hepatitis C virus as a result of contaminated chemotherapy equipment. Harm that occurs years later is exemplified by a nearly lethal pneumococcal infection in a patient that had had a splenectomy many years ago, yet was never vaccinated against this infection risk as guidelines and prompts require."

The report was based on four preexisting studies of limited patient populations. In each case a tool referred to as the Global Trigger Tool (GTT) was used to screen the patients’ medical records for suspicious events that might be PAEs. Physicians then reviewed the flagged records to decide whether they should be counted as PAEs.
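
The two-stage process can be pictured as a simple screening pipeline. The sketch below is purely hypothetical; the trigger terms, record format, and function names are invented for illustration and are not the actual GTT criteria:

```python
# Hypothetical sketch of a two-stage trigger-tool review.
# The trigger terms and record format are invented for illustration;
# they are not the actual Global Trigger Tool criteria.
TRIGGERS = {"return to surgery", "unplanned transfusion", "readmission within 30 days"}

def flag_suspicious(records):
    """Stage 1: an automated screen flags any record containing a trigger."""
    return [r for r in records if TRIGGERS & r["events"]]

def physician_review(flagged, judged_preventable):
    """Stage 2: clinicians decide which flagged records reflect true PAEs."""
    return [r for r in flagged if judged_preventable(r)]

records = [
    {"id": 1, "events": {"routine discharge"}},
    {"id": 2, "events": {"unplanned transfusion", "return to surgery"}},
]
flagged = flag_suspicious(records)
confirmed = physician_review(flagged, judged_preventable=lambda r: True)  # placeholder clinical judgment
print(f"{len(flagged)} record(s) flagged, {len(confirmed)} confirmed as PAEs")
```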

One study of Medicare patients is illustrative.

"Investigators looked at the medical records of 780 randomly selected patients chosen to represent the 1 million Medicare patients "discharged" from hospitals in the month of October 2008. The total number of hospital stays for the 780 patients during this period was 838 because some of the beneficiaries were hospitalized and discharged more than once during the 1-month index period. Using primarily the GTT developed by the Institute for Healthcare Improvement to find adverse events, investigators found 128 serious adverse events….that caused harm to patients, and an adverse event contributed to the deaths of 12 of those patients. Seven of these deaths were medication related, 2 were from blood stream infections, 2 were from aspiration, and the 12th one was linked to ventilator-associated pneumonia….The authors of this report estimated that "events" contributed to the deaths of 1.5 % (12/780) of the 1 million Medicare patients hospitalized in October 2008. That amounts to 15,000 per month or 180,000 per year."

Based on this single study, preventable deaths of Medicare patients alone far exceed earlier estimates for the entire population. And remember that the number of preventable events that caused harm was 10 times greater. That means that Medicare patients suffered from poor care at the rate of 1.8 million times per year. Truly, hospitals are dangerous places for old people.
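
The study’s extrapolation can be reproduced from the figures quoted above; the sketch below uses only those numbers, along with the factor-of-ten ratio between harmful and lethal events mentioned in the text:

```python
# Reproducing the extrapolation, using only the figures quoted above.
sample_patients = 780
deaths_in_sample = 12                      # deaths to which adverse events contributed
monthly_medicare_discharges = 1_000_000    # October 2008, per the quoted study

death_rate = deaths_in_sample / sample_patients                    # ~1.5%
deaths_per_year = death_rate * monthly_medicare_discharges * 12    # ~180,000 with the study's rounding

harmful_events_per_year = 10 * deaths_per_year   # the text's factor of ten -> ~1.8 million
print(f"death rate {death_rate:.1%}; ~{deaths_per_year:,.0f} deaths/yr; "
      f"~{harmful_events_per_year:,.0f} harmful events/yr")
```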

The four studies used to arrive at a total population estimate were based on different types of patients from different regions; the Medicare data, for example, could not be applied to the entire nation. Based on the results of the four studies, the report arrived at this preliminary conclusion:

"Based on our extrapolation from the 4 modern studies, there are at least 210,000 lethal PAEs detectable by the GTT approach to record reviews."

However, there is plenty of evidence that medical records are not to be trusted. Doctors and hospitals not only get to bury their mistakes, they get to bury the evidence as well. The report provides these examples of sources of underreporting:

"In a study that broke past the wall of silence about discovery of medical errors that were missing from medical records, Weissman and colleagues found that 6 to 12 months after their discharge, patients could recall 3 times as many serious, preventable adverse events as were reflected in their medical records. This study involved review of 998 medical records of patients hospitalized in Massachusetts for medical or surgical treatment from April to October 2003."

"….some medical errors are not known by clinicians and only come to light during autopsies, which have found misdiagnoses in 20% to 40% of cases. ‘Aggressive’ searches for adverse drug events and prompted self-reports from clinicians have shown a much higher rate of adverse drug events than are evident in the medical records. A comparison of direct observation for medication errors with review of documentation in medical records in 36 hospitals and skilled-nursing facilities found that far more errors were found by direct observation than by inspection of medical records."

"A recent national survey showed that physicians often refuse to report a serious adverse event to anyone in authority. In the case of cardiologists, the highest nonreporting group of the specialties studied, nearly two-thirds of the respondents admitted that they had recently refused to report at least one serious medical error, of which they had first-hand knowledge, to anyone in authority. It is reasonable to suspect that clear evidence of such unreported medical errors often did not find their way into the medical records of the patients who were harmed."

The method of surveying medical records also neglects to count errors of omission.

"Premature deaths as a result of medical errors may occur many years after the hospital stay because the patient’s care was not optimal or did not follow guidelines. Furthermore, lethal PAEs can been [sic] missed by the GTT and by physician reviews. The GTT does not detect diagnostic errors or errors of omission, especially those involving failure to follow guidelines. Lethal diagnostic errors have been estimated to affect 40,000 to 80,000 people per year including outpatients."

It has been estimated that it takes about 10 years before an accepted medical finding makes its way into general usage by physicians. This delay can be lethal.

"Physicians have been indefensibly slow to adopt guidelines that would potentially prevent premature deaths or harm. One egregious example is the estimated 100,000 heart failure patients that died prematurely each year in the late 1990s because they did not receive beta-blockers. The efficacy of beta-blockers was established by a study published in the JAMA in 1982."

For all of these reasons, the estimate of PAEs based on the four studies had to be considered a lower limit. The final estimate essentially doubled the number of deaths and harmful events.

"Using a weighted average of the 4 studies, a lower limit of 210,000 deaths per year was associated with preventable harm in hospitals. Given limitations in the search capability of the Global Trigger Tool and the incompleteness of medical records on which the Tool depends, the true number of premature deaths associated with preventable harm to patients was estimated at more than 400,000 per year. Serious harm seems to be 10- to 20-fold more common than lethal harm."

Even this higher estimate might be conservative and could itself be considered a lower limit.

How could things be so bad? Who is responsible?

The aircraft crash analogy used earlier was first made by Captain Sullenberger, the pilot who became famous a few years ago when he successfully executed an emergency landing on the Hudson River. The American Hospital Association once invited Captain Sullenberger to give a talk on how lessons learned in enhancing aircraft safety could be used to improve the quality of hospital care. He gave them an earful. First he provided this background on aviation practices.

"Thirty plus years ago, before CRM (cockpit resource management), captains could be alternately Gods or cowboys, ruling their cockpits by preference or whim with insufficient consideration of best practices or procedural standardization…And first officers trying to do the right thing would never quite know what to expect. Some captains did not bother with check lists (and it was unclear who was responsible for what)."

"We worked to build a culture of safety that allows us to face an unanticipated dire emergency, suddenly, one for which we had never specifically trained, and saved every life on board…"

Then he pointed out that most medical errors are the result of system failures. Just as in the aviation industry, safety requires the development of a culture in which all participants are viewed as members of a team. He implied a comparison between the cowboy pilots of yesteryear and the doctors of today who choose not to bother with checklists, and he went on to decry the current medical culture in which anything that goes wrong is assumed to be the result of an individual’s error.

"Conventional wisdom often has it that if a nurse makes a mistake, he or she should be terminated, but the vast majority of harmful events are due to system failures not practitioner error. The (health care) leaders are responsible for the maintenance of these support systems, not the caregivers. And the current punitive culture only drives problems underground where they can never be examined or solved."

In Sullenberger’s opinion, it is the hospital administrators who are at fault for not imposing systematic safeguards that could prevent most mistakes from occurring.

He continued on—optimistically.

"Federal legislation is about to link health care payments to the quality of service. I believe in ten years, when this is integrated throughout the system, we’ll look back at where we are today and will know that we were flying blind."

Meanwhile, stay healthy! Your life may depend on it.

Thursday, December 5, 2013

Obamacare, Medicare, and Falling Healthcare Costs

The Council of Economic Advisers (CEA) gathered together some data on costs and trends in healthcare and produced a short paper extolling the progress made in slowing the growth in healthcare expenses. The purpose of the document was to make sure everyone was aware of the progress being made, and to ensure that credit was apportioned to Obamacare.

This chart was provided to illustrate how per capita costs have increased since 2000. It includes "real" costs, which, presumably, means costs over and above inflation. The data is broken down into three time intervals: before the recession, during the recession, and after the recession.

The pre-recession numbers are intended to convey the long-term trend prior to the recession and the drop that followed during the recession. There is considerable disagreement among those who have studied the issue as to how much of the decline in the healthcare spending growth rate can be attributed to the slow economy and to people being short of cash to spend on their health. The interval 2010-2013 is included to illustrate how expenditures continued to grow at an even slower rate in the period when Obamacare was being anticipated and implemented. The message should be clear: Obamacare is having a beneficial effect.

To ensure the reader is convinced that structural changes are at work rather than mere economic hardship, this plot was provided:

Roughly 19%, about one in five, of Medicare patients who are admitted to a hospital must be readmitted for health reasons within 30 days of release. This rather scary statistic is attributed to lax release procedures on the part of doctors and hospitals. Sick people, especially old sick people, do better when they are properly prepared to take care of themselves after going home. Better outcomes are observed when there is follow-up by the caregivers to verify that the patient is taking prescribed medications and doing whatever else he or she has been advised to do. This type of follow-up adds to the cost of dealing with patients, and the high rate of readmissions is interpreted as evidence that caregivers have little incentive to spend extra money in a way that would, ultimately, hurt their profit margins. Readmission has been a fiscal positive for them.

Obamacare contained financial inducements to reduce this readmission rate. That the rate has, in fact, fallen during Obamacare implementation signifies that structural changes are taking place in healthcare delivery, not just economic effects.

The slowing in the growth of healthcare spending has caused the Congressional Budget Office (CBO) to lower its projected costs for Medicare and Medicaid in each of the last three years.

[Chart omitted: successive CBO projections of Medicare and Medicaid spending]

If one considers the outlook for the year 2020, which is not that far off, the current lower projections indicate a savings of about 0.8% of GDP, which translates into roughly $135 billion. That is $135 billion less government expenditure for the year 2020 alone.
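As a quick back-of-envelope check of that conversion, here is a minimal sketch. The 0.8% figure comes from the CBO comparison above, but the GDP level it is applied to is my assumption (roughly $17 trillion, about the level when this was written), since the post does not say which year's GDP is being used.

# Back-of-envelope check: 0.8% of GDP expressed in dollars.
# Assumption (not stated in the post): GDP of roughly $17 trillion.
gdp_billions = 17_000        # assumed GDP, in billions of dollars
savings_share = 0.008        # 0.8% of GDP, from the CBO comparison above
savings_billions = gdp_billions * savings_share
print(f"Implied savings: about ${savings_billions:.0f} billion")   # prints ~136

That comes out near the $135 billion quoted above.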

The cost of healthcare exerts a tremendous amount of leverage on debt projections; that is why it is so important that the current trends persist. To endure, these savings must be based on fundamental changes in how healthcare is delivered, not on transient economic effects.

Jonathan Rauch provides insight into how Medicare patients have been treated by healthcare providers and how greater savings can be attained. He has produced an article in The Atlantic: The Hospital is No Place for the Elderly.

As people live longer, more of them experience a lengthy period in which multiple physical functions degrade.

"Thanks to modern treatment, people commonly live into their 70s and 80s and even 90s, many of them with multiple chronic ailments. A single person might be diagnosed with, say, heart failure, arthritis, edema, obesity, diabetes, hearing or vision loss, dementia, and more. These people aren’t on death’s doorstep, but neither will they recover. Physically (and sometimes cognitively), they are frail."

"Seniors with five or more chronic conditions account for less than a fourth of Medicare’s beneficiaries but more than two-thirds of its spending—and they are the fastest-growing segment of the Medicare population."

The aged and unhealthy need frequent access to healthcare professionals, but not necessarily the intense and expensive care involved in emergency room visits.

"Right now, when something goes wrong, the standard response is to call 911 or go to the emergency room. That leads to a revolving door of hospitalizations, each of them alarmingly expensive. More than a quarter of Medicare’s budget is spent on people in their last year of life, and much of that spending is attributable to hospitalization."

Hospitals are dangerous for all people, but particularly for the elderly.

"Hospitals are fine for people who need acute treatments like heart surgery. But they are very often a terrible place for the frail elderly. ‘Hospitals are hugely dangerous and inappropriately used,’ says George Taler, a professor of geriatric medicine at Georgetown University and the director of long-term care at MedStar Washington Hospital Center. ‘They are a great place to be if you have no choice but to risk your life to get better’."

If asked, very few seniors would say they expect to spend the last days of their lives in pain, with bodily functions maintained by machines, and surrounded by strangers; but that is how many end up.

"For many, the worst place of all is the intensive-care unit, that alien planet where, according to a recent study in the Journal of the American Medical Association, 29 percent of Medicare beneficiaries wind up in their last month of life. ‘The focus appears to be on providing curative care in the acute hospital,’ an accompanying editorial said, ‘regardless of the likelihood of benefit or preferences of patients’."

If hospitals are overly expensive and inefficient at dealing with the elderly, what is the alternative?

Atul Gawande wrote an article for The New Yorker discussing the way end-of-life patients should be treated: Letting Go. He referred to two studies that are particularly relevant to medical care for the aged sick. They involved the effect of hospice care on medical outcomes.

Hospice care is concerned with trying to maintain quality of life at the current moment. That means worrying about pain and discomfort and trying to keep the patient alert and active as long as possible. Its goal is to bring the patient peace while nature takes its course. Under hospice care the patient has much more opportunity to interact with a team of personnel who closely monitor his/her condition.

Gawande refers to a study performed by Aetna. Normally hospice care is provided only after patients forgo all other treatments and when life expectancy falls below six months. Aetna decided to let a group of patients receive hospice care and their normal treatments concurrently. The results would be compared with those of a control group receiving only regular treatment. Both groups had a life expectancy of less than a year. The results were stunning. Patients benefited greatly from having the hospice service available. Within this group, emergency room visits declined by 50%, and the use of hospitals and ICUs dropped by two-thirds. Overall costs dropped by almost a quarter. The conclusion was that the more personal care received from the hospice workers improved the patients’ health and well-being.

Gawande describes another study.

"Like many people, I had believed that hospice care hastens death, because patients forgo hospital treatments and are allowed high-dose narcotics to combat pain. But studies suggest otherwise. In one, researchers followed 4,493 Medicare patients with either terminal cancer or congestive heart failure. They found no difference in survival time between hospice and non-hospice patients with breast cancer, prostate cancer, and colon cancer. Curiously, hospice care seemed to extend survival for some patients; those with pancreatic cancer gained an average of three weeks, those with lung cancer gained six weeks, and those with congestive heart failure gained three months. The lesson seems almost Zen: you live longer only when you stop trying to live longer."


The lesson to be derived from these studies is that there is a more efficient and effective way to deal with very sick people. It involves considerably more personal interaction and different skill sets than those normally provided in the doctor’s-office-visit-to-hospital-stay loop. And this approach need not apply only to the elderly.

Having a hospice-like team manage the healthcare of an extremely sick person is advantageous for the individual patient, and it is less expensive for society as a whole, but it has not been a profit winner for hospitals. They lose money via this approach because there are fewer hospital days and fewer emergency room visits. Something has to change before this approach can scale up from small pilot programs.

The point of Rauch’s article is to demonstrate that change has, in fact, come to the healthcare industry.

"Under the new health-care law, Medicare has begun using its financial clout to penalize hospitals that frequently readmit patients. Suddenly, hospitals are not so eager to see Grandma return for the third, fourth, or fifth time. Obamacare has also earmarked money specifically to test new care models, including home-based primary care."

Rauch describes an organization called Hospice of the Valley that operates in Phoenix, Arizona.

"In the early 2000s, Hospice of the Valley began experimenting with an in-home program designed to bridge the frailty gap—that is, the gap between hospital and hospice. That experiment led to the development of a team-based approach in which nurses, nurse-practitioners, social workers, and sometimes physicians visit clients’ homes, provide and coordinate care, and observe people outside the context of the medical system. ‘That face time is what makes the program work,’ David Butler, Hospice of the Valley’s executive medical director, told me. Butler says that for the 900 people it serves, the program decreases hospitalizations by more than 40 percent, and ER visits by 25 to 30 percent."

This effort was run mostly on funds from charitable contributions and research grants. That is no longer the case.

"Insurance companies and other providers have begun asking Hospice of the Valley to contract with them to pick up their caseloads of high-cost, chronically ill patients. At the beginning of this year, the program was earning enough in reimbursements to cover one out of seven patients; today the rate is more like one in three. That is still not enough, but when a few more big contracts come through, Butler says, perhaps in a year or 18 months, enough of the patient base will be covered to tip the program into the black."

Sutter Health, a large network of hospitals and doctors in Northern California, has been experimenting with a program it refers to as Advanced Illness Management (AIM) that is similar in philosophy and execution to that of Hospice of the Valley.

"….each patient is assigned to a team of nurses, social workers, physical and occupational therapists, and others. The group works under the direction of a primary-care physician, and meets weekly to discuss patient and family problems—anything from a stroke or depression to an unexplained turn for the worse or an unsafe home."

"Thanks to a $13 million Medicare innovation grant, for example, Sutter is rolling out Advanced Illness Management to its entire health network, to test whether the program can be scaled up."

Rauch indicates that Sutter management is determined to make AIM its standard approach to dealing with the very ill.

"At Sutter, AIM has acquired institutional and financial momentum of its own; executives there say they expect to expand their annual patient load from about 2,000 today to between 5,000 and 7,000, which would make it the Sutter network’s standard method of care for the frail."

Perhaps the biggest change of all brought about by Obamacare is the realization within the healthcare community that the old business model is no longer viable. Whereas the previous focus was on generating revenue, attention must now be paid to the cost and quality of care. Rauch provides this comment:

"Jeff Burnich, a vice president at Sutter, told me that the business case for AIM is only getting stronger. ‘Most health providers, if not all of us, lose money on Medicare, so how we make up for that is, we cost-shift to the commercial payers,’ he said. But the space for cost-shifting is shrinking. ‘The way you bend the cost curve now is by focusing on where there’s waste and inefficiency, and that’s the end of life in the Medicare population.’ He expects to see a wave of hospitals fail in coming years if they don’t provide better value. ‘The music has stopped,’ he said, ‘and there are five people standing, and one chair’."

There is a long way to go in cutting healthcare costs. The $135 billion in savings implied by the CBO’s lowered projections for 2020 is the tip of the iceberg. There is much more to be gained.

Sunday, December 1, 2013

Medicating Flu Victims: The Media and Medical Research

Oseltamivir is the chemical name for a drug in the class of neuraminidase inhibitors (more commonly referred to as antiviral drugs). It is best known under the commercial name Tamiflu. Tamiflu has an interesting history that was detailed by Helen Epstein in an article in the New York Review of Books: Flu Warning: Beware the Drug Companies. She traces the path this medication has taken, from an initial evaluation by the FDA’s medical committee that judged it not worthy of approval for the treatment of flu sufferers, to its current position, in which billions of dollars’ worth of the drug are deemed necessary stockpiles against any coming flu pandemic. It seems a drug company that controls the research data and has vast resources at its disposal can be quite effective at influencing people in positions of power.

Tamiflu’s use for the treatment of flu is controversial, especially where children are involved. A summary is provided by Wikipedia.

"BMJ [British Medical Journal] editor Dr. Fiona Godlee, said ‘claims that oseltamivir reduces complications have been a key justification for promoting the drug's widespread use. Governments around the world have spent billions of pounds on a drug that the scientific community has found itself unable to judge’."

And why is Tamiflu so difficult to evaluate? In addition to the expected problems associated with evaluating the complex medical responses of humans, unbiased researchers have been hindered by the manufacturer’s refusal to make available all the data it has compiled.

"A subsequent Cochrane review, in 2012, maintains that significant parts of the clinical trials still remains unavailable for public scrutiny, and that the available evidence is not sufficient to conclude that oseltamivir decreases hospitalizations from influenza-like illnesses. As of October 2012, 60% of Roche's clinical data concerning oseltamivir remains unpublished."

It should also be noted that Japan, as the earliest and heaviest user of Tamiflu, decided in 2007 to recommend that the medication not be prescribed for young people aged 10 to 19 because of dangerous side effects.

"In March 2007, Japan's Health Ministry warned that oseltamivir [Tamiflu] should not be given to those aged 10 to 19."

Given this background, a recent article in the New York Times by Catherine Saint Louis was read with interest: Lifesaving Flu Drugs Fall in Use in Children. Note that the original title of the article when it appeared was Antiviral Drugs, Found to Curb Flu Deaths in Children, Fall in Use. It seems that a Times editor believed it necessary to pump up the title in order to attract more readers. The media’s role as a conveyor of public knowledge and a molder of public opinion will be returned to shortly.

Saint Louis refers to a study that appeared recently in the journal Pediatrics: Neuraminidase Inhibitors for Critically Ill Children With Influenza. She relays this opinion on the significance of the report:

"’Antivirals matter and they decrease mortality, and the sooner you give them the more effectively they do that,’ said Dr. Peggy Weintrub, the chief of pediatric infectious diseases at the University of California, San Francisco, who was not involved in the research. ‘We didn’t have nice proof on a large scale until this study’."

But can this really be called a "large study," and can it really be considered as "nice" proof?

The study retrospectively examined the cases of 784 children under the age of eighteen who had been admitted to intensive care units in California with severe flu symptoms (2009-2012). Of these, 653 were administered neuraminidase inhibitors such as Tamiflu, while 113 were not. Six percent of those given the drug subsequently died; eight percent of those not given the drug died. Is this proof that an additional two percent of the 113 patients not administered the drug would have survived had they been properly medicated? That seems to be the point Saint Louis is making.

Anyone familiar with the statistics of small numbers will be wary of drawing conclusions from nine events (eight percent of 113). If one applies the most straightforward statistical analysis to the data presented in the study, one finds that, with 95 percent confidence, the probability of death for those administered the drugs lies between 4.2 and 7.8 percent, and between 3.3 and 12.7 percent for those who did not receive the drugs. All that can be said is that the outcomes of the two groups of patients fall within overlapping statistical ranges. Never trust any account that does not attempt to assess the uncertainty in the results presented.
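For readers who want to reproduce that calculation, here is a minimal sketch in Python using the simplest normal-approximation (Wald) confidence interval for a binomial proportion. The method behind the numbers quoted above is not stated, so this sketch reproduces the treated-group range exactly and the untreated-group range only approximately (about 3 to 13 percent versus the 3.3 to 12.7 percent quoted).

# Approximate 95% confidence intervals for the two mortality rates,
# using a normal (Wald) interval for a binomial proportion.
import math

def wald_ci(deaths, n, z=1.96):
    """Return (low, high) of an approximate 95% CI for the proportion deaths / n."""
    p = deaths / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Treated group: ~6% of 653 died (about 39 deaths);
# untreated group: ~8% of 113 died (about 9 deaths).
for label, deaths, n in [("treated", 39, 653), ("untreated", 9, 113)]:
    low, high = wald_ci(deaths, n)
    print(f"{label}: {deaths}/{n} deaths, 95% CI {low:.1%} to {high:.1%}")

However the interval is computed, the two ranges overlap heavily, which is the point: the data alone cannot show that the drugs reduced mortality.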

The authors of the study on which Saint Louis based her article could not and did not make definite claims from their data, and they did provide estimates of the uncertainties. They offered this rather more equivocal summary of their results:

"Prompt treatment with NAIs [neuraminidase inhibitors] may improve survival of children critically ill with influenza."

What has happened here is that a research result presented as merely "suggestive" in a research journal became definitive as interpreted by the media. A cynic might suspect that a drug-industry public-relations firm was lurking in the background, providing Saint Louis with a "summary" of the research findings and suggesting people she should contact who would be willing to provide publishable quotes, all in service of an "eye-catching" title. An even deeper cynicism would lead one to suspect the validity of the entire medical study. Saint Louis provides this quote from one of the study’s authors, Janice K. Louie:

"One of the goals of the study was to increase awareness and remind clinicians that antiviral use is important in this population."

This seems to be an admission that the goal of the study was not to determine if drugs like Tamiflu were effective, but to prove that they were.

The deepest level of cynicism would suggest that what is at work here is yet another clever marketing campaign by the drug companies.

An uncertain research study was translated in the pages of the New York Times into the notion that drugs like Tamiflu definitely save lives, and that a sick child suffering from the flu should be medicated. The inevitable conclusion will be: if it is good for children, why not provide it to everyone?

Such reviews of medical research "findings" often appear in the press. The sad fact is that most of these compelling articles are later shown to rest on research that was either wrong or inconclusive. The even sadder fact is that if the conclusion presented by Saint Louis is subsequently proven to be nonsense, the public is likely never to hear about it.

An article in The Economist addressed the issue of public presentation of medical research by the popular media: Journalistic deficit disorder.

"IF ALL the stories in the newspapers claiming that a cure for cancer is just around the corner were true, the dread disease would have been history long ago. Sadly, it isn’t. But though some publications do have a well-deserved reputation for exaggeration in this area, many of these reports are at least based on respectable research published in peer-reviewed journals. So what is going on?"

"Research on research—particularly on medical research, where sample sizes are often small—shows that lots of conclusions do not stand the test of time. The refutation of plausible hypotheses is the way that science progresses. The problem was in the way the work was reported in the press."

To illustrate how the media fail in reporting medical research, the article describes the work of Francois Gonon of the University of Bordeaux, who studied the reportage on a number of studies related to ADHD (attention-deficit/hyperactivity disorder), a common diagnosis for children who are deemed to be having trouble paying attention or behaving in school.

"First, they studied subsequent scientific literature to see what had become of the claims reported in the top ten papers. Seven of these had reported research designed to test novel hypotheses. Though each concluded at the time that the hypothesis in question might be correct (ie, the data collected did not refute it), the conclusions of six were either completely refuted or substantially weakened by the subsequent investigations unearthed by Dr Gonon. The seventh has neither been confirmed nor rejected, but he and his colleagues, citing two independent experts on ADHD, say its hypothesis "appears unlikely"."

"The other three papers in the top ten were following up existing hypotheses rather than presenting novel ideas. Two of them were confirmed by the subsequent work Dr Gonon tracked down, whereas one was weakened."

So, of the ten most reported-upon papers, only two were confirmed by subsequent research, seven were refuted or substantially weakened, and one remains unresolved. How were these new revelations treated by the press?

"In total, the original top ten papers received 223 write-ups in the news. But then the newspapers lost interest. Dr Gonon and his team found 67 further studies examining the conclusions of the original ten, but these subsequent investigations earned just 57 newspaper articles between them. Moreover, the bulk of this coverage concerned just two of the ten. Follow-ups to the other eight got almost no attention. Dr Gonon’s team do not pull their punches. There is, they say, an ‘almost complete amnesia in the newspaper coverage of biomedical findings’."

The article wisely points out that news reporters are not totally to blame. Some of the problem emanates from a publishing bias in favor of exciting new results within the medical community itself.

"It would be easy to point the finger at lazy journalists for this state of affairs, and a journalistic version of attention-deficit disorder is, no doubt, partly to blame, for the press has a natural bias towards the new and exciting. But science itself must carry some responsibility, too. Eight of the ten articles whose fates Dr Gonon studied were published in respected outlets such as the New England Journal of Medicine and the Lancet. The deflating follow-ups, by contrast, languished in more obscure publications, which hard-pressed hacks and quacks alike are less likely to read."

And The Economist proves it is not above a bit of snark with this concluding comment:

"And, for what it is worth, as The Economist went to press, a search on Google News suggested that, a week after its publication, not a single newspaper had reported Dr Gonon’s paper."