Friday, February 28, 2014

Problems with Behavioral Economics

It’s not just liberals who are enthralled by behavioral economics. Since it appears to be science and since its inventors have received multiple honors, many conservatives have happily touted the new discipline.

What could be wrong with using public policy to induce, to nudge, even to force people into making better decisions about their finances?

Critics of behavioral economics have suggested that its proponents, especially those who want to apply it to policy, suffer from a puppeteer complex.

Those who possess great minds see their inferiors as marionettes awaiting the arrival of a better puppeteer. The proponents of behavioral economics do not respect what happens when free individuals make free decisions in free markets. They believe that markets are fundamentally inefficient because they are prone to speculative bubbles provoked by irrational actors. Some question whether there really is such a thing as free choice.

Surely, there is value in behavioral economics, but it still seems to be the latest attempt to allow an oligarchy of great minds, what Plato called a guardian class, to control the economy.

Last month, economist Allison Schrager wrote an excellent column about the limitations of behavioral economics.

As she wrote:

Even if we can understand why people don’t always act rationally, it’s not clear if that can lead to better economic policy and regulation.

What, after all, makes anyone think that regulators will always act rationally? Doesn’t behavioral economics assume that government officials, unburdened by the profit motive, will always do what is in the best interest of the collective? Isn't there a name for this?

And then we need to ask who decides what constitutes irrational behavior. What is the difference between irrational behavior and a mistake? Is there a difference? Surely, different people assess risk differently. If a high-risk investment works out, who is to say that it was irrational?

Clearly, some people do act contrary to their best interest. But if someone decides that thrills are more important than economizing, who is to say that he has made an irrational decision?

And, Schrager continues, what gives anyone the right to change someone else’s behavior:

Mixing behavioral economics and policy raises two questions: should we change behavior and if so, can we? Sometimes people make bad choices—they under-save, take on too much debt or risk. These behaviors appear irrational and lead to bad outcomes, which would seem to demand more regulation. But if these choices reflect individuals’ preferences and values can we justify changing their behavior? Part of a free-society is letting people make bad choices, as long as his or her irrational economic behavior doesn’t pose costs to others. 

Of course, this is the crux of the debate.

We know that people who do not save for retirement may become utterly destitute. So, we have a Social Security program that forces them to save for retirement.

No one objects to the program.

And, if someone smokes or drinks to excess we know that he may need more health care, paid for by the state. So, we institute sin taxes to discourage such behaviors and, in principle, to pay for the medical cost of such behaviors.

No one has a problem with that, either.

Strictly speaking, neither instance involves nudging people toward better behavior; each imposes a discipline. Neither involves rigging markets to produce desired outcomes.

We did not need behavioral economics to institute Social Security. Behavioral economics has been more ambitious, in one sense, and more restrained in another. Many of its policy proposals have seemed to be innocuous.

Schrager reports:

But the limits of these policies are apparent in a new OECD report on the application of behavioral economics to policy. The report gives examples of regulations adopted by different OECD countries that draw on insights from behavioral economics. Thus it’s disappointing that, with all economists have learned studying behavioral economics the last ten years, the big changes in regulation seem limited to more transparent fee disclosure, a ban on automatically selling people more goods than they explicitly ask for, and standard disclosures fees and energy use. These are certainly good policies. But is this a result of behavioral economics (helping consumers over-come behavioral bias that leads to sub-optimal choices) or is it simply requiring banks and merchants to be more honest?

Here, she could have mentioned Obamacare. After all, the policymakers who dreamed it up believed that no one would mind being lied to about keeping their plans or their doctors because they would easily accept that the new plans were cheaper and better.

Under the aegis of behavioral economics, President Obama and the Democratic Party decided that people would not care about being deprived of their freedom. As of today, it appears that they do.

Do people really not care about being deprived of the right to make a mistake? Again, who decides what is or is not a mistake?

Economists frame the question in terms of the relative efficiency or inefficiency of markets. Some believe that markets are inefficient and need to be strictly regulated. They know that human beings are capable of speculating and making bad investments. They believe that these excesses must be prevented by the intervention of government authorities.

Defenders of free markets believe that markets generally produce rational results, even though they are capable of bouts of irrationality. Markets tend to correct speculative bubbles. It might not be pretty, but it does happen.

Worse yet, Schrager continues, what makes anyone think that regulators, who are human beings too, know what is best for everyone and have the capacity to engineer it?

In her words:

According to [Robert] Shiller, markets are inefficient and misprice assets because of behavioral biases (over-confidence, under-reaction to news, home bias). This leads to speculative bubbles. But it’s not clear what financial regulation can do to curb this behavior. According Gene Fama, Shiller’s co-laureate who believes markets are rational, …  it’s not possible to systematically separate “irrational” behavior (that distorts prices) from healthy speculation, which aids price discovery. If speculators (who have an enormous financial interest) don’t know better, how can we expect regulators to?

As I understand it, the Federal Reserve policy of quantitative easing is one of the most grandiose efforts to save the financial system by controlling the bond market. 

One day we will know whether these activities have simply put off the inevitable, even made it worse, or whether they have saved the world.

Will the bond market find an equilibrium that promotes economic growth and fiscal stability, or will the forces of correction rear their ugly head once more?

Free marketeers like Jim Grant believe that a market denied will take revenge on those arrogant souls who believed that they knew better.

Today, we read this story in Business Week:

When the Federal Reserve raises interest rates, it will probably cause a financial-market convulsion similar to the “tantrum” that occurred last year after the Fed said it was considering trimming its bond purchase program, economists said in a warning to policy makers.

“Whenever the decision to tighten policy is made, then the instability seen in summer of 2013 is likely to reappear,” said the economists including Michael Feroli, the chief U.S. economist for JPMorgan Chase & Co. in New York, and a former Fed economist. “Risks of instability have not been eliminated.”

The warning coincides with a Fed debate on whether to give more weight to concerns over financial stability when changing so-called forward guidance or monthly bond buying known as quantitative easing. Feroli and his coauthors presented the paper to a gathering including Fed Governor Jeremy Stein and at least six other Fed policy makers at the U.S. Monetary Policy Forum in New York.


“Our analysis does suggest that the unconventional monetary policies, including QE and forward guidance, create hazards by encouraging certain types of risk-taking that are likely to reverse at some point,” said Feroli and his co-authors Anil Kashyap of the University of Chicago, Kermit Schoenholtz of New York University’s Stern School of Business and Hyun Song Shin of Princeton University.

It's not so easy to ride a tiger.

The Culture of Psychotherapy in Argentina

Responding to yesterday’s post about Argentina, an anonymous commenter referred to a CNN article about psychotherapy in Argentina. I am grateful for the reference.

The article offers a fuller picture of the culture of psychotherapy in that nation. I still contend that the predominance of Freudian psychoanalysis, in particular, must have something to do with the decline and fall of the once proud nation. Showing why and how something as seemingly innocuous as psychotherapy can influence a nation's economy and its future is a daunting task, but that is no reason to gloss over it.

Of course, I still leave open the possibility that psychoanalysis appeals to a demoralized nation. Nevertheless, if the function of therapy is to raise morale, it seems to have failed in Argentina.

The CNN article recounts some interviews with psychoanalysts:

Many Argentines I spoke with agreed that their culture is one in which people talk about their personal issues more openly than in the United States.

"In other countries, people are more closed off about their problems," Frankenberg said. "There's much more of a push for people to resolve their issues elsewhere, like throwing themselves into work."

People in Argentina commonly kiss one another on the cheek in saying hello and goodbye, expressing a warm feeling even between a dentist office receptionist and patient. They talk about their feelings. They sit in cafes without a sense of urgency, drinking café con leche with a small glass of soda water and eating small cookies.

Brok said the United States tends to have a culture more oriented toward shame and individualism, and an ethic of finding solutions to particular problems.

Argentina, he says, is more into introspection. The Argentine tango, too, invokes nostalgia and self-exploration, Frankenberg said.

The slowness of psychoanalysis in particular may make it unattractive in other cultures, Rolon said. No analyst can guarantee a result in six months, and therapy goes as long as it continues to feel right to the patient and analyst. Rolon has himself been going to psychoanalytic therapy for 25 years.

"Maybe a patient comes because of a problem. And when that problem is resolved, he realizes that he wants to continue working on other problems. In analysis, that is permitted," he said. "In other kinds of therapy, when a problem is resolved, it's over."

Surely, some Americans will take offense at the notion that people in another culture are better at expressing their feelings and talking about personal issues.

In those realms, Americans are really amateurs. They prefer to solve problems. They do not often glorify those who complain about why they cannot solve their problems. That might be why Americans are far more likely to solve their problems than are Argentinians.

Unfortunately, some Americans believe that they ought to remake their culture to make it more like Argentina. Taking the therapy culture to that extreme exacts a price, as Argentina shows.

Thursday, February 27, 2014

Cry for Argentina

Last week, The Economist ran a long cover story about the decline and fall of Argentina.

Today Roger Cohen reports from Argentina and summarizes the nation’s problems well.

In his words:

Argentina … is a perverse case of its own. It is a nation still drugged by that quixotic political concoction called Peronism; engaged in all-out war on reliable economic data; tinkering with its multilevel exchange rate; shut out from global capital markets; trampling on property rights when it wishes; obsessed with a lost little war in the Falklands (Malvinas) more than three decades ago; and persuaded that the cause of all this failure lies with speculative powers seeking to force a proud nation — in the words of its leader — “to eat soup again, but this time with a fork.”

A century ago, Argentina was richer than Sweden, France, Austria and Italy. It was far richer than Japan. It held poor Brazil in contempt. Vast and empty, with the world’s richest top soil in the Pampas, it seemed to the European immigrants who flooded here to have all the potential of the United States (per capita income is now a third or less of the United States level).

Argentina fascinates because decline fascinates. Those who believe that progress is linear have a problem explaining what happened to Argentina.

For my part, I am less interested in the moral predations of Juan and Eva Peron. I am intrigued by the fact that Argentina is a world leader in psychoanalysis. I am persuaded that, on a per capita basis, Argentinians lead the world in the practice.

For many years Argentinian psychoanalysts tended to follow the lead of Melanie Klein. Over the past few decades they have become entranced by the teachings of my old friend, Jacques Lacan. It is not an exaggeration to say that among Argentinian psychoanalysts Lacan is something of a deity.

If you ever criticize Lacan, a leading Argentinian analyst once told me, they will run you out of town.

Cohen notes the Argentinian love affair with psychoanalysis:

In psychological terms — and Buenos Aires is packed with folks on couches pouring out their anguish to psychotherapists — Argentina is the child among nations that never grew up. Responsibility was not its thing. Why should it be? There was so much to be plundered, such riches in grain and livestock, that solid institutions and the rule of law — let alone a functioning tax system — seemed a waste of time.

Of course, one might argue that Argentinians need all of the treatment because their morale has been declining with their nation’s fortunes. Yet, it is difficult to escape the suspicion that psychoanalysis is as much the problem as the solution.

Psychoanalysis values regression over progress. Could it be that all of the regression has turned the nation into a "child among nations"?

I grant that Argentinians know what they really, really want, but has all the effort put into accessing their desire caused them to forget about accessing the credit markets?

Has psychoanalysis gotten them so involved in their minds that they have lost touch with reality?

Have they so thoroughly absorbed Freudian blame-shifting that they cannot take responsibility for their own failings?

And then, there are the politics, both cultural and economic.

As we know well, French Lacanians proudly situate themselves on the radical left. In their youth many of them were Maoists and Marxist-Leninists. Today they have become strong supporters of the Socialist party.

Some psychoanalysts consider themselves the last line of defense against an invasion by an alien Anglo-American culture. Some of them reject cognitive-behavioral treatment for autistic children because they fear a back-door American cultural invasion. They have nothing better to offer, and they know that the American treatments are more effective. They do not care.

Being good culture warriors, they are opposed to anything that smacks of British and American culture. That includes industrialization, free trade, free enterprise, liberal democracy and Margaret Thatcher.

Unfortunately, these attitudes are in perfect harmony with an Argentinian tendency to live off the land, to assume that the soil will provide, to avoid large-scale industrialization and to disparage free trade.

All reputable analyses accept that Argentina has fallen behind because it has failed to embrace the modern world. Perhaps, Argentinians have taken all that Freudian regression a bit too literally.

Cohen describes Argentinian political philosophy as a mishmash:

Argentina invented its own political philosophy: a strange mishmash of nationalism, romanticism, fascism, socialism, backwardness, progressiveness, militarism, eroticism, fantasy, musical, mournfulness, irresponsibility and repression. The name it gave all this was Peronism. It has proved impossible to shake.


Obviously, some of these qualities are dissonant with Freudian theory. And yet, a nation that values romanticism, eroticism, mournfulness and irresponsibility has certainly made progress in its psychoanalytic treatment.

The Case of Justina Pelletier

Their only recourse was the media. Justina Pelletier’s family had no other way to fight the Boston Children’s Hospital and the Massachusetts Department of Children and Families.

The Boston media has been covering the story. Megyn Kelly has reported extensively on it for Fox News. ABC News had a good story about it yesterday. Last week the Huffington Post ran a well-written jeremiad about the indefensible and deplorable abuse of state and psychiatric authority.

(For the record, the HuffPo article was written by an advocate for people who are suffering from Justina's condition.)

It is not a story of left versus right; it’s not even a story about parents versus physicians. It’s a story about one group of physicians against another, and about the abuse of state power.

Now, Massachusetts legislators are getting involved and the courts are apparently paying attention.

It took a year.

At issue is Justina’s mitochondrial disorder. There is no test that can identify a cause, so the disorder is diagnosed from a constellation of symptoms. Physicians observe how well or poorly the patient responds to treatment; if she responds well, the diagnosis is considered confirmed.

The Huffington Post outlined the medical problem:

Justina and her older sister struggle with an invisible condition called mitochondrial disease. It is caused by a disruption in the cellular energy centers of our bodies, the mitochondria. The crisis inside her cells isn't always apparent on the outside. But inside, they are teetering on a precarious cliff where the body's demands exceed the body's ability to supply enough energy to live and to thrive. For mito patients, it's more than just being tired. It's never having adequate fuel to operate necessary functions of our body. Justina's gut stopped working a couple of years ago, a common problem for mito patients, resulting in a tube to help her digest and eliminate food. Mito patients also struggle with muscle weakness, pain, memory loss, erratic blood pressure, vision problems, hearing problems, and debilitating fatigue.

Justina’s older sister also suffers from this condition. Since the older sister had been successfully treated at Tufts Medical Center, a leading facility for this condition, Justina's parents brought her there for treatment.

Last year, when Justina was suffering from flu-like symptoms, her physicians at Tufts recommended that she be taken to Boston Children’s Hospital. The physicians at BCH examined Justina for a couple of days and decided that her problems were psychiatric, not medical. They decided that she was suffering from a somatoform disorder, something that used to be called conversion hysteria.

The physicians at BCH did not consult with Justina’s physician at Tufts.

The Boston Globe explains what happened at BCH:

Then last February, Justina was brought to Children’s Hospital after suffering severe intestinal issues, and having trouble walking. Doctors there, in a matter of a few days, concluded that her problems were primarily psychiatric, and that the parents were ignoring the root cause of her problems and pushing for unnecessary medical interventions.

When the parents sought to discharge Justina, the hospital filed medical child abuse charges, which were ultimately supported by the state and later a juvenile court judge.

Children’s Hospital, while still monitoring her medical care, has said in a statement that it has been pleased with the girl’s progress in and out of the hospital. Justina’s parents, however, contend that her condition has worsened in the past year, and that she can now only move around in a wheelchair.

For emphasis, parents who relied on expert physicians, and who agreed with them that Justina was suffering from a medical condition, were charged with “medical child abuse.”

All indications suggest that Justina’s condition has deteriorated. Why are these physicians pleased with her progress?

Since the parents wanted to return Justina to Tufts in order to continue her treatment, the hospital, a state agency and a judge took custody of the child. They placed her in a psychiatric clinic and allowed her parents only brief, supervised visits. For good measure, the court eventually hit Justina’s parents with a gag order prohibiting them from speaking out in public about the case.

When Justina’s father went on the Megyn Kelly show to plead his case, he knew that he might be held in contempt of court.

The Huffington Post reports the horror:

Justina was diagnosed with a psychiatric condition at Children's Hospital, moved to a locked psychiatric unit, and was not treated for her current illness or for her mitochondrial disease, which necessitates a daily regimen of cellular supplements and dietary adjustments. Her body weakened. She complained she could not walk, felt nauseous, and had trouble eating. She was told her symptoms were all in her head, and behavioral therapies are forced upon her. Justina's doctors -- who had treated her at length -- were never consulted. Family counseling was never offered as she got sicker, and the months went on.

Think about it: this has been going on for a year!

It is cold comfort to see that these psychiatrists did not try to cure Justina with the talking cure. And yet, why would psychiatric professionals ever imagine that cutting a child off from nearly all contact with her family would improve her mental health? Shouldn’t these professionals be investigated?

In any event, the Huffington Post reports on the psychological damage inflicted, not only on Justina but also on the Pelletier family:

What none of the media is talking about is how shattered this experience has left this family. The damage is irreparable. Justina's parents are consumed with grief, fear, suspicion, and desperation. And, Justina, the child who is "protected" by the law, has lived without the comfort and protection of a mom and dad and sisters for more than a year. Their lives have been destroyed. At what point did we become so detached that we can let this type of chaos and tragedy ensue for a week, much less for over a year? Where are the people who will stand up and insist "enough is enough!"?

A few days ago the juvenile court judge ordered Justina to be transferred to a residential psychiatric clinic, one where she would not be able to receive any treatment for her mitochondrial disorder. Justina’s mother passed out in the courtroom and was removed from the building on a gurney.

Recent reports suggest, however, that the public outcry and the interventions of several Massachusetts state legislators might lead the judge to release Justina to her parents.

Fox News Connecticut filed this report yesterday:

There were dramatic new developments Wednesday in the case of West Hartford teenager, Justina Pelletier, that could pave the way for her release from Massachusetts State custody.

Justina Pelletier’s next hearing is slated for March 17, but a group of Massachusetts lawmakers is pushing for an immediate release.

According to Mass. State Representative Marc Lombardo, as of 5 pm Wednesday, he and Rep. Jim Lyons had 12 representatives backing a resolution that would start the process of releasing Justina Pelletier to her parents right away.

Rep. Lombardo will raise the resolution at the next House of Representatives session, which is scheduled for March 5.

“The Pelletier case is a dispute between conflicting medical opinions… the decision on which medical treatment to adopt should rest with the parents, not with DCF. The Department’s heavy-handed, unjustified interference with the rights of these parents is an example of what is wrong with this agency,” said Rep. Jim Lyons (R-Andover).

I, for one, would like to know the identities of the medical professionals who are behind this. If they are confident about their diagnosis and treatment recommendations, they should defend themselves in public.

Wednesday, February 26, 2014

Disappointed with Obama

Prof. James Ceaser is correct to see the Barack Obama presidency as a spiritual movement.

From the onset of the Obama administration I have argued on this blog that the American people elected Barack Obama in order to atone for their sins.

Faced in 2008 with a financial crisis they did not understand, Americans bought a theory that they could grasp, one that had been peddled relentlessly in the media and the schools. They decided that when the credit markets had frozen and the financial system was about to implode it meant that God was punishing America for its sins. Those sins were racism, sexism, homophobia, Islamophobia and carbon emissions.

So, Americans went looking for a Savior. They found one in Barack Obama. It was easier to atone for their sins by electing Obama than to work their way out of the crisis.

Ceaser renders vividly the idea that the Obama presidency was a spiritual revival:

In the promiscuous blending of politics and culture that characterizes our age, the launch of the Obama campaign in 2007 marked the beginning of a politico-spiritual movement that promised a new beginning and a transformation of the nation. It was to be the “moment when the rise of the oceans began to slow and our planet began to heal . . . [when we] restored our image as the last, best hope on Earth.” Faith in the leader knew no bounds. Obamaism spilled out from the college campuses and tony enclaves of Manhattan and San Francisco into the mass public to become first an American and then a worldwide phenomenon. The legion of believers included not only the youth in their T-shirts emblazoned with the silk-screen Obama image, but also many of the nation’s most experienced political observers.

By 2013, Ceaser explains, the bloom was not only off the Obama rose, but people began to grasp that the oft-trumpeted national revival wasn’t going to happen. They began to see that they had elected a transformative incompetent who was going to transform the nation for the worse.

In Ceaser’s words:

No date was fixed for the fulfillment of all the hopes and promises—extensions were continually asked for under the excuse that “change would never be easy”—but enough time had transpired by the end of 2013 for people to sense that the deadline had come and gone. 

Taking a page from social psychologist Leon Festinger, Ceaser suggests that people have three ways to deal with the trauma of disappointment.

They can accept that they were duped, deny that their god has failed or deflect the blame.

Obviously, the path to a real recovery begins with an acceptance that one was duped, that one had been the victim of a hoax. This entails feeling ashamed, but it allows one to reconcile with reality, get one’s feet on the ground, gain some traction and move forward.

Ceaser describes accepters:

Accepters are those who conclude that they have succumbed to an error or perhaps been victims of a hoax. In the psychologists’ jargon, they admit to “disconfirmation.” Such recognition may come with powerful feelings of pain—a sense of emptiness, the despair of lost hope, or the embarrassment of having been “had” by a confidence man. 

The most prominent accepter is former White House press secretary Robert Gibbs. According to Ceaser:

While he still supports Obama’s political program, Gibbs has recently appeared on television admitting that “2013 was a lost year for the president,” and that the people doubt that Obama’s team is “remotely capable of solving those problems.” He no longer frequents the White House.

On an encouraging note, Ceaser suggests that many citizens have finally figured out that their faith and hope were misplaced, and that Obama is not going to deliver on his grandiose promises. Moreover, they have discovered that their president lies all the time.

In Ceaser’s words:

On the level of the mass public, poll data show a stunning loss of confidence in the leader, as more and more erstwhile followers have come to accept that “the change” was pure fiction. While there are signs of a mild and pervasive depression—nearly two-thirds of the public think the nation is on the wrong track—many seem to be adjusting to life after Obamaism.

And then there are the deniers. They continue to believe in their Messiah and refuse to accept that they have been duped. They will find something positive to say about him, no matter what.

Ceaser describes them well:

… some followers have invested so much in their adherence that they cannot eliminate the dissonance by adjusting to reality. They instead “effectively blind themselves to the facts” and band together, fortifying their beliefs by the support of others who agree. “If more and more people can be persuaded that the system of belief is correct, then clearly it must, after all, be correct.” In brief, to quote another expert, they cling to religion.

One suspects that deniers are culture warriors more than economic or political reformers. They seem unworried about the continuing bad economy. They do not seem concerned that America’s role in the world has been diminishing. They are happy that America exited Iraq and will soon be exiting Afghanistan. Since they hate military culture, they are happy to see it diminished.

Deniers want Americans to suffer. They want Americans to be punished for their sins. They do not care about the stock market or the labor market or the Arab Spring.

Moreover, their arena is hearts and minds, not the marketplace or the battlefield. They are fighting racism, sexism, homophobia, Islamophobia and carbon emissions. If the nation is moving toward greater tolerance they are happy. This is what they voted for and they feel good about their votes.

Some of the deniers still try to rationalize Obama’s ineptitude. Others simply do not care. They voted for a cultural revolution and they believe that they are watching it unfold.

This brings us to the largest group, the deflectors, those who see that the Obama administration is failing, but who are happy to shift the blame to other people… like the Republican Congress, Rush Limbaugh and Fox News. Deflectors are in closer touch with reality but refuse to hold Obama to account.

Ceaser describes them:

Deflectors admit that the anticipated outcome did not actually occur, which is their concession to reality. But they go on to say that the failure was not the result of a falsehood or a hoax. The prophecy would have been fulfilled but for the existence of a countervailing force that canceled it out. The promise in a sense was kept, only its effects were nullified. Where deflection is ably executed, it can serve to strengthen belief among the faithful, who now conceive of themselves as saints in an implacable struggle with the sinners.

At times, President Obama seems to be leading the charge toward shifting the blame. At other times, he sounds like he is in denial.

Ceaser describes his strategy:

For the most part, however, Obama follows the predicted model of resolving dissonance by being a denier and deflector. He is still asking followers to have patience, going to the extreme of fighting Providence with executive orders… that extend crucial deadlines. Obama appears at his most natural and sincere in the role of deflector-in-chief. All the great things, he suggests, would have happened but for sinister forces working against the change. 

By now, many of the accepters and deflectors have shifted their allegiance. They are preparing to open a new front in the culture war, the better to promote American spiritual renewal.

Just as the election of Obama proved that America has overcome its racist past, so now America will have the chance to show how it has overcome sexism by electing Hillary in 2016.

If the tactic worked once, there is no reason why it will not work again.


Tuesday, February 25, 2014

Biochemical Puppets?

According to Sam Harris, noted neuroscientist and atheist, we are all “biochemical puppets.”

Obviously, this precludes our having free will. It precludes our being capable of making rational decisions. It absolves us of responsibility for our actions.

We are left with what, exactly? It is difficult to avoid the conclusion that Harris sees human beings as mindless, amoral monsters.

To my knowledge, he does not ask a salient question: if we are all puppets, who is the puppeteer? Surely, we are more than a bunch of biochemical processes responding to stimuli. If there isn't a puppeteer, what makes your actions yours? What does it mean to say that actions are yours if you are a biochemical puppet?

These are complicated, difficult and important questions. That being the case, addressing them seriously is an achievement.

So, our compliments to Yale psychologist Paul Bloom for an excellent article in which he shows that neuroscience need not make us into “biochemical puppets,” devoid of free will, reason and responsibility.

Of course, Bloom is obliged to go beyond neuroscience. He argues that even if we suffer biochemical influences, that in itself does not preclude our making a free and rational choice and being responsible for it.

Bloom explains:

The deterministic nature of the universe is fully compatible with the existence of conscious deliberation and rational thought—with neural systems that analyze different options, construct logical chains of argument, reason through examples and analogies, and respond to the anticipated consequences of actions, including moral consequences. These processes are at the core of what it means to say that people make choices, and in this regard, the notion that we are responsible for our fates remains intact.

The universe might be deterministic, but if, for example, you are playing chess, the outcome is not predetermined. When playing chess, as Bloom notes, we analyze options, construct arguments, reason through different alternatives and make our moves. We may develop a winning strategy. We may not. In neither case can we say that we are merely biochemical puppets.

Some neuroscientists, like Sam Harris, argue their points by referring to tests that show how the mind can be influenced by external stimuli. If you see someone wearing a hat or strumming a guitar, you are more likely to react in a certain way.

Yet, the fact that we are subject to influence does not mean that our decisions are irrational. If you are exposed to two ads for detergent, both are attempting to influence your decision, but when making a decision you will normally weigh them along with other considerations, like past experience and recommendations. Even if you decide to go along with the ad that features the fashion model or the one that features the pile of clean clothes, you have still chosen to follow one set of influences, and not another.

Bloom explains the point well:

Just because something has an effect in a controlled situation doesn’t mean that it’s important in real life. Your impression of a résumé might be subtly affected by its being presented to you on a heavy clipboard, and this tells us something about how we draw inferences from physical experience when making social evaluations. Very interesting stuff. But this doesn’t imply that your real-world judgments of job candidates have much to do with what you’re holding when you make those judgments. 

It’s relevant that people whose polling places are schools are more likely to vote for sales taxes that will fund education. Or that judges become more likely to deny parole the longer they go without a break. Or that people serve themselves more food when using a large plate. Such effects, even when they’re small, can make a practical difference, especially when they influence votes and justice and health. But their existence doesn’t undermine the idea of a rational and deliberative self. To think otherwise would be like concluding that because salt adds flavor to food, nothing else does.

He also adds this point:

Yes, we are physical beings, and yes, we are continually swayed by factors beyond our control. But as Aristotle recognized long ago, what’s so interesting about us is our capacity for reason, which reigns over all. If you miss this, you miss almost everything that matters.

Obviously, this is not merely a theoretical exercise. It aims at defining a notion of moral responsibility. Are we inclined to do the right thing? Will rational deliberation be more likely to lead us to do the right thing? At what point are we no longer responsible for our actions?

If we are biochemical puppets, how do we know right from wrong? How, Bloom asks, do we learn to improve our moral principles? We might say that we do it by trial and error, but still, what gives us the first principles and why do we believe that trial and error is better, say, than inspiration?

It ought to be well enough known, thanks to David Hume, that science does not tell us what we should do. It tells us what is.

When people take action in the world, Bloom continues, they make plans and implement those plans. These plans are purposive.

Our capacity for rational thought emerges in the most-fundamental aspects of life. When you’re thirsty, you don’t just squirm in your seat at the mercy of unconscious impulses and environmental inputs. You make a plan and execute it. You get up, find a glass, walk to the sink, turn on the tap. … Making it through a single day requires the formulation and initiation of complex multistage plans, in a world that’s unforgiving of mistakes (try driving your car on an empty tank, or going to work without pants). The broader project of holding together relationships and managing a job or career requires extraordinary cognitive skills.

It is worth mentioning that when you feel thirsty and get up to get a drink of water your behavior is more likely to feel automatic than planned. This does not make it less rational. It makes it less directed. The fact that we can analyze the act of getting a glass of water into a series of cognitions and gestures does not necessarily mean that we performed all of them in our minds before turning on the faucet.


Fighting the War Against Cancer

A little context, please.

In the 2012 presidential campaign Barack Obama torched his opponent for suggesting that the government should stop funding Planned Parenthood.

Obama said:

When Governor Romney says that we should eliminate funding for Planned Parenthood, there are millions of women all across the country who rely on Planned Parenthood for not just contraceptive care. They rely on it for mammograms, for cervical cancer screenings.

Some media outlets quickly pointed out that Planned Parenthood does not perform mammograms. It refers women to outside providers.

Still, the damage was done. As always, Romney failed to respond with sufficient vigor, and the meme reinforced the notion that Republicans were trying to compromise women’s health.

A recent study from Canada, however, has shown that yearly mammograms are not necessarily such a good thing.

Cancer surgeon Marty Makary writes in Time:

A recent study found that yearly mammograms do not prolong the lives of low-risk women between the ages of 40 and 59. Following 89,000 women for 25 years in a randomized controlled trial (the gold standard of science), the study is as methodologically impressive as they come. In fact, in research terms, the report has more scientific merit than any medical study of chemotherapy. As hard as it is for our pro-screening culture to believe, the data are clear. We are taxing far too many women not only with needless and sometimes humiliating x-rays, but also with unnecessary follow-up surgery.

It is worth noting that many of the extra tests and procedures are anything but benign. And, we emphasize that the Canadian study recommended that women do regular self-examinations.

Makary also underscored another important point. As a nation we screen too much for disease:

New research is finding that some health screening efforts have gone too far.

The annual mammogram is not the only vintage medical recommendation under scrutiny recently. Another large study found that among low-risk adults, a daily aspirin — a recommendation hammered into me in medical school — kills as many people from bleeding as it saves from cardiac death. Doctors are also re-evaluating calls for regular prostate-specific antigen (PSA) tests and surgical colposcopies after “borderline” Pap smears because of the risks of chasing false positives and indolent disease.

In his article, Makary was re-evaluating his own approach to cancer. Being a cancer surgeon, and having seen many people die from the disease, he wants to do everything possible to prevent it, or better, to stop it before it becomes incurable.

Yet, our culture has cast him, like other physicians, as an armed combatant in the war against cancer:

As a surgeon, I’m trained to crush cancer. For many years, every tumor I palpated and family I counseled drove me to hunt for cancer with vengeance, using every tool modern medicine has to offer. 

Who could possibly argue with that?

And yet, in their zeal to do battle with the great killer, physicians mistook good intentions for good results. Imagining that they were doing God’s work they failed to evaluate the outcomes of the tests objectively.

Makary explains how he discovered this:

The patient’s story began with a full-body CAT scan, a screening test used to detect tumors, which revealed a cyst on his pancreas. Some 3 percent of humans have these cysts and they are rarely problematic. Based on his cyst’s size and features, there was no clear answer as to what to do about it, but he was given options.

The patient tossed and turned every night, agonizing over stories of pancreas cancer tragedies, consumed by the dilemma of whether to risk surgery to remove the cyst or leave it alone. The conundrum strained his marriage and distracted him from his work.

Months before I met him, the patient underwent the surgery, which revealed that the cyst was of no threat to his health. The operation was supposed to cost $25,000 and eight weeks out of work. But the toll was much greater, including a debilitating surgical complication.

I thought: this is why he shouldn’t have had a CAT scan in the first place. Screening made him sick.

In truth, information about the risks of overtesting and performing unnecessary procedures has been around for a while now. I have reported on it here.

It is good to see surgeons question what have become standard prophylactic procedures and stop touting the virtues of screenings that cost the nation a massive amount of money, provide little benefit and occasionally cause harm.

Monday, February 24, 2014

A Moratorium on Apologies?

According to Dov Seidman and Andrew Ross Sorkin, too many people are offering too many of the wrong kinds of apologies.

Having written about the value of public apology in my book Saving Face, I feel qualified to opine. In it, I emphasized that no one had ever apologized for the greatest foreign policy failure of our time, the Vietnam War.

Whether it was right or wrong to fight in Vietnam, the Kennedy-Johnson administration got us into it, escalated it, and mismanaged it. In the absence of leaders who were willing to admit to their own failure, fault devolved on the soldiers who fought and died there.

I doubt that my book provoked the wave of public apologies, but I like to think that it had some influence.

In any event, Seidman thinks that the situation has gotten out of control. From a country where no one really apologized, we have become a nation where everyone apologizes all the time. The glut of apologies, he argues, has caused the gesture to lose its meaning. I am not quite sure that it’s a “dangerous crisis,” but the point is worth considering.

In Seidman’s words:

Business, politics, media, academia, sports and celebrity – virtually every aspect of our public lives – are in the midst of a dangerous apology crisis. That truth has been with us for some time, but the mea culpas have kept on coming to the point where they are reaching the level of parody.

Seidman recommends that we all limit our use of apology to around six a year. He explains:

What if we were allowed to deliver only half a dozen apologies each year? Aside from the saintly among us, we’ll each have more occasions than that. What would be the effect of having to apologize “within your means?” We would be much more frugal with the act. Note that I did not say cheap. We would hold the behavior as treasure, not as an easily renewable token to be flicked into a moral turnstile that grants admission for a shot at redemption. Reminded that the pursuit of forgiveness should be treated as precious and rare, we would restore its value — to both the offended and the offender.

As it happens, the uses of apology are complex. There are apologies and then there are apologies.

When you unintentionally jostle someone on the street, you normally say that you are sorry. It is the lowest level of apology. It signifies that your offense was unintended. If you do not apologize, the other person will be right to assume that your gesture was hostile.

Surely, we do not want to limit such expressions of regret. In their absence human interaction would become more coarse and conflict-laden.

But, there is a difference between the mild embarrassment you feel when you inadvertently make a mistake and the shame you feel when you recognize that you have failed on a grand scale.

At the least, the first is mostly individual. The second usually refers to the way you have failed to fulfill a responsibility that you held within a larger group. In both cases a fault requiring an apology involves unintended consequences.

Again, we need to distinguish between different kinds of public apologies. Chris Christie’s apology for closing the George Washington Bridge is not the same as Lance Armstrong’s mea culpa for doping.

If we assume that Chris Christie did not know about the bridge closing his apology showed that he was accepting responsibility that was inherent in his role as governor.

Lance Armstrong, however, was trying to apologize for an intentional act. He cheated. He knew he was cheating. He gained great advantages from cheating.

Under the circumstances his apology counts as insincere. He cannot be expected to forfeit his ill-gotten gains but they should certainly be taken from him.

Note well: in traditional Chinese thought apology can be sincere or insincere. It is assumed, unless proven otherwise, that an apology is sincere. But that presumption excludes instances like Lance Armstrong’s, where the offense is intentional and involves breaking a rule.

If a sincere apology is accompanied by an effort to make amends, at times by retiring from a company or from public life, an insincere apology is an effort to hang on to ill-gotten gains and to avoid prosecution.

Seidman and Sorkin tend to refer, reasonably, to recent apologies. And yet, if we go back in time we recall Bill Clinton’s apology for his dalliance with Monica Lewinsky and Attorney General Janet Reno’s apology for the holocaust of the Branch Davidians in Waco, Texas.

Even here, the apologies are quite different. If the first sign of a sincere apology lies in the facial expression of the individual who is apologizing, then Janet Reno appeared to be far more sincere than did Bill Clinton.

Janet Reno looked as though she sincerely regretted what happened at the Branch Davidian compound when she apologized before a Congressional committee. Bill Clinton looked as though he did not regret anything when he was finally forced to sorta-apologize for the Lewinsky affair. Clinton’s apology was laden with contempt for anyone who dared to judge his behavior.

In the world of faked apologies, Bill Clinton seems to have set the standard. As with Lance Armstrong, Clinton was looking to avoid punishment for having cheated. His greatest regret seems to have been … getting caught.

Normally, people hold those whose apologies are insincere to account. In Reno’s case, one suspects that no one believed that she was anything more than a stand-in for those who had given the order. Question: who do you think really gave the order?

In Clinton’s case, his dereliction seemed more personal than public. It did not appear to involve his conduct of his office.

Bill Clinton cheated on his wife, but most people did not believe that he was cheating on his country.

In many ways, most people were probably wrong about this point. When a leader gets caught in flagrante delicto, he compromises the dignity of his office. He diminishes the amount of respect it commands. He has made it into more of a theatrical performance, as was his apology.

America does not have too many examples of recent, sincere apologies. Allow me to recall the 50-year-old case of John Profumo. As British Secretary of State for War, Profumo got involved with a prostitute who was also servicing a Soviet naval attaché. Thus, his affair had the potential to compromise national security.

Profumo apologized for his dereliction, resigned from office and retired from public life. He went to work as a social worker in London. After several years, he was welcomed back into society and was honored by the Queen as a Commander of the Order of the British Empire. Of course, he was honored for his charity work.

It is possible to recover one’s reputation after failing and apologizing. But, no one does it right away and no one does it without giving up something of great value.

Obviously, the extent of one’s authority and the egregiousness of one’s failure determine what one should give up. An apology that costs nothing is, by definition, insincere.

Aside from the expression of sincere shame, someone who apologizes is pledging, as I have often pointed out on this blog, never to do it again. If you apologize for cheating and then cheat again, you have gone back on your word and your future apologies will count as insincere.

Now apology has become theatrical. Even though celebrities are distorting the ritual of apology, it is worthwhile, as Seidman and Sorkin suggest, to hold public figures to a higher standard.

As for the notion of a moratorium on frivolous or excessive apologies, I believe that we do better not to go down that road.

In the first place, it might discourage people from saying that they are sorry in situations where a quick and small apology will facilitate social harmony.

Second, as Confucius might have said, an insincere apology is better than no apology at all. Or as 12 steppers would have it: fake it until you make it.

The solution to an excess of fake apologies is holding people to account. If they apologize and do not give up anything, if they apologize and then go back on their word, the public at large ought to reject them.


Sunday, February 23, 2014

The Lambs of Wall Street

Kevin Roose has taken a somewhat unlikely but thoroughly useful approach to Wall Street. Instead of looking at the balance sheets and the spreadsheets, he has examined the people who work there and the lives they lead.

After following eight young bankers through their first few years on the Street, he has written a new book, Young Money: Inside the Hidden World of Wall Street’s Post-Crash Recruits. To promote his book Roose has contributed an essay to the Atlantic.

As of this morning, Roose’s book has been doing very well on Amazon. Judging from the analysis he provides in his Atlantic piece, the success is well deserved.

Wall Street isn’t what it used to be. The 2008 financial crisis dealt it a body blow. The glory that was investment banking is no longer. Newly minted Ivy League grads who had believed that a job at Goldman Sachs was a ticket to untold prosperity and prestige quickly learned that the old model no longer pertained.

To be fair, the same is true of New York’s legal profession and, in many cases, of the practice of law generally. Whereas an associate’s position in a big New York firm used to put you on the fast track to fame and fortune, this is no longer the case. First, because the firms are hiring far fewer young associates. Second, because there is less money to divide.

In some ways, it also applies to the medical profession. For now, no one knows how the turmoil in the medical profession will shake out, but it seems clear that today’s physicians will be less likely to build lucrative private practices. They will become employees of hospitals, with their compensation controlled by government agencies and insurance companies.

As Roose points out, the status of a Wall Street job has fallen precipitously since the financial crisis. In many cases jobs at Google and Facebook have taken up the slack. They are sexier and perhaps even pay better. Besides, San Francisco and its environs have better weather.

We should add that some of the prestige that used to attach to banking has now transferred to private equity firms and hedge funds.

Examine some of Roose’s points.

He opens with the observation that the only good thing about working on Wall Street is the high pay. A young Ivy League grad will likely earn between $90,000 and $140,000 a year.

In his words:

The pay is good. Everything else is bad.

Over a few beers after work one spring evening, two junior Goldman Sachs employees started contemplating the best ways to kill themselves.

Not a very encouraging comment on their work satisfaction.

Unfortunately, by New York standards the pay is not really very good. The numbers might seem high in other parts of the country, but in New York City, that salary will be subjected to serious taxation. And New York’s rents begin at around $3,000 a month. 

How much money will these young grads have for bottle service at clubs and to date fashion models?

So, on this point I would correct Roose. For New York City the pay is subsistence. It’s not cheap living in Bill de Blasio’s Workers’ Paradise.

The same is even more true for people who do not work on Wall Street. Recently, the American Thinker reported the case of a couple named Donna and Frank. (Via Maggie’s Farm) They were older than the demographic Roose studied—early thirties and early forties. She made a six-figure salary and he wanted to start teaching.

Together they eked out a subsistence level existence. Until, that is, they moved to Orlando. You know what happened to their standard of living. For the details, I refer you to the link above.

Returning to Wall Street, Roose points out that long work hours make a social life difficult, if not impossible.

The demands of the job are such that young associates are forced to break appointments regularly. It is psychologically debilitating to keep going back on your word. A job on Wall Street will undermine your character.

The point is not often made. It deserves emphasis.

Roose explains it clearly:

Wall Street is notorious for the long hours it imposes on its worker bees. (One young banker bragged to me about working the “banker 9-to-5,” defined as 9 a.m. until 5 a.m. the next day.)…

What this means, in practice, is that young bankers live in a state of perpetual anxiety, and advance planning becomes impossible. Boyfriends and girlfriends get upset about broken dinner plans, friends and family members become estranged, and phones function as third limbs. This unpredictability, combined with the sheer number of hours involved, takes a toll. 

Now, Roose continues, if the long hours guaranteed future security, if young people thought that they were on track to become managing directors or partners, they might be more sanguine about the demands of the job. But, the business no longer works by the promotion model. It no longer offers the promise of untold future riches. It chews people up and spits them out.

In Roose’s words:

Once, it had been relatively certain that a young banker or trader who did well would earn much more with each passing year, and would eventually become a millionaire, probably before his or her 30th birthday. But after 2008, the golden pathway began to splinter. New regulation meant to prevent another financial crisis made banks less profitable, and the struggling markets meant that even young bankers—who had historically been immune from layoffs during downturns, so cheap was their labor compared to that of senior bankers—were at risk of losing it all. 

Let’s not overlook the fact, as Roose emphasizes, that new regulations, aka Dodd-Frank, have done significant damage to Wall Street. To what extent, I do not know. But clearly, Wall Street banking no longer functions according to free market principles. I suspect that in a highly regulated market the emphasis shifts to seeing what you can get away with.

As one might suspect, and as Roose reports, young Ivy League graduates bring their own expectations to their jobs. They have been taught that their work must be meaningful. They sign up for jobs on Wall Street because they believe that it’s a good way to save the world.

Beyond what they have learned in college they are surely aware of the fact that the tech oligarchs in Silicon Valley are giving away massive sums of money to worthy and unworthy causes. The practice reminds one of the old Pacific Northwest Indian practice of potlatch.

What is potlatch? In it two tribal chiefs compete to see who is wealthier by giving away or burning their goods. It does not quite feel like charity, but perhaps that is the point. It’s a sign of obscene wealth.

If today’s banks were in the business of allocating capital, that would be one thing. Apparently, that is less and less the case.

If you are working in the mortgage market helping people to become homeowners you might consider it to be a worthwhile endeavor. If you are working on foreclosures and short sales, effectively throwing people out on the streets, your job will feel a lot less rewarding.

It’s even worse when you discover that what you are really doing is shuffling money around in order to remain solvent, by whatever means are necessary.