Feb 6

The book Ego Check, by Mathew Hayward, seems like it was written exactly for me. It's about the tendency towards overconfidence in striving individuals. The four major hallmarks of same are: excessive pride and boastfulness; failure to listen to foils who tell you when you're wrong; refusal to get feedback about the outcome of your activities; and not planning for problems, consequences, and corrective measures in advance.

The author gives case studies showing how these four faults led to disaster for mountain climber Rob Hall; for business executives Jean-Marie Messier, Meg Whitman, Steve Jobs, Michael Dell, and Dean Kamen; for Merck in the Vioxx disaster; for NASA in the Challenger and Columbia disasters; and for the National Kidney program.

Hayward could have included me in his case studies, because I have succumbed in each area. In my career in sports and markets, I have paid far too much attention to trying to be number one. I have not relied enough on family (especially my father, when he told me to take it easy) or on executives within my own organization who doubted the wisdom of my activities. I have not relied enough on checkpoints to see whether the original reasons for my activities were still valid, and I have not done enough war-gaming to work out what to do when my decisions or game plans go astray.

Now that I have confessed these faults (confession being, I understand, one of the keys to self-improvement), I hope I will not succumb to them so readily in the future.

The main problem with the book is that it relies mainly on anecdotal methodology to prove its points. It includes numerous cases where pride was very successful, such as Apple and Dell, where the same executive was guilty both of hubris and of perfectly rational overconfidence. It holds up people like Jack Welch and Warren Buffett as role models for how not to let hubris get the better of an organization. But anyone who seriously studies these executives' activities might conclude, as I do, that they are sanctimonious scoundrels who are masterful at retrofitting their personae into a form that the media will love, and at passing off their judgment as superior to that of the free market.

As I read the book, I found myself thinking about my hobby, electric circuits. So many of them run away into saturation or uncontrolled oscillation because the output is tied to the input in a positive feedback loop rather than a negative one that damps the volatility and keeps the output under control. Anyone who plays with op amps or amplifier circuits knows exactly how important that damping influence is: monitoring the output and reining it in when it drifts out of range.
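
To make the circuit analogy concrete, here is a minimal sketch (mine, not Hayward's) of a discrete-time loop y[t+1] = drive + k*y[t]: with a negative feedback gain the output settles to a fixed value, while with a gain above one it runs away, much as an amplifier without negative feedback slams into its supply rails.

    # Minimal sketch (mine, not Hayward's): iterate y[t+1] = drive + k * y[t].
    # A negative feedback gain damps the loop; a gain above one makes it run away.

    def run_loop(feedback_gain, steps=20, drive=1.0):
        """Iterate y[t+1] = drive + feedback_gain * y[t], starting from zero."""
        y = 0.0
        for _ in range(steps):
            y = drive + feedback_gain * y
        return y

    print("negative feedback (k = -0.5):", round(run_loop(-0.5), 4))  # settles near drive/(1 - k) = 2/3
    print("positive feedback (k = +1.5):", round(run_loop(1.5), 1))   # grows without bound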

A bit of modeling with economic, electrical, or game-theoretic concepts like this would have helped to put many of the points in a more systematic form for me and would have led to many more testable hypotheses. And yet Hayward is a Columbia PhD who has collaborated on Harvard research and a professor of psychology at the University of Colorado, and he has interviewed many of the actors in the case studies he writes about. I find him particularly insightful, and I agree with his point that hubris is the key fault that leads to great disaster in striving individuals.

To his credit, Hayward realized that the mantra espoused by Collins in Good to Great, i.e., that the successful executive should be meek, humble, and prudent at all times, is retrospective mumbo jumbo unsuited to the risks and leadership roles successful executives must take on in today's dynamic and uncertain world. The problem is how to differentiate the overconfidence that has a positive expectation from the kind that will lead to disaster.

Vincent Andres writes:

"Retrofitting their persona into a form that the media will love . . ." 

Those words trigger others. I believe the media, as many do, prefer the simple way; that is to say, it is easier to agree than to disagree. (Disagreeing needs proof to work; agreeing needs only belief, or reliance on others' work.)

Thus agreeing implies *resonance*. In other words, the initial signal is enlarged: no added value or *information*, only *added power*/energy.

But the public, hearing many loudspeakers, gets the impression that so many loudspeakers equal so many independent sources, which is dramatically untrue.
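
To see why the echoes add power but not information, here is a minimal sketch (my formalization, not Andres's): averaging N loudspeakers that all repeat one noisy source is no more accurate than the single source, while averaging N truly independent sources shrinks the error roughly by a factor of N.

    # Minimal sketch (my formalization, not Andres's): compare N echoes of one noisy
    # source against N independent noisy sources when estimating a true value.

    import random

    random.seed(0)
    TRUTH, N, TRIALS = 1.0, 50, 2000

    def estimate(independent):
        if independent:
            readings = [TRUTH + random.gauss(0, 1) for _ in range(N)]  # N independent sources
        else:
            echo = TRUTH + random.gauss(0, 1)                          # one original source...
            readings = [echo] * N                                      # ...repeated by N loudspeakers
        return sum(readings) / N

    def mean_sq_error(independent):
        return sum((estimate(independent) - TRUTH) ** 2 for _ in range(TRIALS)) / TRIALS

    print("independent sources:", round(mean_sq_error(True), 3))    # roughly 1/N = 0.02
    print("echoing loudspeakers:", round(mean_sq_error(False), 3))  # roughly 1.0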

And this may also be linked to the "halo effect." With resonance, the halo's envelope grows and grows, becoming a bubble that pumps in plenty of energy/power … but, in fact, little true information.

But doesn't this give us our daily bread?

Model attempt: if we liken information to the halo's envelope and energy to the halo's volume, then as the halo grows with no added information, the envelope gets thinner and thinner, until at a given point the too-unbalanced ratio ends with … blood on the walls.
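
A rough arithmetic reading of this (my sketch, not Andres's exact model): treat the halo as a sphere, information as its surface and energy as its volume; the surface-to-volume ratio is 3/r, so it shrinks as the bubble grows, giving ever less information per unit of pumped-in energy.

    # Rough arithmetic sketch (my reading of the metaphor, not Andres's exact model):
    # halo as a sphere, information ~ surface, energy ~ volume; the ratio 3/r shrinks
    # as the bubble grows.

    import math

    for r in (1, 2, 5, 10, 100):
        surface = 4 * math.pi * r ** 2        # "information" (the envelope)
        volume = (4 / 3) * math.pi * r ** 3   # "energy" (the volume)
        print(f"r = {r:>3}   information/energy = {surface / volume:.4f}")  # equals 3/r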

See also: "Larsen Effect" at wikipedia.org/wiki/Audio_feedback 


Comments


1 Comment so far

  1. Anthony L. Burns on February 14, 2007 10:08 am

    “The problem is how to differentiate the overconfidence that has a positive expectation from the kind that will lead to disaster.” Are they not often the same? Often one is dead right and dead wrong at the same time.
