I have no real expertise in banking or real estate; neither field holds much interest for me. But I am interested in the failure behaviour of large complex systems, and a few posts out there offer better-than-average insights on the current financial mess, so I wanted to link to them.
Jim Kunstler's blog, the full name of which is a bit too coarse to post here. Nonetheless, Kunstler writes one of the best blogs out there on business topics.
Georgetown Law Faculty Blog has posted part of an article intended for the banking community. It is lifted from American Banker, which, sadly for me, requires a subscription. The theme supports my contention that business, banking included, should be treated as part of contemporary anthropology rather than as an abstract exercise in arithmetic.
Business isn't just a math exercise; there is a lot of anthropology to it. Unfortunately, anthropology isn't on the curriculum of most MBA programs. MBAs worry me. They seem to be hustling the rest of us into an Orwellian future, armed with methodologies taught by faculty members who are more interested in tidy formalisms than in people.
There are a lot of cocky bastards in business who are always certain but frequently wrong. This banking mess is an example of what happens when they achieve a quorum. In fact, I think they have ascended to the level of a mythical archetype.

There is a direct tie-in between personality type and profession. People migrate towards areas where they feel comfortable and capable. Most physical scientists fall into the Myers-Briggs INTJ category, for example, and most of the rest land in closely related types.
When you get a group together with similar values, attitudes, and ways of thinking, groupthink and tunnel vision occur. That is simply the dynamic of groups. (I wrote an essay about this regarding managers in science companies.)
Sociology and psychology ought to be taught to everyone: to help people understand these dynamics, to make room for diversity of thought and devil's advocates, or simply to think outside the self-created box.
How Doctors Think discussed a study of the accuracy (or concurrence) of radiologists' readings of images and diagnoses. From what I remember, certainty was negatively correlated with accuracy: the radiologists who were most certain were more likely to be mistaken than those who were less certain. I don't know whether my instinctive generalization of this finding is accurate, but I keep using it anyway. It explains a lot: the failures of the Bush Administration, for example.
I would have thought that companies dealing with lots of money would want some institution, internal or external, to force people to check their own certainties, so that mistakes don't get out of hand. Barring that, this would seem to be a job for regulation, with two caveats: 1) regulatory capture, and 2) the possibility that people come to treat the regulatory entity as the authoritative source, leading to an even worse version of the mess described above.
Stable systems are self-correcting. I don’t know if this is one of those.