The ever interesting and insightful Tim Harford discusses the illusion of understanding complex systems in his latest blog:
Switch on any business TV channel and you’ll be bombarded with scrolling quotes and jagged graphs. It’s all very well if you like to watch that kind of thing, but the flow of financial data can lull us into an illusion: that we understand what is happening in the economy at the moment that it happens. We don’t.
Take those share prices: each one is a miniature forecast of future profits for the company in question. They may or may not be good forecasts, but what they are not is a measure of real economic activity today: the price of shares in BP reflects today’s supply and demand for shares in BP, not today’s supply and demand for petrol.
Away from the financial markets things are even more obscure. We have indices of house prices, measuring how they have changed over the past month, quarter or year. But the houses bought and sold last month are not the same as the houses bought and sold the month before: are they more expensive because housing itself is more expensive, or because fewer studio flats and more penthouses were sold?
The interesting thing about trading is that you don't actually have to understand what is happening. Knowing more doesn't seem to lead to better decisions; in fact, the more you know, the worse your decision making seems to become. This applies to the wider world as well as to trading.
Consider the case of a cardiologist named Lee Goldman, who developed a decision tree that used only four factors to determine the likelihood of a heart attack. Goldman's decision tree was put to the test at Cook County Hospital in Chicago to see whether it worked better than the judgements made by staff cardiologists. Goldman's tree turned out to be 70% better at identifying patients who were not having a heart attack. It was also safer: experienced cardiologists were correct about the most serious patients between 75% and 89% of the time, while the decision tree was right 95% of the time.
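The appeal of Goldman's rule is that it reduces a complex judgement to a handful of yes/no checks. Here is a minimal sketch of what a four-factor triage rule of that general shape might look like; the factor names, thresholds, and risk categories are my own illustrative assumptions, not Goldman's published clinical criteria.

```python
# Hypothetical four-factor triage rule, sketched in the spirit of
# Goldman's decision tree. Inputs and cut-offs are illustrative
# assumptions, not the actual published algorithm.

def triage(ecg_acute_ischemia: bool,
           systolic_bp: int,
           rales_present: bool,
           unstable_angina: bool) -> str:
    """Return a coarse risk category from four simple inputs."""
    # Count how many of the three secondary risk factors are present.
    risk_factors = sum([systolic_bp < 100, rales_present, unstable_angina])
    if ecg_acute_ischemia and risk_factors >= 2:
        return "high"
    if ecg_acute_ischemia or risk_factors >= 1:
        return "intermediate"
    return "low"

print(triage(False, 130, False, False))  # low
print(triage(True, 90, True, False))     # high
```

The point is not the specific thresholds but the structure: a tree this small ignores almost everything a specialist knows, and that is precisely what keeps its judgements consistent.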
Knowing more doesn't always lead to better decisions.