Walking up to a roulette table, a visibly nervous Englishman lays down a pile of chips worth over $135,000: his entire life savings plus the proceeds from selling every last one of his possessions. The man, Ashley Revell, turns to the crowd and, with an uncomfortable laugh, asks “red or black?” He turns back to the table and puts it all on red. The croupier spins the wheel, the ball careens over the vanes, and it drops onto 7, a red slot.
In a matter of seconds, the spin comes out in his favor and Revell doubles his net worth. The outcome was unquestionably good for Revell, but was it a good decision?
If you answered that question with “yes,” then you’re guilty of the fallacy of resulting.
Judging the Quality of a Decision by the Process, Not the Outcome
Resulting is the name Annie Duke gives to this common cognitive bias. It occurs when we judge a decision by its outcome (e.g. win or lose) rather than by the quality and odds of the decision itself.
Ashley Revell’s example should be clear. There are 37 slots on a single-zero roulette wheel: 18 red, 18 black, and one green 0. That gives his red bet an 18/37, or 48.6%, chance of doubling his money, which means the expected value of every spin is negative (a loss of about 2.7% of the stake, on average). While it worked this time, the odds were against him. Repeating this all-in bet on red (or on anything in roulette, for that matter) will eventually lead to blowing up. For Revell, the outcome was tremendous, but the decision was poor.
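To make those odds concrete, here is a rough Python sketch (our own illustration, not from Duke’s book) that simulates players repeatedly going all-in on red with the same 18/37 odds and a $135,000 starting stake:

```python
import random

def all_in_on_red(bankroll: float, max_spins: int = 1_000) -> int:
    """Bet the entire bankroll on red every spin; return the spin on which it's all gone."""
    p_red = 18 / 37  # single-zero wheel: 18 red slots out of 37
    for spin in range(1, max_spins + 1):
        if random.random() < p_red:
            bankroll *= 2   # win: even-money payout doubles the stake
        else:
            return spin     # lose: the entire bankroll is gone
    return max_spins        # survived every spin (vanishingly rare)

# Send 10,000 simulated players through the same strategy.
lifetimes = sorted(all_in_on_red(135_000) for _ in range(10_000))
print(f"Median spins before going broke: {lifetimes[len(lifetimes) // 2]}")
```

Under these assumptions, more than half of the simulated players are broke after the very first spin, and essentially all of them are broke within a handful of spins. Each individual spin has negative expected value, so the only question is when, not whether, the blow-up arrives.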
To illustrate with the opposite example, we’ll turn to one from Duke’s book, Thinking in Bets.
The Super Bowl’s Most Infamous Decision
There are 26 seconds remaining in Super Bowl XLIX, and Pete Carroll’s Seattle Seahawks trail the New England Patriots by four points. The Seahawks have the ball, second and goal from the 1-yard line, with one timeout remaining and the clock running. A quick pass play is called, the QB drops back to throw, and the pass is intercepted in the end zone, essentially ending the Seahawks’ bid to be Super Bowl champions in dramatic fashion.
Immediately, the announcers begin criticizing the play, exclaiming what a terrible call it is.
The announcers, and most critics, are guilty of resulting. The play failed in its execution, but given the situation, it was actually an excellent call. This particular outcome had a 3.1% chance of occurring, and in the other 96.9% of possible outcomes, the Seahawks either win the game or get to try again. Analysis after analysis of the decision process has vindicated Pete Carroll and shown that he made the right call but got unlucky. Of course, this hasn’t prevented bitterness and befuddlement from those who fall into the resulting bias.
Separating Luck from Skill
Resulting can lead to catastrophic consequences, particularly in trading and investing. Consider an inexperienced investor who puts $20,000 into a stock after doing some research. He nets a 30% return on his investment, sells, and decides to try again.
The investor goes on a lovely winning streak, ups his commitment to the method, and further concentrates his portfolio. His success gets him believing in his own brilliance, which leads to more concentration and more risk.
One day, the crash comes, and our inexperienced investor learns a very expensive lesson: he’s not so clever after all.
The markets have a lot of random noise in them, and some people can get very lucky for long periods of time. They look like geniuses as their portfolios balloon and they continue to make money. But they’re relying solely on their results and not analyzing their process for flaws or weaknesses, oblivious to the actual odds and actual risks they’re taking. Every time they hit red on roulette, they congratulate themselves and continue on!
Combating Cognitive Biases
Never confuse genius with a bull market
How do we know when we actually have a good strategy, or when we’re just getting lucky?
The first step requires analysis of your decision-making process. Treat every decision as a scientific hypothesis. Be specific!
If you’re a value investor, lay out each of your assumptions and the factors contributing to your investment decision. Project out future cash flows, do your discounting, and try to understand how the company will sit within its industry over your investment time horizon and how that industry will grow over the same time frame. Update these metrics when you get new data (e.g. with quarterly reports) and see how well your assumptions and model are holding up. Soon enough, you’ll be able to confirm or reject your hypothesis and use that information to refine your process so you do better the next time.
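As a deliberately simplified illustration of laying those assumptions out, here is a minimal discounted cash flow sketch in Python. The cash flow, growth, and discount figures are hypothetical placeholders rather than recommendations; the point is that every input is explicit and can be checked against each new quarterly report:

```python
def discounted_cash_flow(cash_flow: float, growth_rate: float,
                         discount_rate: float, years: int,
                         terminal_growth: float) -> float:
    """Present value of projected cash flows plus a Gordon-growth terminal value."""
    value = 0.0
    for year in range(1, years + 1):
        cash_flow *= 1 + growth_rate                      # project next year's cash flow
        value += cash_flow / (1 + discount_rate) ** year  # discount it back to today
    terminal = cash_flow * (1 + terminal_growth) / (discount_rate - terminal_growth)
    return value + terminal / (1 + discount_rate) ** years

# Hypothetical assumptions: $100M in free cash flow, 8% growth for 5 years,
# a 10% discount rate, and 2% terminal growth.
print(f"Estimated value: ${discounted_cash_flow(100, 0.08, 0.10, 5, 0.02):,.0f}M")
```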
If you’re a systematic investor, think through the implications of each signal you’re using and why you’re using it. Are you trying to ride trends or catch outliers for mean reversion? What’s your time frame? Does your rebalancing and trading schedule give trends enough time to develop or prices enough time to revert? What markets and securities are you trading? How do these tend to behave, and do they fit within the goals of your system? What are you doing for money management? In short, you’re building a hypothesis about what will work in your market.
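To take one concrete case, a trend-following hypothesis can be written down as code. The sketch below uses a simple moving average crossover in Python with pandas; it is a generic illustration, not a recommended signal, and the 50- and 200-day windows are arbitrary choices you would need to justify and test:

```python
import pandas as pd

def sma_crossover_signal(prices: pd.Series, fast: int = 50, slow: int = 200) -> pd.Series:
    """Return 1 (long) when the fast moving average is above the slow one, else 0 (flat).

    The hypothesis: recent strength tends to persist long enough, on a daily
    rebalance, to be worth holding until the averages cross back.
    """
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    return (fast_ma > slow_ma).astype(int)
```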
Once you have your system in place and understand your various design decisions, you can test it with a backtest. This will give you the statistics of your system and help you decide whether to trade it as-is or tweak your hypothesis and try again.
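A bare-bones version of that test, assuming daily prices and the crossover signal sketched above, might look like the following; a real backtest would also need transaction costs, slippage, and out-of-sample checks:

```python
import numpy as np
import pandas as pd

def backtest(prices: pd.Series, signal: pd.Series) -> dict:
    """Apply yesterday's signal to today's return and summarize the results."""
    returns = prices.pct_change()
    strat = (signal.shift(1) * returns).dropna()  # trade on the prior day's signal
    equity = (1 + strat).cumprod()                # growth of $1 following the system
    return {
        "total_return": equity.iloc[-1] - 1,
        "sharpe": np.sqrt(252) * strat.mean() / strat.std(),
        "max_drawdown": (equity / equity.cummax() - 1).min(),
    }

# Assuming `prices` is a pandas Series of daily closes:
# stats = backtest(prices, sma_crossover_signal(prices))
```

The summary statistics describe the process over many trades; those numbers, not the outcome of any single trade, are what you should be judging.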
In either case, it’s critical that you stick with your system once you go live with it to really test it. If the system is good, the results will come. Evaluating your system should come first and foremost via scrutiny of the process you employ.
Fighting Your Biases
We all fall prey to the resulting bias. It takes hard work and practice to improve our thought processes and the way we interpret the results we get.
When investing, clear systems provide an easier path to overcome the resulting bias. Backtests — while not perfect predictors of a system’s future performance — do provide some idea of what to expect in the long run. These systems yield clear rules that can be automated and followed perfectly without subjective work or rough estimates of what a company or industry will do in the future.
At Raposa, we provide the tools to build powerful trading systems with high-quality data and a no-code framework so you can test and implement your system quickly, get the relevant statistics, and go forward with your process in mind. Good processes yield good results.
Check out our free demo here!