Robo Called The Big Game Early ... Is The 50-Year Super Bowl Indicator Dead?

Amazon-backed statistical models picked the wrong winner. Wild human talent threw a proverbial curve ball. Success is how well you roll with the surprises.

On Friday, the most sophisticated artificial intelligence had already called the Super Bowl. Technology built inside Amazon’s cloud services unit and endorsed by the NFL itself picked the 49ers to win by anywhere from 3 to 13 points.

The only question the modelers had was how to capture Mahomes’ eye-popping arm. The league could provide flawless historical data on every other aspect of the game, but clearly failed to reckon with that cosmic wild card.

Ordinarily that failure would be just another bad bet. But because this particular big data platform also feeds Amazon’s forecasting capabilities far beyond sports, the miss casts a cloud over more than the fantasy football managers who live and breathe statistical outcomes.

A related analytic system runs McKesson’s consulting front end, for example. And if you rely on Alexa’s prediction capabilities, the same core technology is there, too.

Someone who trusted that technology would have backed the wrong team, or not even bothered to watch the game, thinking they already knew the outcome.

And until they got word of the real winner, every decision they made after that was built on faulty information. Once garbage gets into the model, the model delivers a mix of garbage and accuracy.

This is bigger than football. Apply the Mahomes factor to the market and you’ve just described every statistical investment platform on the planet.

The magic word is “adaptive”

Ironically, the majority of ESPN football writers picked the Chiefs to win, so this is not a situation where everyone got blindsided.

Human expertise saw something in the Kansas City team that made those writers back that side. They probably couldn’t describe or quantify it well enough to feed into a computer model, but they were right.

Now the technocrats need to figure out how to factor in that unique human talent. Maybe they’ll get it right next year. Sooner or later, however, another one-of-a-kind talent will come along and break the model again.

All forms of “robot” investing ultimately follow a similar trajectory. Markets conform to predictable parameters until they don’t. 

The breaks usually come when irrational factors intrude on the scenario. Sometimes those irrational factors come from human beings: our limits and our ambitions.

Insulate the market from every human actor and I suspect it would run like clockwork, capturing every fluctuation in the fundamentals perfectly. Every stock would be fairly valued. Volatility would drop to a background hum.

But as long as we have human actors in the system, we’ll have to live with discontinuities. People get old. Their cash flow changes and puts different stresses on the portfolio. When they die, the portfolio needs to change to reflect what their heirs need.

Even the best robots will be forced to get out of our way. And in the short term, humans will figure out how to exploit the robots’ mistakes.

Someone with high conviction could have made a lot of money betting against Amazon’s Super Bowl call. All you need is ambition and confidence.

Beyond the indexed world

I think we’re in the early stages of seeing active stock picking evolve to capture opportunities the robots can’t reach. After all, when institutional fund managers ruled the world, retail money got smart by crowding into stocks too small for the big money . . . or abandoning timing entirely and letting the random walk do all the work.

The random walk, in turn, gave us the modern world of index fund domination. That’s the status quo that few investment robots are equipped to bet against effectively, or even hedge around.

And that’s okay. But we’re clever creatures who will always try to game the system for a better outcome. We bend the norms in ways the robots currently can’t.

Often, that means doing the dumb thing. Human error has always wrecked a lot of portfolios as we get sucked into someone else’s fallacies or simply read the market wrong.

Sometimes human error works on our behalf. The other trader gets a fat finger or needs to stop out for external reasons. We exploit the circumstance, occasionally pausing to contemplate that one day the positions will be reversed.

The robot exploits the error and stays on course. Something like complacency sets in. The models predict reality too well. 

When reality bites back, it’s always irrational. That irrational factor this year was Mahomes. It’s up to Amazon and the programmers to learn from this.

Maybe they’ll hedge against the unknown a little better next year. They’ll keep learning, if they’re smart. We’ll keep learning because it’s what we do.

If you aren’t always adapting to the unknown and anticipating it on behalf of your clients, you’re not really adding value. But I think you are. 

You anticipate how your clients will react to every turn of the news cycle. You know from experience who is going to call in a panic despite all your coaching, because you grew up in a human world.

The coaching perspective 

And that’s what the Amazon algorithms are actually being developed to do. The company isn’t contemplating a big entry into sports betting at all.

These predictions are for coaches so they can manage human talent more effectively and train the people they have. It’s all about the apps on the tablets you see on the field now.

It takes a human being to hold the tablet and make the calls on behalf of human players. That’s what the advisor does.

Talent on its own is a random walk. You throw the ball and hope someone is there to catch it.

Put even a little predictive power in the hands of human leadership and an underdog team can do amazing things. Bring on the A.I. market. Retail investors will still make a lot of money.

But think of all the patterns that every market participant has been trained to observe, all the correlations like the "January Effect" or the Super Bowl Indicator.

Remember when we all sold in May and went away for six months because there just wasn't any upside to capture? People learned to game that pattern and now it's broken. 

Likewise, people and the robots we've trained are probably looking to game the Super Bowl results now. A win by an old-AFL franchise like the Chiefs means the bears are in control, right?

I don't think so. What this shows us is that the algorithms aren't in perfect tune with reality. If they got this wrong, they'll probably make other mistakes until the programmers make corrections.

Right now algorithms tuned to read an AFL win as a sell signal are making mistakes. Humans can pass into the gap.
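For anyone curious what that kind of rule looks like under the hood, here is a minimal sketch of the Super Bowl Indicator coded as a naive sell signal. Everything in it is an illustrative assumption of mine: the `SuperBowlResult` structure, the league labels and the placeholder return figure are not drawn from Amazon's models or from real market data.

```python
from dataclasses import dataclass


@dataclass
class SuperBowlResult:
    year: int
    winner: str
    original_league: str  # "AFL" or "NFL", per the indicator's pre-merger framing


def indicator_signal(result: SuperBowlResult) -> str:
    """Old-AFL winner -> bearish call; old-NFL winner -> bullish call."""
    return "sell" if result.original_league == "AFL" else "buy"


def hit_rate(results, annual_returns) -> float:
    """Fraction of years in which the indicator matched the market's direction."""
    hits = 0
    for result in results:
        predicted_down = indicator_signal(result) == "sell"
        actually_down = annual_returns[result.year] < 0
        hits += predicted_down == actually_down
    return hits / len(results)


if __name__ == "__main__":
    # The Chiefs are an original-AFL franchise, so the rule reads their
    # 2020 win as a sell signal. The return figure is a placeholder.
    history = [SuperBowlResult(2020, "Chiefs", "AFL")]
    fake_returns = {2020: 0.10}
    print(indicator_signal(history[0]))     # -> "sell"
    print(hit_rate(history, fake_returns))  # -> 0.0, the signal missed
```

Gaming a pattern this simple is trivial, which is exactly why such patterns stop working once enough people and machines trade on them.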

 
