Fractal Analytics Blog

Cause or Effect? Why I Am Relieved Nate Silver Was Wrong About the Super Bowl
By Careen Foster
February 5, 2013

Nate Silver has earned public acclaim for his accurate predictions about elections. Turns out, his approach isn’t panning out for football (so far). Here’s why I’m not bothered (and why I would have been if he had been right again).

First, let me say I think we have tremendous opportunities to apply science to social predictions. Perhaps we can learn how to avoid a fiscal cliff, a pandemic, or a stock market crash. But I wonder: at what point does a prediction go beyond identifying a probability and actually cause the effect?

Social predictions

Growing up, I was enthralled by Isaac Asimov’s Foundation series, where our hero Hari Seldon used sophisticated mathematics to make accurate social predictions generations in advance. My recollection is that Hari was not caught up in predicting specifics; rather, he would predict a major social trend, such as a war, but not the specific individuals who would be involved.

If Nate Silver had been right about the Super Bowl, and continued to be right about predictions involving human decisions (isn’t that just about everything?), what happens when Nate or someone else decides to predict a social event, something negative like political unrest?

At what point does a prediction become so accurate and trusted that it becomes a self-fulfilling prophecy (for physics fans, closer to the observer effect than to Heisenberg’s Uncertainty Principle), where a leading indicator actually creates the event by sheer force of collective expectation?

Let’s play hypothetical

So let’s use a hypothetical example and see what happens. Let’s say we make social predictions that turn out to be accurate. For example, let’s pretend the data suggests a stock market crash and a run on the banks will occur in the year 2020 (just a hypothetical; I don’t have data or a prediction to support this). Let’s also say we have learned to trust the prediction, because the man-machine algorithm has proven accurate in 90% of its last 30 predictions over 5 years. I’d say that’s a strong enough track record to give me a fair degree of confidence in the next prediction.
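As a rough sanity check on that track record (my own back-of-the-envelope, not part of the original argument): 27 hits out of 30 works out to a 95% Wilson confidence interval of roughly 74% to 97%, so even a strong-looking record leaves real uncertainty about the true accuracy.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 27 correct out of 30 predictions: observed accuracy 90%
lo, hi = wilson_interval(27, 30)
print(f"95% interval: {lo:.1%} to {hi:.1%}")
```

In other words, a 90% observed hit rate on only 30 trials is consistent with a true accuracy anywhere from the mid-70s on up, which matters when we start trusting the next prediction.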

Cause or effect?

To some extent, we already live with this effect through media reports of consumer confidence, extensive polling, detailed election coverage, what-if scenarios, and too many pundit predictions to count. The difference I see between the social predictions we use today and those that may come in the future is the degree of acceptance and influence.

Pundits and general sentiment drive some portion of the prediction-causation effect, but how much? With statistically driven man-machine computations proven valid at close to 100% accuracy, these predictions would serve as a modern marionette, driving behavior rather than measuring it. In this scenario, doesn’t the effect become a lot more certain?

Predictions based on exit polls are banned in countries like India, where elections are held in multiple phases, possibly for this very reason: to ensure the predictions do not impact election outcomes.

See the Granger causality test, developed by Nobel laureate Clive Granger.

What do you think?

Which scenarios run the risk of the prediction itself impacting the results? Are there certain situations where we need to be wary?

What happens if we have strong predictions and no way to validate them until after the fact?

If we have a trusted prediction source that influences group social behavior, how do we separate the cause from effect?

What do you think will happen to trust in experts?

Do you think we can avoid moving toward determinism?
