There are two questions to ask when evaluating a political prediction, whether it's from Nate Silver, a pollster, an academic or your favorite Yahoo News predictions blog:
A) How useful was the prediction the day before the election?
B) How useful was the prediction the day after the election?
A great deal of attention is devoted to scoring the performance of various seers and prognosticators on Point A. We went 50 for 51 in that regard, getting every state correct except Florida—of course it was Florida—in our last prediction before voters went to the polls. (We might humbly point out that our original prediction, announced in February, was precisely the same as the one we made on Nov. 5. And predictions in February are a lot more useful to the multibillion-dollar campaign industry than predictions in November.)
Evaluating Point B is trickier. Have forecasters like Silver, who relies primarily on aggregating polls, taught us anything about how elections work and what motivates voters?
While polls do offer some insight into how public opinion responds to high-profile events—though always at a delay of at least a day—they're powerless to reveal the high-level factors, such as the economy, that influence elections months and even years ahead of time. That's why The Signal prefers to start with models, like the one we debuted in February: It teaches us which factors correlate with election results and which do not.