
Why Haven’t We Improved At Making NFL Predictions?

Yesterday, we looked at the biggest “covers” in NFL history: those games where the final score was farthest from the projected margin of victory. In a 2010 game in Denver, the Raiders were 7-point underdogs, but beat the Broncos by 45 points. That means the point spread was off by 52 points, the most in any single game.

The first year for which we have historical point spread data is 1978. That year, the average point spread was off by 9.9 points (in other words, the team that covered did so by an average of 9.9 points). That number probably doesn’t mean much to you in the abstract, so let’s give it some context. From 1978 to 1982, the average point spread was off by 10.4 points. Over the last five years, the average point spread has been off by… 10.3 points.
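If the metric itself sounds fuzzy, here is a minimal sketch of how it could be computed, assuming a hypothetical list of games where each row records the favorite’s point spread and the favorite’s actual margin of victory (negative when the favorite loses); the sample rows are made up, not real results:

```python
# Minimal sketch of the "average cover" metric described above.
# Each game is (spread, actual_margin), both from the favorite's
# perspective. These rows are hypothetical, not actual games.
games = [
    (7.0, 10),   # favorite laid 7, won by 10 -> spread off by 3
    (3.5, -4),   # favorite laid 3.5, lost by 4 -> spread off by 7.5
    (10.0, 28),  # favorite laid 10, won by 28 -> spread off by 18
]

# The "cover" in each game is the absolute gap between the projected
# and actual margins; the season-level number is just the mean.
average_cover = sum(abs(margin - spread) for spread, margin in games) / len(games)
print(round(average_cover, 1))  # 9.5 for these made-up games
```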

Now I’m not quite sure what you expected, but isn’t that weird? In 1978, Vegas bookmakers were using the most rudimentary of models. Think of how much further along we are in football analytics than we were four decades ago. All of that work, of course, has to have made us *better* at predicting football games, right?

But don’t these results suggest that we are not any better at predicting games? If Vegas was missing games by about 10 points forty years ago, why is it still missing games by about 10 points? One explanation is that the NFL is harder to predict now, which… well, I’m not so sure about that. After all, even if you think free agency and the salary cap bring about parity (a debatable position regardless), it’s not as though the lines become more accurate later in the season, once we know more about each team. Games are also slightly higher scoring now, so you could argue that we should measure how far off each game was as a percentage of the projected over/under.
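For what it’s worth, here is a rough sketch of what that scaled version of the metric might look like, again with hypothetical numbers standing in for each game’s spread, actual margin, and projected over/under:

```python
# Hypothetical games: (spread, actual_margin, over_under), with the two
# point values taken from the favorite's perspective. Scaling the miss
# by the projected total would adjust for higher-scoring eras.
games = [
    (7.0, 10, 44.5),
    (3.5, -4, 41.0),
    (10.0, 28, 51.5),
]

# Raw miss in points, and the same miss as a share of that game's total.
raw_misses = [abs(m - s) for s, m, _ in games]
scaled_misses = [abs(m - s) / total for s, m, total in games]

print(sum(raw_misses) / len(raw_misses))        # average miss in points
print(sum(scaled_misses) / len(scaled_misses))  # average miss as a fraction of the over/under
```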

Let’s look at the data. The graph below shows, in blue, the average “cover” in each game for each year since 1978. As it turns out, 2016 was a really good year for Vegas: the average cover was just 9.0 points, which ranks as the most accurate season ever. However, there’s no evidence that this was anything more than a one-season blip: 2013 and 2015 were average years, and 2014 was the least accurate season ever. It’s not like our prediction models just started getting sophisticated last season.

For reference, in the orange line, I have also shown the average point spread for each game.  That line has also been pretty consistent over time, with the average spread usually being just above 5 points.
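For anyone who wants to reproduce something like those two lines, here is a sketch of how they could be built from game-level data, using a hypothetical list of (season, spread, actual margin) rows rather than the actual data set behind the chart:

```python
from collections import defaultdict

# Hypothetical game-level rows: (season, spread, actual_margin), with both
# point values taken from the favorite's perspective.
games = [
    (1978, 6.0, 14),
    (1978, 3.0, -7),
    (2016, 4.5, 3),
    (2016, 9.0, 20),
]

by_year = defaultdict(list)
for season, spread, margin in games:
    by_year[season].append((spread, margin))

for season in sorted(by_year):
    rows = by_year[season]
    # Blue line: average absolute gap between the spread and the actual margin.
    avg_cover = sum(abs(m - s) for s, m in rows) / len(rows)
    # Orange line: average size of the point spread itself.
    avg_spread = sum(s for s, _ in rows) / len(rows)
    print(season, round(avg_cover, 1), round(avg_spread, 1))
```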

On average, the “winning” team in Vegas has covered the spread by 10.4 points over the last 39 years. Should we expect that number to ever decrease? There is a certain level of unpredictability inherent in sports in general, and it’s probably even higher for football in particular. That said, perhaps the better question is why that number wasn’t even higher. Even if you limit the look back, consider 1994 and 1995: even then, the average cover was less than 10 points in both seasons. And then think about how antiquated our prediction models were 23 years ago.

What conclusions can you draw?
