Is week 1 a window into a team’s soul? Or is week 1 best left ignored by analysts, since results are skewed by teams that are still shaking off the rust from the summer? As it turns out, week 1 isn’t just like any other week: it’s more like any other week than, uh, any other week. What do I mean by that?
Let’s begin with a pair of competing hypotheses. The best teams in the league are [more/less] likely to win in week 1 than they are normally, because the best teams are [at their best/rusty] in week 1. How would we go about testing which is true?
One method would be to take a weighted average winning percentage of teams in week 1, with each team weighted by its actual season-ending winning percentage. For example, the Patriots went 16-0 in 2007, which means New England was responsible for 6.25% of all wins in the NFL that season. That year, the Colts went 13-3, so Indianapolis was responsible for 5.1% of all wins. If we want to know whether good teams play [better/worse] in week 1, we care a lot more about how teams like the ’07 Patriots and Colts fared than how the average team fared.
By using weighted average winning percentages, we place more weight on the results of the best teams, which is exactly what we want to do. So when the ’07 Patriots and ’07 Colts both won in week one, rather than counting the same as any other two wins, together they are now responsible for over 11% of the NFL’s weighted week 1 winning percentage. Of course, you can probably figure out pretty quickly that this methodology ensures that the “average” winning percentage over the course of the season will be quite a bit over .500, since the best teams will win more often than not. And that’s exactly what we see: the average weighted winning percentage across all weeks, using this methodology, was 0.574. As it turns out, that’s exactly what the average is in week 1, too.
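The weighted-average idea can be sketched in a few lines of code. This is a minimal illustration using the ’07 Patriots and Colts figures from the text; the function name and the assumption of a no-tie, 256-win season (32 teams times 16 games, one win per game) are mine, not the article’s actual model.

```python
# A minimal sketch of the weighted-average method described above,
# using the '07 Patriots (16-0) and '07 Colts (13-3) from the text.
# Assumption: a 32-team, 16-game season with no ties produces 256
# total wins, so each team's weight is its season wins divided by 256.

def weighted_week_pct(results, total_league_wins=256):
    """results: list of (season_wins, week_result) pairs, where
    week_result is 1 for a win and 0 for a loss that week."""
    return sum(season_wins / total_league_wins * week_result
               for season_wins, week_result in results)

# Both teams won in week 1 of 2007, so together they account for
# (16 + 13) / 256 of the weighted week 1 winning percentage,
# i.e. a bit over 11%, matching the figure in the text.
share = weighted_week_pct([(16, 1), (13, 1)])
print(round(share, 4))  # 0.1133
```

Summing this quantity over all 32 teams in a given week yields that week’s weighted winning percentage, which is what the graph below plots.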
The blue line in the graph below shows the average weighted winning percentages in each week of the season; the red line shows the average across all weeks:
You’ll note that weeks 6 and 7 seem to be two of the worst weeks for the best teams, while weeks 11 and 12 are the two best. To use a concrete example, teams that won at least 10 games in a season from ’02 to ’13 had records of 77-41 (0.652) and 71-36 (0.664) in weeks 6 and 7,[1] but were 95-29 (0.766) in week 11 and 92-34 (0.730) in week 12.[2] Meanwhile, all 10+ win teams were 91-38 (0.705) in week 1 over the last 12 years.
So what does this mean? Not much, most likely. We know that splits happen. The results at the end of the graph aren’t too surprising: in weeks 15 through 17, the best teams are more likely to be resting players (or nursing injuries), so you would expect them to fare a bit worse in those weeks. Weather also impacts late-season games, neutralizing certain talent advantages.
As for week 1, we can safely reject both hypotheses: the best teams are neither better nor worse in week one than you would expect. If a team fares poorly in week 1, it’s fine to dismiss that performance just because it’s one week. But it doesn’t make much sense to dismiss it because it’s week one.
- You might wonder, looking at this, why week 6 isn’t the worst week. There are other outliers in the data, such as 2- or 3-win teams going 5-11 in week 7 but 2-15 in week 6, that make week 7 the worst week for the best teams. [↩]
- Again, the model used to create the graph includes every team’s results (weighted by season wins), not just those of the 10+ win teams, so the numbers here won’t perfectly match up. One explanation for why week 11 looks better than week 12 in this sentence is that teams that won exactly 10 games won 66% of their games in week 11 but just 57% in week 12; if you exclude the 10-win teams, the top teams look much better in week 12. [↩]