Michael Silver of Yahoo is apparently at 74% this season, and was at 78% before this past week. You can probably get around 70% just by picking the favorite in every game. I'd be more curious to see the postseason predictions from DVOA rankings over a span of several decades.
Last year's playoffs (DVOA rank in parentheses):

AFC:
(#6) Jets over (#15) IND - Correct
(#6) Jets over (#1) NE - Incorrect
(#4) BAL over (#17) KC - Correct
(#2) PIT over (#4) BAL - Correct
(#2) PIT over (#6) Jets - Correct

NFC:
(#3) GB over (#5) PHI - Correct
(#30) SEA over (#10) NO - Incorrect
(#3) GB over (#8) ATL - Correct
(#16) CHI over (#30) SEA - Correct
(#3) GB over (#16) CHI - Correct

Super Bowl:
(#3) GB over (#2) PIT - Incorrect

DVOA playoff record 2010: 8-3. Pretty good, no? Upsets will always happen, but 8-3 last postseason is pretty damn good.
Also, what metric do you use for "favorite"? The betting line? That is actually a very bad indicator of team success; it's an indicator of where the money will go. Teams like the Pats, Steelers, and Cowboys ALWAYS end up with inflated lines because of their large fan bases. DVOA becomes more robust as the season goes on; this week it went 10-4. Silver went 6-8.
Half of a season is a small sample size for determining accuracy for something like this. A single postseason (11 games) is laughable. On any given week, there are only going to be a handful of games where there isn't a clear favorite (and thus where the different predictions don't all agree). These are really the only games that matter in terms of separating one prediction method or algorithm from another. If you call that an average of 3 games per week, then you have a total of 27 points of data right now. With the amount of randomness and variation in the NFL, that is practically nothing.

I also would be interested to see the accuracy of DVOA since its inception (I believe 2003). If you really wanted to convince me of anything, you could research the accuracies of other predictions over the same span for the sake of comparison.

I also noticed that in their article (http://www.footballoutsiders.com/dvoa-ratings/2011/week-9-dvoa-ratings), Football Outsiders specifically mentioned that their high ranking of the Jets does not coincide at all with other "advanced rating systems". For example, Pro Football Reference's system has the Jets ranked 11th (http://www.pro-football-reference.com/years/2011/). Jeff Sagarin has a highly regarded system that has the Jets ranked 8th (http://www.usatoday.com/sports/sagarin/nfl11.htm). Advanced NFL Stats has the Jets ranked 16th in opponent-adjusted success rate (http://wp.advancednflstats.com/teamSR.php). How do these systems compare in accuracy to DVOA over a long period of time (e.g. 2003-2011)? If there is no significant advantage (mathematically), it is hard to take this ranking for more than it is... one person's (or system's) opinion.
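To put a number on how little 11 games actually tells you, here's a quick sketch (my own illustration, nothing from Football Outsiders) of the 95% confidence interval around an 8-3 record, using the standard Wilson score interval for a proportion:

```python
import math

def wilson_ci(wins, n, z=1.96):
    """Wilson score 95% confidence interval for a win proportion."""
    p = wins / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 8-3 playoff record: the interval spans roughly .43 to .90,
# so it can't distinguish a coin flip from a great predictor.
lo, hi = wilson_ci(8, 11)
print(lo, hi)
```

The interval is nearly half a percentage-point wide, which is the whole point: 11 games is consistent with anything from "barely better than a coin flip" to "near-perfect".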
RMorin was specifically asked how DVOA performed during the playoffs; he was answering that question. The large paragraph you typed out is the same one I included in the block above. The author of the system is a Pats fan; I'm sure it is biased toward his opinion.
I asked over several decades - one postseason is not any kind of sample size. DVOA is impressive, but they themselves recognize some faults in their own system. And yeah, I think that the "favorites" as established by the betting line would get close to 70% just by picking winners.
I don't care how it predicted those things in one season's playoff results. That's too small a sample to take seriously. I do care that it's trying to tell us that the Jets are better than the Packers so far this season, which is a completely laughable assertion. It's not even just laughable; it's absurd. A metric that produces absurdities is itself absurd.
You must be a troll. Your ENTIRE argument against DVOA was that the 2009 playoffs were predicted incorrectly, which I have mathematically refuted at least three or four times now. That's why I explained it over and over in multiple threads; you kept telling me I'm wrong when MATHEMATICALLY I'm not. So go to hell; now that I have proved you wrong in front of everyone, you saying "one postseason is not a big enough sample size" makes your opinion worth nothing. FYI, refer to this thread: http://forums.theganggreen.com/showthread.php?t=68707 YOU NEVER ANSWERED THE ONE QUESTION I POSED TO YOU: how does DVOA predict 74% of winners this season (that's 10 weeks!!!) and still be unreliable? Finally, this idiot seriously stated that the Dolphins would fire Sparano the day after he won their first game. Pretty sure he still has a job. http://forums.theganggreen.com/showthread.php?p=2294170#post2294170 Clearly he knows nothing about the NFL. Do not feed the trolls. When I hear idiots on WFAN I think, who the @#$% are these people? Now I understand: they are people like Bradysux (in whatever his stupid spelling is).
It's a useless metric. In the last decade:
- The top-ranked team won the Super Bowl once.
- The top-ranked team appeared in the Super Bowl three times.
- A team not ranked in the top 5 won the Super Bowl four times.
- A team not ranked in the top 5 appeared in the Super Bowl nine times.
Br4dw4y5ux ====> Bradway Sux ====> Terry Bradway (Jets GM prior to Tannenbaum) sucks. I am still interested to see whether DVOA stands up over a longer period of time, or more specifically, whether you can support the notion that DVOA is significantly more accurate than the other "advanced ranking metrics" that seem to generally favor different teams. Another interesting point is that the Patriots defensive DVOA increased (i.e. got worse) after facing the Jets and forcing seven 3-and-outs in ten drives. And that was after getting handled by MIA, SD, and Buffalo, so I'm sure the defensive DVOA was already abysmal going into that game. Some things just don't make sense going by their metrics.
Answer my question. Either answer or stop responding. I am asking you a simple and direct question: why does DVOA predict better than 74% of games head to head this season if it is a "useless metric"? Once you do that, I'll respect your opinion. However, you don't know what you are talking about. You legitimately argued with me that Sparano would be fired the day after he won his first game. So far I am 2-0 against you. I hate to finally bring this up (the gang green - I've tried over multiple threads to educate this guy before bringing this up), but I have taken multiple graduate school statistics courses. I am a doctoral candidate at UMass. I know what the hell I am talking about statistically. You don't. You KEEP talking about playoff seedings, but you have never, ever mathematically refuted the fact that DVOA rankings are an accurate predictor of winning. Keep living in a dream world.
DVOA goes back 20 years on Football Outsiders. It does stand up. The Patriots' DVOA got worse because of the very fundamental notion of DVOA: that a ten-yard completion is worth a different amount depending on the significance of the play. If you remember, the Patriots let the Jets offense drive down the field to put them within one score in the fourth quarter. That's a big deal no matter how you cut it. DVOA is a VERY smart system; it looks beyond the simple stats to situational stats. Had the Jets D / Pats O performed differently on that last drive, the Jets could have been in a position to win the game. The Pats O sealed the game; the Pats D let them back in.
I agree with what you are saying, and simple people who don't understand statistical concepts will never grasp what DVOA is saying.
Am I supposed to take your word for it? What is its accuracy over that span, and how does that compare to similar systems (like the ones I linked to earlier)?

You don't have to have a "smart" system to notice that the Pats' defensive performance against the Jets was an upgrade compared to the four weeks prior. Ten Jets drives resulted in seven three-and-outs. That is tremendous defensive play, regardless of the other three drives. Keeping the offense off the field so efficiently is a huge accomplishment. One of those remaining "drives," which culminated in a touchdown, was really just the special teams' blown coverage allowing McKnight to get into our red zone on the return. That should be accounted for in a smart system (i.e., a large portion of the "blame" should be shifted to ST). That leaves two touchdown drives which the defense should be held accountable for.

So one of those came in a "clutch" situation... sure, you can increase the "strength" of that drive. But unless that "clutch" factor is hugely overblown, there is simply NO POSSIBLE WAY that the performance as a whole can be considered a "downgrade" compared to MIA+SD+BUF+OAK collectively. It simply makes no sense, especially when the "clutch" situations are mostly determined by the production on the offensive side of the ball.
I'll make it very simple for you: either read and understand the methodology behind the stat, or listen to people who have. One thing you cannot argue is that, head to head, DVOA is in fact a very successful predictor of wins.
"Very successful" is a relative term. If the other systems boast similar accuracy yet have significantly different ratings, then all that means is that DVOA has separated the good teams from the bad teams... it hasn't succeeded in separating the "elite" from the "very good" from the "good" from the "average," etc. And please don't give me ESPN analysts as a comparison. That is a joke in itself.
It isn't just simple people not understanding statistics. DVOA is a complicated system which weighs different statistics to come up with a comprehensive ranking. Since Football Outsiders doesn't release that formula, it is hard to determine exactly what things are measured, how much "weight" they carry, and, most importantly, whether the right balance was struck among all of those things. The only real way to judge the merit of the DVOA results is to compare them to either (a) your perceived rankings, or (b) whether higher-ranked teams consistently beat lower-ranked teams.

What Br4dw4y5ux and I are arguing is that the system doesn't always pass test (a). The DVOA rankings of different Patriots defensive performances don't seem to match up with how strongly they actually played (IMO). Furthermore, the fact that the Jets are ranked higher than GB seems questionable at best.

RMorin is demonstrating example (b) by saying that the system has 74% accuracy this season and was 8-3 in the playoffs last year. The problem is that both are terribly small sample sizes for this type of analysis, and he provides no basis for comparison. I don't know whether those numbers are good or bad. Anyone can create a very simple ranking system that will differentiate very good teams from very bad teams. Most likely that system will result in an accuracy greater than 50%. Maybe it will even be close to the 74% that DVOA boasts for this season... I don't know.

While RMorin is preaching DVOA to be the most advanced, most accurate system that exists, he is providing no reasonable comparisons to actually make that judgement. Several people have pointed out flaws in the numbers he is throwing out here, but he only seems interested in repeating those numbers and acting like we are too stupid to understand basic concepts.
While he is supposedly nearing a doctorate in a statistics-related field, he seemingly has no interest in approaching this with a reasonable sample size, or in making comparisons to other systems to establish significance.
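The "74% vs. the ~70% you'd get by picking favorites" comparison can actually be tested. Here's a rough sketch with assumed numbers (about 140 games through week 10, with DVOA right on about 104 of them, i.e. ~74%; the true game count isn't given in the thread): an exact binomial test of whether that record is distinguishable from a baseline that hits 70%:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Assumed: ~140 games through week 10, DVOA correct on ~104 (74%).
# How often would a 70%-accurate "pick the favorite" baseline do
# at least that well purely by luck?
p_value = binom_sf(104, 140, 0.70)
print(p_value)  # well above the usual 0.05 cutoff
```

If the p-value is large (and with these numbers it is), then half a season of 74% simply cannot be statistically separated from a 70% pick-the-favorite baseline, which is the whole sample-size argument in one line.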