Ask Joe and me anything about FBG

Has FBG ever thought about adding "FBG Live Chat" to the Community section?

staffers could pop in randomly (or even at an announced time for a special session), and members could log on whenever they need quick responses to WDIS/WDIDraft questions

i can see this being a great tool on sunday mornings, in addition to the forum injury threads

for an example of its implementation, i think dynastyleaguefootball.com has one

 
love the site, i'd give everything on it a 9 or 10 rating except for my question below ... any chance the forum search tool could be improved drastically?
what is it you're trying to improve? generally these internet forums are just software packages that people license like any other program --- not to say that it's impossible to mod them up a bit, but they're pretty much prepackaged 3rd party software. they didn't code this board.
yeah i knew it was third party, it's why i asked "any chance it could be" instead of "why haven't you fixed it" ;) one thing i'd fix is that it won't let you search for terms shorter than 4 letters
 
It has already been asked, but I'll add my voice to the herd:

Why is there no post-season analysis of how well each FBG prognosticator did? Without an evaluation at the end of the year, everything is just anecdotal smoke!

For that matter, the FF industry should have ratings at the end of the year comparing advice given by each FF site. I do not know who could do that since each site would have a vested interest in either blaring the good news or burying the bad news.

 
It has already been asked, but I'll add my voice to the herd: Why is there no post-season analysis of how well each FBG prognosticator did?
I don't believe we've ever come up with an ideal way to rate the accuracy of rankings/projections.
 
Aaron Rudnicki said:
I don't believe we've ever come up with an ideal way to rate the accuracy of rankings/projections.
What's the problem with comparing the projections to end of year results and analyzing why they were off/accurate?

My main question for FBG is: What's up with the site update? You click on MyFBG and it takes you back to the old style, what's the timeline to have everything streamlined and cleaned up?
 
My main question for FBG is: What's up with the site update? You click on MyFBG and it takes you back to the old style, what's the timeline to have everything streamlined and cleaned up?
This doesn't happen for me.....it goes to a page similar to the homepage re-design, which is nothing like the old MyFBG page.
 
Aaron Rudnicki said:
I don't believe we've ever come up with an ideal way to rate the accuracy of rankings/projections.
Google, Microsoft, Yahoo, etc. have figured out a way to measure the accuracy of search results. You would think it wouldn't be hard to come up with a way to measure the accuracy of fantasy football projections. Sure, sites can argue about the system (just like we do about MVP voting, HOF balloting, etc), but at least it's there.
 
My main question for FBG is: What's up with the site update? You click on MyFBG and it takes you back to the old style, what's the timeline to have everything streamlined and cleaned up?
This doesn't happen for me.....it goes to a page similar to the homepage re-design, which is nothing like the old MyFBG page.
I have this problem. Old system and bad Times New Roman font.
 
Google, Microsoft, Yahoo, etc. have figured out a way to measure the accuracy of search results. You would think it wouldn't be hard to come up with a way to measure the accuracy of fantasy football projections. Sure, sites can argue about the system (just like we do about MVP voting, HOF balloting, etc), but at least it's there.
i'm not sure that it's hard to measure as much as any measurement would show that the projections would appear to be wildly inaccurate
 
Google, Microsoft, Yahoo, etc. have figured out a way to measure the accuracy of search results. You would think it wouldn't be hard to come up with a way to measure the accuracy of fantasy football projections. Sure, sites can argue about the system (just like we do about MVP voting, HOF balloting, etc), but at least it's there.
i'm not sure that it's hard to measure as much as any measurement would show that the projections would appear to be wildly inaccurate
That's okay. I think the search rankings show search engines have massive room for improvement in providing relevant results, but it still shows which search engine is most relevant and helps encourage improved search algorithms. Accountability is good.

 
It has already been asked, but I'll add my voice to the herd:

Why is there no post-season analysis of how well each FBG prognosticator did?
I don't believe we've ever come up with an ideal way to rate the accuracy of rankings/projections.
There is no IDEAL way! But I bet most any math major could easily come up with an analysis along the lines of using the scoring regimen your site uses for projections and analyzing how statistically close each prognosticator was to the actual results. That sounds pretty easy to me. It's not rocket science and it doesn't have to be IDEAL!
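
For what it's worth, this kind of comparison takes only a few lines of code once projected and actual fantasy points sit side by side. A minimal sketch in Python; the file name, column names, and the use of mean absolute error are illustrative assumptions, not FBG's data or method:

Code:
# Hypothetical example: rank prognosticators by how close their projected
# fantasy points were to the actual results (lower error = more accurate).
import csv
from collections import defaultdict

errors = defaultdict(list)
with open("projections_vs_actuals.csv", newline="") as f:
    # assumed columns: expert, player, projected_pts, actual_pts
    for row in csv.DictReader(f):
        errors[row["expert"]].append(abs(float(row["projected_pts"]) - float(row["actual_pts"])))

# mean absolute error per expert, most accurate first
for expert, errs in sorted(errors.items(), key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"{expert}: MAE = {sum(errs) / len(errs):.2f} fantasy points")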

 
And, btw, whatever became of the list of questions this thread was supposed to generate?
 
And, btw, whatever became of the list of questions this thread was supposed to generate?
might want to check out the front page. your answer may be waiting.
 
might want to check out the front page. your answer may be waiting.
http://forums.footballguys.com/forum/index...howtopic=534570

 
might want to check out the front page. your answer may be waiting.
OOPS! I often breeze by the front page. I'll check it out more closely!

 
Sara Holliday (FF Librarian) started doing rankings analysis for all the FF websites via the FSTA. She blogs about it here:

http://www.fflibrarian.com/2010/01/2009-ac...lts-are-in.html

48 sites gave her rankings. We finished 11th.

Donnie from a website called themostcredible (that link now brings back a webpage that my virus protector does not like) presented with Sara and judged 22 different sets of projections. I submitted my numbers and I took second. I don't have a link to share because he gave a pitch at the FSTA and they never posted the results anywhere.

I paid money to Andy Hicks to track weekly projections from all of the pay sites, some free sites, etc. I came in first and Bloom came in second. But the second we even tried to publish the results, we had sites telling us that they would sue us, etc. That's the big problem. Competing sites that wing it will never give anyone permission to show results that make them look bad. And even though the content is useless to anyone going forward, it is paid content. We don't have the rights to publish it. Even getting everyone on board on how to judge stuff is next to impossible.

There was another site a few years ago who did this for inseason and we came out on top then too. They quit after a year as a pay website.

Even if no one here believes these results, ask these things of the sites that do weekly projections:

- Add up all the players. What are the total yards, yards per carry, and TDs that are expected? Do these conform with league norms?

- How often are the numbers updated after Friday? Many don't change them on the weekend. It doesn't take a genius to know sites predicting 50 yards from the guy that is inactive are going to suck (because others should be higher while he is at zero)

- Do they predict fractional TDs? I say this because it's ridiculous to project either 0 or 1 TDs for 90% of the TEs as most score 3-4 TDs per year.

- How many players are projected for at each position? Some of these sites list 50 WRs. I think we all know that more than 50 WRs will catch a pass each week. In some dual flex leagues, you might need to go 100+ deep just to properly evaluate all of your options.

I KNOW our stuff checks out because of the processes we use. And although Sigmund and I can differ wildly on the numbers we project for a player in a given week, we use similar processes to create our numbers. So it makes sense that in objective studies, we crush this stuff. The other site that always does well in these things is 4for4.com. That's because, like us, they simulate games and then make the numbers match this expected result. They also are one of the few websites (besides ourselves) that use fractional TDs. Most everyone else sucks at this, unfortunately. I wish I could prove it more easily, as it would clearly help sell subscriptions.

I would also ask this question. We have 40,000 subscribers and have grown our numbers every single year. If we sucked at projecting, do you think this would be the case? We have multiple people on staff with statistics backgrounds, including a PhD in stats in Doug Drinen. Do you think he would let us publish numbers that don't conform to historical norms?

I am done with this subject even though it will get asked 10,000 more times before the preseason ends. I get it. People are results-driven. They want us to prove to them how good we are. Like someone else said, though, the real thing is to show we suck way less than everyone else. But we can't show other people's work without getting sued. And therein lies the world of fantasy information services all claiming they are better than the other guy.
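
The checks David lists above are easy to automate against any site's weekly projection file. A rough sketch, assuming a hypothetical CSV with pos, rush_yd, rush_td, and rec_td columns (not FBG's actual format):

Code:
# Rough sanity checks on a weekly projection set, per the list above.
import csv

with open("weekly_projections.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# 1. Do the league-wide totals conform with what the NFL actually produces in a week?
total_rush_yd = sum(float(r["rush_yd"]) for r in rows)
total_rush_td = sum(float(r["rush_td"]) for r in rows)
print(f"Projected totals: {total_rush_yd:.0f} rushing yards, {total_rush_td:.1f} rushing TDs")

# 2. Fractional TDs: projecting only whole-number TDs for most TEs is a red flag.
te_rows = [r for r in rows if r["pos"] == "TE"]
whole_td = sum(1 for r in te_rows if float(r["rec_td"]).is_integer())
print(f"{whole_td} of {len(te_rows)} TE projections use whole-number TDs")

# 3. Depth: how many players are projected at each position?
for pos in ("QB", "RB", "WR", "TE"):
    count = sum(1 for r in rows if r["pos"] == pos)
    print(f"{pos}: {count} players projected")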

 
This is a much clearer picture than the typical "We just don't do it" rhetoric we've been given in the past. Thanks for posting this, David! A few years ago I asked for some type of qualifier/proof that your projections were really good; I got over the fact that you chose not to provide it and have been content to believe you guys were near or at the top. Using your info and consistently getting to the championship game (and winning it on occasion) sure helped. This post really brings it home and actually proves that you were willing to take a good hard look at your own results, as far as I'm concerned.

I did in fact see Ms. Holliday's analysis last year and thought she did a pretty good job of tying it all together. I hope she does it again this year.

Thanks again for this inside information! You guys are really showing us that you are here to help us WIN CHAMPIONSHIPS!!!

One more time, I gotta say that I am so impressed that you take the time to respond to us! There is nothing more satisfying than being able to communicate with the owners of my favorite website.

Kudos :goodposting: :thumbup: :thumbup: :thumbup:

 
Ruffrodys05 said:
This is a much clearer picture than the typical "We just don't do it" rhetoric we've been given in the past. Thanks for posting this, David! ...
Yes, I agree with Ruffrody. Good answer, David. I appreciate the time and thought you put into your response. Like I said before, I love the site and was only raising a question. Thanks for your responsiveness to your clientele!
 
David, do you guys project injuries at all or do you try to assume all starters will play all 16 weeks even though we know that's not going to be reality?

 
David, do you guys project injuries at all or do you try to assume all starters will play all 16 weeks even though we know that's not going to be reality?
We all project games played. Guys like Todd Heap and Donovan McNabb always get fewer starts. Known suspensions are always factored in. We build curves in the Draft Dominator for rookies, those coming back from injury, etc., to show where we expect the production to be weighted (this is used in spreading the fantasy points to the appropriate weeks). The issue with random injuries, though, is that they're essentially impossible to predict. Some major players will go down, and it's a guess who they are. I have never found a good way to deal with this except to try and draft a good, deep team. Brady was the #1 QB the year before he went down, etc. Maurile's numbers seem to project games missed better than the rest of us, but he gets lots of criticism every year for having super-low numbers. Since all of us use history to guide us, we do factor in RB dings, etc., by showing a player having, say, 220 carries. For the most part, we over-project all QBs every year because of injuries.
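
A toy illustration of the curve idea described above: spreading one season-long projection across the weeks with a weighting profile, e.g. a rookie expected to ramp up late in the year. The numbers are invented and are not the Draft Dominator's actual curves:

Code:
# Spread a season projection across 16 weeks using a weighting curve.
season_points = 180.0
weights = [0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
           1.0, 1.1, 1.1, 1.2, 1.2, 1.2, 1.2, 1.2]  # weeks 1-16

total = sum(weights)
for week, w in enumerate(weights, start=1):
    print(f"Week {week:2d}: {season_points * w / total:.1f} projected points")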
 
We all project games played. ...
Would it be possible to include Games Played on your projections? I prefer to draft players as if they will play all 16 games and basically would like to see what I can expect a player to average on a week that he plays. If he doesn't play that week, then I can fill in with a backup player.
 
I thought we added games last year. Yes we submit our numbers with games, but I now see it's not being shown on the projections pages. We will get this fixed soon.

 
David Dodds said:
I am done with this subject even though it will get asked 10,000 more times before the preseason ends. ... But we can't show other people's work without getting sued. And therein lies the world of fantasy information services all claiming they are better than the other guy.
David, I think we all understand why you can't line up your FBG team against all the other FF websites and see where you stack up (being sued, whatever), and that's fine. But why not some in-house analysis/breakdown of staff projections? Where you hit and where you missed, and more importantly why that may have happened? Did this staffer overlook something that this staffer didn't, did this staffer put too much stock in certain things, etc.?

I think the initial comment of it being "meaningless to look backwards" coming from a staff member here really confused some people, because a great deal can be learned by doing so. That answer, along with the "we just don't do it" schtick and the "I'm done with this even though it will get asked 10,000 more times," just doesn't really fly and may have seemed like a copout. If it's going to get asked 10,000 more times, it might be something to take a look at providing.

 
I would also ask this question. We have 40,000 subscribers and have grown our numbers every single year. If we sucked at projecting, do you think this would be the case?
Do you think those male enhancement pills would keep selling if they didn't really work?
 
I thought we added games last year. Yes we submit our numbers with games, but I now see it's not being shown on the projections pages. We will get this fixed soon.
Hi David, just wondering if this is still on the table to add. I saw the projections were updated yesterday but no new columns. Thanks!
 
I thought we added games last year. Yes we submit our numbers with games, but I now see it's not being shown on the projections pages. We will get this fixed soon.
Hi David, just wondering if this is still on the table to add. I saw the projections were updated yesterday but no new columns. Thanks!
:unsure:
I emailed Doug about it yesterday, but have not heard anything. It should be an easy fix
 
One more time, I gotta say that I am so impressed that you take the time to respond to us! There is nothing more satisfying than being able to communicate with the owners of my favorite website.
You're more than welcome, Ruffrody. I know sometimes we have stuff slip by where we miss a response, but you guys are the reason David and I are able to do this.
 
given the discussion in here, figured some people might be interested in the results of this study:

http://www.fantasypros.com/2010/08/accurac...ults/#more-2900

http://www.fantasypros.com/expert-accuracy-rankings-2009/

http://www.fantasypros.com/about/faq/accuracy-methodology/

We’ve had an overwhelming response to last week’s Accuracy Awards announcement. After being knee deep in data for the past several months, it was awesome to see that other people really care about this topic as much as we do. In light of this, we wanted to share some more of the detail and insights from the study.

Note: If you just want to check out the rank order of how everyone did, you can go to our Accuracy Rankings. If you crave a little more meat and want to see the actual numbers, you can go to our Accuracy Scores. If you’re looking for some bedtime reading, you can take a look at our Methodology.

Key Findings

1. The most accurate expert was between 19% and 31% better than the least accurate expert, depending on the position. RB had the tightest spread while QB had the largest.

2. There were drastically larger spreads when looking at individual weeks. On average, the difference between the best and worst expert was 58% for RB and 182% for TE.

3. When looking purely at whether the expert’s predictions were right or wrong (i.e. with no weighting based on the value of the predictions), the spreads were considerably tighter.

4. The data we’ve pulled together offers a ton of insight beyond just the summary accuracy scores that we’ve published so far.

More Thoughts

1. A lot of experts were bunched together with very little spread in their accuracy scores. This makes sense to us. Similar to sports betting, it’s really difficult for experts to distance themselves from the pack over the long haul. But also similar to sports betting, something as small as a 5% edge can make a huge difference. How many times have you lost by a small margin in your heads up match each week? Just a few points can mean the difference between a win and a loss! A few other takeaways:

* Be wary of sites that claim their advice is 40% more accurate than other sites – especially if they’re making you pay for this advice. It’s especially concerning that some of these sites don’t publish their methodology. If you have a site that claims this and can back it up, we’d love to hear from you and enter you in our 2010 accuracy study.

* Player list size and average fantasy points per position naturally influenced the size of the spreads. We assessed 40 RB and 50 WR spots compared to 20 QB and 15 TE spots. From a “fantasy points per prediction” perspective, QB was the highest and TE was the lowest. These two factors – fewer predictions and more points per prediction – contributed to the relatively larger accuracy differences for the QB position.

2. Similar to sports gambling again (disclaimer: we are not a sports gambling site and do not promote this activity in any way, unless you’re with us in Vegas), these “fantasy cappers” can get hot and cold with their picks from week to week. The amount of variance is again related to the number of predictions that we analyze – the more predictions, the smaller the spread. This is evidence that:

* It may not be a good idea to draw hard conclusions from studies that only examine one ranking per year. This is one reason why we chose to focus on weekly rankings (16 weeks of data) vs. draft rankings (1 list per year). In the future, we’re going to track both draft and weekly rankings and keep year-to-year records.

* Despite the fact that the average edge may get smaller and smaller with the more data that we gather, I’m a strong believer that the best experts will rise up over time. Similar to poker, anyone can drag a monster pot or even win a tournament or two, but the guys that can show positive results over millions of hands are the true experts.

3. On a Win % basis, where we just calculate how often the expert was right, the spread between the best and worst expert was only 14% to 19% depending on position. We don’t use Win % as our final accuracy rating because it doesn’t incorporate the value of each prediction. When you pick a guy that scores 15 points more than the other guy, that prediction should be worth more than when you correctly predict something that only nets 1 fantasy point. Also, when looking at these numbers, please keep in mind that:

* We’re not including every possible prediction. We only score the predictions that involve at least some disagreement between the experts. There’s no reason to score the Chris Johnson vs. Kevin Faulk match-up if every expert is picking Chris Johnson. It’s also not a match-up a typical advice seeker would actually seek. So, when you look at the Win %, think of it as the expert’s ability to make the right call on decisions that are actually contemplated by fantasy players. Including all predictions would naturally improve every expert’s Win %.

4. Running this analysis just got us more curious about other insights that our data can provide. We’ll do our best to share more of this cool information as we dig through it. Here are just a few of the questions we’re hoping to answer:

* For each expert, are there certain players that they have pegged much better than other experts? Are there certain players that they just completely missed on?

* Similarly, are there certain players that the expert tended to overvalue or undervalue relative to the other experts?

* Are certain experts more likely to go against the consensus opinion than others? When they do back the underdog, are they correct more often than not?

* Which experts tend to agree or disagree with each other the most?

* When an expert misses badly on a player, does he tend to over correct the next week, or does he stick to his original opinion?

* The list is endless…we’re just getting started!

Please note that these summary numbers refer to the RB, WR, QB, and TE positions only. DST and K were evaluated and the ratings are published, but given the wide differences in league settings for these two positions, I would take the results with a grain of salt. In fact, we did not include these two positions for our Overall Accuracy Awards.
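
The weighting idea described above (a correct call counts for more when the point gap between the two players is bigger, and only matchups where the experts disagree get scored) can be sketched roughly as follows. This is an illustration of the concept, not FantasyPros' actual formula:

Code:
# Toy scoring of one expert's weekly calls: plain Win % vs. a value-weighted score.
# Each matchup records whether the expert's pick outscored the alternative,
# and the fantasy-point gap between the two players.
matchups = [
    (True, 15.0),   # correct call on a 15-point gap
    (True, 1.0),    # correct, but the gap was only 1 point
    (False, 8.0),   # wrong call on an 8-point gap
]

win_pct = sum(1 for correct, _ in matchups if correct) / len(matchups)
weighted = sum(gap if correct else -gap for correct, gap in matchups) / sum(gap for _, gap in matchups)

print(f"Win %: {win_pct:.0%}")
print(f"Value-weighted score: {weighted:+.2f} (scale runs from -1 to +1)")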

Overall Accuracy Rankings

Based on weighted average of RB, WR, QB, & TE positions

1. Andy Behrens – Yahoo! Sports

2. Pat Fitzmaurice – Pro Football Weekly

3. Brandon Funston – Yahoo! Sports

4. Gregg Rosenthal – Rotoworld

5. David Dodds – FootballGuys

6. Scott Pianowski – Yahoo! Sports

7. Sigmund Bloom – FootballGuys

8. Staff – FantasyFootball.com

9. Brad Evans – Yahoo! Sports

10. R.J. White – FanHouse

11. Erik Kuselias – ESPN

12. Staff – FantasyCafe

13. Scott Engel – RotoExperts

14. Eric Karabell – ESPN

15. Staff – FFToolbox

16. Christopher Harris – ESPN

17. Matthew Berry – ESPN

18. Staff – SI.com

19. Staff – CBS Sports

20. Staff – KFFL

21. Paul Greco – FantasyPros911

Top RB Experts

RB Accuracy Rankings

1. David Dodds – FootballGuys

2. Sigmund Bloom – FootballGuys

3. R.J. White – FanHouse

4. Pat Fitzmaurice – Pro Football Weekly

5. Erik Kuselias – ESPN

6. Andy Behrens – Yahoo! Sports

7. Matthew Berry – ESPN

8. Staff – FFToolbox

9. Brad Evans – Yahoo! Sports

10. Scott Pianowski – Yahoo! Sports

11. Brandon Funston – Yahoo! Sports

12. Staff – FantasyFootball.com

13. Scott Engel – RotoExperts

14. Staff – FantasyCafe

15. Eric Karabell – ESPN

16. Gregg Rosenthal – Rotoworld

17. Christopher Harris – ESPN

18. Staff – CBS Sports

19. Staff – KFFL

20. Paul Greco – FantasyPros911

21. Staff – SI.com

Top WR Experts

WR Accuracy Rankings

1. Brandon Funston – Yahoo! Sports

2. Gregg Rosenthal – Rotoworld

3. Pat Fitzmaurice – Pro Football Weekly

4. Andy Behrens – Yahoo! Sports

5. Scott Pianowski – Yahoo! Sports

6. Christopher Harris – ESPN

7. Sigmund Bloom – FootballGuys

8. Staff – FantasyFootball.com

9. Erik Kuselias – ESPN

10. David Dodds – FootballGuys

11. Scott Engel – RotoExperts

12. Brad Evans – Yahoo! Sports

13. Staff – FantasyCafe

14. Eric Karabell – ESPN

15. Matthew Berry – ESPN

16. R.J. White – FanHouse

17. Staff – SI.com

18. Staff – FFToolbox.com

19. Staff – KFFL

20. Staff – CBS Sports

21. Paul Greco – FantasyPros911

Top QB Experts

QB Accuracy Rankings

1. Brandon Funston – Yahoo! Sports

2. Andy Behrens – Yahoo! Sports

3. Scott Pianowski – Yahoo! Sports

4. Gregg Rosenthal – Rotoworld

5. Staff – FantasyFootball.com

6. Pat Fitzmaurice – Pro Football Weekly

7. Staff – FantasyCafe

8. Staff – FFToolbox

9. Eric Karabell – ESPN

10. David Dodds – FootballGuys

11. Brad Evans – Yahoo! Sports

12. Erik Kuselias – ESPN

13. Christopher Harris – ESPN

14. Matthew Berry – ESPN

15. Sigmund Bloom – FootballGuys

16. R.J. White – FanHouse

17. Staff – WhatIfSports

18. Scott Engel – RotoExperts

19. Paul Greco – FantasyPros911

20. Staff – CBS Sports

21. Staff – KFFL

22. Staff – SI.com

Top TE Experts

TE Accuracy Rankings

1. Gregg Rosenthal – Rotoworld

2. Staff – FFToolbox

3. Eric Karabell – ESPN

4. Scott Engel – RotoExperts

5. Brad Evans – Yahoo! Sports

6. Andy Behrens – Yahoo! Sports

7. Staff – FantasyCafe

8. Scott Pianowski – Yahoo! Sports

9. Staff – FantasyFootball.com

10. R.J. White – FanHouse

11. Pat Fitzmaurice – Pro Football Weekly

12. Staff – SI.com

13. Christopher Harris – ESPN

14. Brandon Funston – Yahoo! Sports

15. David Dodds – FootballGuys

16. Paul Greco – FantasyPros911

17. Matthew Berry – ESPN

18. Sigmund Bloom – FootballGuys

19. Erik Kuselias – ESPN

20. Staff – WhatIfSports

21. Staff – CBS Sports

22. Staff – KFFL

Top DST Experts

DST Accuracy Rankings

1. Brandon Funston – Yahoo! Sports

2. Scott Pianowski – Yahoo! Sports

3. Brad Evans – Yahoo! Sports

4. Matthew Berry – ESPN

5. Andy Behrens – Yahoo!

6. Erik Kuselias – ESPN

7. R.J. White – FanHouse

8. Eric Karabell – ESPN

9. Staff – CBS Sports

10. Paul Greco – FantasyPros911

11. Gregg Rosenthal – Rotoworld

12. Pat Fitzmaurice – Pro Football Weekly

13. Staff – SI.com

14. Staff – WhatIfSports

15. Staff – KFFL

16. Staff – FantasyFootball.com

17. David Dodds – FootballGuys

18. Christopher Harris – ESPN

19. Staff – FantasyCafe

K Accuracy Rankings

1. Pat Fitzmaurice – Pro Football Weekly

2. Scott Pianowski – Yahoo! Sports

3. David Dodds – FootballGuys

4. Staff – FFToolbox

5. Staff – FantasyFootball.com

6. Erik Kuselias – ESPN

7. Brandon Funston – Yahoo.com

8. Staff – FantasyCafe

9. Staff – SI.com

10. R.J. White – FanHouse

11. Brad Evans – Yahoo! Sports

12. Andy Behrens – Yahoo! Sports

13. Gregg Rosenthal – Rotoworld

14. Christopher Harris – ESPN

15. Eric Karabell – ESPN

16. Matthew Berry – ESPN

17. Paul Greco – FantasyPros911

18. Staff – CBS Sports

19. Staff – KFFL

20. Staff – WhatIfSports

 
given the discussion in here, figured some people might be interested in the results of this study:

http://www.fantasypros.com/2010/08/accurac...ults/#more-2900

...
Please tell me Dodds was cheating off Herman here. Kicker guy needs love! Actually, I was curious how Hermann stacked up against these measures and was disappointed not to see him there. -QG

 
I'm not typically a suck up, but for my money David Dodds is the smartest man on this planet.

Fantasy Football aside, this guy just "gets" life. One of the few people I would ever be honored to shake hands with.

I am in no way affiliated with this website, but these guys give it their all and are here as an open book pretty much 365 days a year. Joe keeps a pretty tight ship, and I honestly believe the whole staff would drop what they are doing to help any of their 40,000 members if they could.

11th place? Anything less than #1 is absurd.

 
David,

I was just checking out your Top 300 Draft List that incorporates ADP, and I understand how you do it. How can Gore be ranked below the 2 RBs just ahead of him on that list, while both his ADP and your redraft RB rankings (Dave & Joe) have him ranked ahead of those 2? It surprised me to see it, and it doesn't really make sense to me given your explanation of how you came up with the Top 300 Draft List.

Thanks.

 
anyone else notice what i mentioned above? ... that Gore's ranking in the Top 300 Draft List (dated 8/25) makes no sense when you see the ADP data and also Dodds' personal rankings

 
