  • Herb Garbutt (Message 1 of 50, Jul 22, 2008)

      We have 20 teams done and three that I know are in progress.


      Still looking for someone to lead ratings discussions for the following teams.


      Chicago, Dallas, Edmonton, Minnesota, Nashville, NY Islanders, Vancouver.


      If you’ve already got discussions rolling on these teams, let me know.


      If you’ve already rated these teams, get in touch with me and I’ll connect you with others in the group who can help.


      The sooner we get this done, the sooner Dave can start work on the disk.





    • D. Atkinson (Message 50 of 50, Oct 22, 9:38 PM)
        Great description Herb. And thanks for all the hard work. There would be no disk without you either, as the entire package is definitely too much for one person. I appreciate your efforts. It was fun to hang out with you in Toronto and see that Leafs game....I had been to the old Maple Leaf Gardens, but not the new place. The seats were much more comfortable in the new building :)

        One thing to add below... not only did all the teams come in strong on SOG and SOG allowed, but also on GF and GA. All were less than 5% out. Individual players were also all less than 5% out on SOG per minute played (well, all but 18 of the 750 or so skaters were in, and those 18 were all low-use players that are hard to get in line due to low minutes played). The variance on the numbers was smaller this year after testing than last year, or any other year that I can remember. It was a good set. There may be a few ratings in there that could go either way, and even a few that one may argue are off, but those were the borderline reviews that needed to be where they are to get the disk performance in line. Consider them players that underplayed or overplayed last season. The disk is pretty solid based on the approximately 800,000 test games played.
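
        For anyone curious what that per-player check looks like, here's a minimal sketch in Python. The CSV layout and column names are my assumptions for illustration, not the actual format of the test output.

        # Minimal sketch of the per-player deviation check described above.
        # Assumes a CSV of test results with these hypothetical columns:
        #   name, minutes_test, minutes_real, sog_test, sog_real
        import csv

        TOLERANCE = 0.05  # the 5% target mentioned above

        with open("skater_test_results.csv", newline="") as f:
            for row in csv.DictReader(f):
                minutes_test = float(row["minutes_test"])
                minutes_real = float(row["minutes_real"])
                sog_real = float(row["sog_real"])
                if not (minutes_test and minutes_real and sog_real):
                    continue  # low-use players with no baseline can't be checked
                rate_test = float(row["sog_test"]) / minutes_test
                rate_real = sog_real / minutes_real
                deviation = abs(rate_test - rate_real) / rate_real
                if deviation > TOLERANCE:
                    print(f"{row['name']}: {deviation:.1%} out on SOG/min")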

        There are some issues with the disk that were unfixable, as in past seasons. First and foremost, players with a fairly high assist rate (> 0.015 ast/min) run very low in assist totals in tests, especially if they take a lot of shots. This is a game engine issue, and it has been present forever in the game.
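
        If you want a quick list of who that issue is likely to touch in your league, a filter along these lines would surface them (the field names here are hypothetical stand-ins for however you keep your data):

        # Flag players the assist-rate engine issue is likely to affect:
        # anyone above the 0.015 assists-per-minute threshold noted above.
        def likely_low_assists(skaters):
            # skaters: list of dicts with hypothetical "name", "assists", "minutes" keys
            return [p["name"] for p in skaters
                    if p["minutes"] > 0 and p["assists"] / p["minutes"] > 0.015]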
        There are a few goons that get too many penalty minutes. I have all ratings set to zero on a few players (except for fights), and they still get way, way too many PIM. This is also a known issue. The good thing is, this is mainly with players that have nearly as many PIM as minutes played, so they have little impact (especially while spending substantial time in the box). So, if you end up playing Bissonnette 10-12 minutes per game, he may end up with 500 PIM.

        There are also a few goalies that get too many PIM. I still don't know why this happens with a couple of goalies every season. This is the old Jason Muzzatti issue (for those of you who remember that one). There are 3 or 4 goalies with 2 or 4 PIM in real life that got 15-20 consistently in the test seasons. The penalty rates are all set to zero, so I'm not sure what drives this. The impact should be small overall.

        Other than those things, the disk is pretty solid, and leagues should be pretty satisfied with it.

        If anyone finds any more errors, please report them ASAP so that they can get fixed before leagues start using the files.

        Thanks, everyone, for your patience this year. It was a challenging year time-wise for both Herb and me.


        Herb Garbutt wrote:

        Hey everyone,

        Meant to post something earlier, but I was busy setting up my league and revamping our website. Anyway, I see the ratings questions starting to pop up, so I thought I would explain this year’s process.


        First, I sent out a base set to those who expressed an interest in rating (last year’s ratings, plus my own ratings for rookies).


        Next, once the raters reported back, I went through each team. If the majority of people rating a team had a change for a certain rating, I made the change. At the same time I began doing checks for consistency (basically, this entails calculating the average rating on each team for each skill). This is important because with different people rating different teams, they all have different standards as to what is a 2, 3, 4, etc. The purpose isn’t to hold everyone to the same standard but to identify teams that are much higher or lower in a particular rating. Forecheck seemed to have the biggest variance.
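
        As a rough illustration of that consistency check, here’s a short Python sketch. The data layout is my own assumption for the example; the real ratings files obviously look different.

        # Sketch of the consistency check described above: average each
        # team's rating for one skill and flag teams far from the league mean.
        def team_averages(ratings, skill):
            # ratings: {team: [player dicts with per-skill values]} (hypothetical)
            return {team: sum(p[skill] for p in players) / len(players)
                    for team, players in ratings.items()}

        def flag_outliers(ratings, skill, spread=0.5):
            avgs = team_averages(ratings, skill)
            league_mean = sum(avgs.values()) / len(avgs)
            return {team: round(avg, 2) for team, avg in avgs.items()
                    if abs(avg - league_mean) > spread}

        # flag_outliers(ratings, "forecheck") would surface the teams whose
        # raters ran noticeably high or low on forecheck.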


        Then Dave ran the first set of tests.


        At this point two things happened. First, when the results came back, they identified teams that needed changes, so I went back to the raters’ submissions and made further changes (where a suggestion was not a majority pick but suited the purpose of bringing the team in line). Second, I began doing individual player reviews to check for ratings that deserved changes but may have gotten by the raters. My individual reviews were for the most part on players chosen at random, so as not to show any bias toward certain players or teams.
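
        (For what it’s worth, the unbiased selection part is as simple as it sounds; something like this, with the player list and the sample size standing in as hypothetical examples:)

        # Uniform random sampling keeps the individual reviews free of any
        # bias toward particular players or teams.
        import random

        def pick_for_review(all_players, k=25):
            return random.sample(all_players, k=k)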


        Then Dave tested again.


        By this point, most of the rating changes suggested by the reviewers were exhausted, so I began concentrating my individual reviews on the teams that were still not performing as expected, although between test runs I continued to review random players throughout the league. This process continued through about eight or nine rounds of testing.


        Last year, I did individual reviews on every player. This year, because of the computer problems, I got a late start. (In the end there were only three teams where I reviewed every player: NJ, Mon and Ott. There were also some where I reviewed as few as three players: Det.) I hadn’t updated the scouting report, so as I did an individual review on a player, I updated his report as well. It was a good chance to update the scouting report while checking ratings against the most current information.


        In the end, I did individual player reviews on a little under half of the players (386 of 814). So even now, as I’m setting up my own league, I’m seeing some ratings that I don’t quite agree with (Wade Redden as a 3 comes to mind... wish I’d reviewed him). If a team was performing within the parameters (which Toronto was throughout testing), the only reason a rating would change is if I selected a player for an individual review.


        No process is perfect. As I’ve said in past years, we could probably have pro scouts do the ratings and there would still be ratings we don’t agree with. In terms of testing, Dave said this year’s set produced the best results we’ve ever had. Each year the goal is to bring every team within 5% of its real-life shots for and against. This year we got 29 of 30 teams within 4% (20 of 30 inside 3%). That may not seem like a big deal, but considering some teams start out as far out as 12%, it’s pretty good.
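
        (For anyone wondering, "within X%" here is just the relative error between the test season’s team totals and the real-life totals; the totals in this little sketch are made up purely for illustration:)

        def pct_out(test_total, real_total):
            # relative deviation: 0.04 means 4% out
            return abs(test_total - real_total) / real_total

        print(pct_out(2510, 2450))  # ~0.024, inside the 3% band
        print(pct_out(2700, 2400))  # 0.125, a team needing several rounds of changes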


        I’m not even sure we got everyone within 5% last year. There are always one or two problem teams each year, and I think in the past couple of years we’ve completely exhausted every change we could make on at least one team and basically had to accept them as close as we could get them. In some years we’ve even tried making changes to their division rivals, hoping that would have enough effect to bring them in line. We didn’t get to that point this year.


        So we did well, but are all 8,140 ratings going to be correct? No. If we get 95% right, I’d say we’ve done a remarkable job considering we’re not pro scouts... but even at that rate, we’d have 407 incorrect ratings. We will never get it perfect. All we can do is the best we can.


        I’d just like to say thanks to Dave for everything he’s done for APBA hockey leagues all over North America. Without him, we wouldn’t even be discussing whether Gunnarsson is a 3 or a 4. I finally got the chance to meet Dave this year, and as we were talking we realized that we’d been communicating by e-mail for more than a decade. He’s been producing these disks for all of us for 13 years (or is it 14 now?)... using his vacation time, giving up time with his family and running test season after test season while the sun is shining outside. Without him, this game (IMO the best there is, and even better since Dave started doing the season disks) would have died a long, long time ago.


        Enjoy your seasons and for all of you with Gunnarsson on your team, enjoy your gift.



