Categories
Women's Hockey

The Drawbacks of MyHockeyRankings: A Closer Look at Misuse and Flaws in Youth Hockey Ratings

The following is a post I wrote almost four years ago about MyHockeyRankings on an old blog I used to publish. Most of it is still highly relevant after all this time. I have added a few additional new thoughts at the end of the post. This is the second post rounding out the benefits and drawbacks of MyHockeyRankings – you can read Part I here.

While MyHockeyRankings (MHR) has many benefits, I’ve seen and heard enough parents use the site in a manner for which it was not intended. Instead of being used for good, it can be used for evil to the detriment of player development and their game.

Let’s get the most obvious one out of the way first…

1. Focusing on your team ranking

Coaching games to maintain or improve your team's rank/rating goes against the original intent of the site. Putting a focus on goal differential over playing games to your team's fullest capability is essentially poor sportsmanship. One example is not pulling your goalie late in a close game to minimize the risk of lowering your rating even more. Another is to keep or only play your best players late in the 3rd period even when the outcome is clear. Going into a game knowing the Expected Goal Differential (EGD) and playing to match or exceed that difference should not be on the mind of any coach before or during a game.

2. Using MHR rating or ranking as the measurement of team success

As in business, a team cannot look at just a single metric to determine how well it is performing. You usually need two or three attributes to get the full picture of how an organization is doing. Things like player development, win/loss record during league or tournament play, and learning to compete are much more important than any single rating metric.

3. Playing for a highly ranked vs middle-of-the-pack team

Coaching certainly plays a role in the development and success of a team. However, the size of the pool of players in an area and the multi-year commitment to player development of a club or region is really the biggest factor in how good a team is. This is why regions like Toronto, Boston and Minnesota have so many strong teams. They have both robust club programs to develop players from Mites to Midget and a deep group of players in their programs to choose from. Thus, as a parent, it really shouldn't matter if your child's team is highly ranked; what matters is that they continue to develop on a path that helps them be the best hockey player they can be.

There are also several weaknesses in an algorithm that uses only goal differential for team ratings. Here are a few of them:

4. Lack of uniformity in game format and duration

Not all games are created equal. While USA Hockey tries to standardize games across divisions, the reality is that a large portion of the games included in the rankings do not follow those guidelines. These can include games from tournaments, exhibitions and the pre-season. The attributes that vary across games include game length and how regulation ties are handled (e.g. overtime vs. shootout vs. no extra time). Last year, we were at a tournament with 90-second penalties where the total game was only 75% of a regulation game. This season, our pre-season games were two twenty-minute running-time games. There is no way to normalize scores based on the running time of a game.

5. Games scheduled between teams with an Expected Goal Differential (EGD) greater than 7

Per the original MHR manifesto, only scheduling games between teams that will be competitive makes perfect sense. However, some regions have divisions with a large discrepancy between the top and bottom teams. Since MHR's maximum goal differential per game is 7, I have seen several cases where the lower-rated team's rating went up even though it lost by 10 goals, because the teams' rating difference was 8 or 9 goals heading into the game. I would recommend changing the algorithm to exclude games between teams with a rating difference of 7 or more goals.
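The quirk described above follows directly from capping the per-game goal differential. Here is a hypothetical sketch of a goal-differential rating, based only on the behavior described in this post; MHR does not publish its exact formula, so the `game_rating`/`team_rating` functions and the averaging scheme are assumptions:

```python
GD_CAP = 7  # per-game goal differential is capped at +/-7

def game_rating(opponent_rating: float, goals_for: int, goals_against: int) -> float:
    """A single game's implied rating: opponent's rating plus the capped goal diff."""
    gd = max(-GD_CAP, min(GD_CAP, goals_for - goals_against))
    return opponent_rating + gd

def team_rating(games: list[tuple[float, int, int]]) -> float:
    """Season rating as the average of per-game implied ratings."""
    return sum(game_rating(opp, gf, ga) for opp, gf, ga in games) / len(games)

# The anomaly: a team rated 8 below its opponent loses by 10, yet its implied
# rating for that game (opponent minus 7, after the cap) is still one point
# ABOVE its pre-game rating of 80.
print(game_rating(88.0, 0, 10))  # 81.0 -> the loser's rating goes up
```

Under this model, any game with a pre-game rating gap larger than the cap can only move both teams toward each other, regardless of the final score, which is exactly why excluding such games (or not scheduling them) makes sense.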

In my experience, the MHR rating should be taken with a grain of salt; statistically, there is probably a standard deviation of somewhere between 0.50 and 0.75 rating points. Once again, though, if you are using the site for its intended purpose, it shouldn't matter what the actual rating is for your team. That natural variance makes obsessing over the exact number even more pointless. Here are some additional factors that contribute to the standard deviation:

6. Tired teams

Most tier teams regularly play 4 to 6 games in a weekend. While fatigue is something all teams need to deal with, when the key metric for MHR is goal differential, it is very likely that the final score between two identical teams will not be the same at the end of a 6-game weekend as it would have been on the first day. I have been surprised on many occasions when I expected to see a blow-out between two teams, but it was clear that the higher-rated team couldn't maintain the same level of play for 3 full periods in their final game.

7. Backup goalie dynamic

Ratings are a weighted average of both goalies. But on many teams there can be a big gap between the top goalie and the second goalie; on others, there may be only one goalie. One season, one of my kid's teams had a difference of about 1.5 rating points in goal differential between its two goalies. In this situation, wins vs. losses is a much better indicator of the team's success than goal differential.

8. Asymmetric Actual Goal Differential

In my experience, the actual goal differential appears asymmetric to the EGD when the EGD is about 4 or more. Usually this happens when teams from different divisions play each other (i.e. when the higher-ranked team has played most of its games against higher-ranked teams and the lower-ranked team traditionally plays lower-ranked teams). The ratings don't reflect an apples-to-apples set of opponents, and when the two teams do play, the higher-rated team can significantly exceed the EGD.

I am sure there are several other factors I have missed that contribute to the rating not being as precise as possible.

So how should you look at the ratings?

As mentioned above, take it with a grain of salt and don't focus on the specific number; instead, look at the peer group you are placed in to see how your team is doing relative to others. In addition, don't be concerned about any numeric rating or ranking. Focus instead on player and team development, because at the end of the day that is what youth hockey is all about.

2023 Update #1 – Last year both my kids played several games against Canadian teams, and it seemed that the cross-border ratings weren't as accurate as those within the U.S. Specifically, the Canadian teams were consistently better than their ratings. I am not 100% sure why, but I assume the algorithm had insufficient cross-border game data to normalize (calibrate) the true ratings across the two geographies.

2023 Update #2 – There are many reasons I've seen why some games should not be included in the MHR calculation. I have even heard of both teams agreeing not to post scores from 'exhibition' games to MHR. The most obvious case is where a team needs to play a USA Hockey district qualifying game against a team that is more than 7 goals below it in MHR. Clearly, the only reason to play the game is to check a box. However, USA Hockey does use MHR as a way to select at-large invitations to Nationals (playoffs). While probably having negligible impact, these types of games should probably not be included in the MHR calculation.

Categories
College Hockey Recruiting Girl's Showcase Player Development Women's College Hockey Women's Hockey

How to Navigate a Path to Playing Women’s College Hockey

This summer, a podcast listener emailed me a simple question: if I were to do it all over again, what path would I recommend a young girl follow if she wanted to play college hockey? Obviously, there is no simple answer or single path to playing high-level female hockey. But I thought I would articulate three simple principles I'd recommend and include references to more detailed topics I have covered in the past.

Note: This post focuses primarily on the DI college recruiting process. If a player's goal is to play other levels of college/university hockey like DIII, CIS or ACHA (club) hockey, you can probably slightly dial down the timing and frequency of some of the recommendations below.

1. Just Get Good

This is by far the most important principle on this list. At whatever age a player shows a passion for hockey, this is the area to focus on most. I have written several posts on what it takes to become a really good hockey player, and this should be the highest priority. In my opinion, this probably should not change until a player stops playing competitive hockey. With over 2,000 girls in each birth year playing a high level of hockey in the U.S. and Canada, but only ~250 spots opening on DI rosters every year, the math gets quite easy. A player needs to be in the top 10-15% in order to get an offer from one of those 44 teams.
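The back-of-the-envelope math above works out as follows (using the rough figures cited in this post, not official roster data):

```python
# ~2,000 high-level players per birth year; ~250 DI roster spots opening
# across 44 teams each year (approximate figures from the text above).
players_per_birth_year = 2000
di_spots_per_year = 250

share = di_spots_per_year / players_per_birth_year
print(f"{share:.1%}")  # 12.5% -> roughly the "top 10-15%" cited above
```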

2. Make Sure You Are Seen

Assuming you are a "good" hockey player, I would recommend that starting at about 14 or 15 years old you play for a team that attends the major girls hockey events that DI college coaches scout. By playing on such a team, there is the obvious benefit of playing with other good players, receiving good coaching and being pushed by your peers. But more importantly, in my experience, having college coaches watch you play against top teams and players helps them calibrate you against your peers.

Not everyone agrees with this. Many coaches will say that if you are good enough, schools will find you. This is great in theory, but it is not always true. I know of several really good female hockey players who either played boys hockey, lived in non-traditional markets or played on weak AAA teams, and who were not regularly seen. The reality is, if you don't play at high-profile tournaments (e.g. USA or Canadian national playoffs and other top in-season tournaments) or are not selected to attend the U18 national camps, you won't get noticed as easily. So if you aren't one of the top 30 players in the country, put yourself in the best position to be seen as much as possible.

There is also definitely a bias toward regional players at almost all schools, and it is self-reinforcing. This is why you see so many Minnesota players play for Minnesota colleges, and why so many prep players play on the east coast. While there are exceptions, being able to watch local players, having existing relationships with their coaches, players wanting to stay close to home, etc. are all factors in the recruiting process. Each of these things makes it "easier" for college coaches to find talent that is probably just as good as the harder-to-find alternatives – and why coaches tend to fish where they've fished in the past. So if you aren't on a team that is regularly seen by DI schools, the mountain is a little steeper to climb, but not impossible.

That is why I recommend that players who aren't a slam dunk to play at a Top 10 school make sure they get seen in the year or two prior to their junior year of high school.

3. Strategically Pick 3-5 Spring/Summer Hockey Events to Attend

Ideally, the older you get, the better you should know how good a player you are relative to your peers. This should then factor into which events to pick after the winter season ends. With a little research you can figure out which ones might fit your level of play. Almost all the showcase organizers are very responsive to questions and can give you a feel for whether your daughter would be a good fit for a specific event.

I would recommend only attending a handful of off-season events (e.g. one per month from April through August), such as:

  • USA Hockey or Hockey Canada national camps  (if you are good/lucky enough to be selected)
  • Showcases (Premier Ice Prospects, RUSH, NGHL etc.)
  • College camps (Colgate, and any other school-specific camp that you might be interested in)
  • Popular tournaments (e.g. Beantown Classic, Showcase Hockey, Rose Series etc.)

Check out our full year list of girls hockey events.

I think it is hard to justify going to more than 5 events unless they are almost all local (e.g. in the Boston area). The "spray and pray" strategy usually ends up wasting a lot of money. We have talked ad nauseam on the podcast about how you don't need to go to every event. It is both expensive and unnecessary. But having a plan based on a player's interest and level of play can deliver a reasonable return on your time and financial investment.

If you are 12 and under, in my opinion, you should be picking events for fun (e.g. a hockey trip to Europe) and maybe a little development. But not for recruiting purposes. You will have plenty of time when you are older to attend events that really matter to college coaches.

Summary

I have intentionally tried to simplify my recommendations on how to navigate the world of girl’s hockey and women’s college recruiting.  Player development is most critical. After that, just make sure they are playing at a high level while getting enough visibility.  If you follow these principles, everything else should take care of itself.

Categories
2023 Development Camp Girls Hockey Player Development Women's Hockey

The USA Hockey 2023 Girls 16/17 Camp Feedback Process – Part II

My Recommendations

Read Part I Here

Feedback is a gift.
Giving feedback is hard.

Having delivered performance feedback to dozens (if not hundreds) of people I've managed in business, I recognize it is one of the most challenging interactions to conduct. At the same time, I was taught to take it seriously and learned many of the best practices that ensure a positive outcome for both parties.

It is pretty clear from the parent meeting at the 16/17 Girls camp (and the letter that accompanied the feedback/rating sheet) that USA Hockey wants to leave no doubt that it is providing several different levels of feedback for each player at the national camps. The details of this feedback were explained in Part I on this topic.

And it is important to recognize that they really do care about giving feedback, because they have dedicated time and resources to the process. I also want to acknowledge that it takes a non-trivial amount of effort to provide detailed feedback to about 400 players across 4 major camps each summer.

At the same time, I’ve spent a ton of time thinking about this topic trying to figure out why almost everyone I have spoken with is disappointed with the USA Hockey Girls National Camp selection and feedback process. And here is what I came up with…

At the end of the day, the current process does not solve the unmet need of the players – which is to have actionable direction on their highest priority development areas. This is because the robustness of the feedback is not commensurate with the level of commitment and investment the players put into making, preparing and attending the camp.

And my reason for this is the following:

The feedback is too generic. For almost all the players, it’s just too simplistic/superficial without personalized examples and not actionable enough.

Here are my recommendations:

  1. Standardize a More Robust Process – The coaches should go through a training session on how the process works and what the expectations are for the process, content and delivery. All players should receive player-specific information using a common format, with player-specific examples in the review. While the coaches should have flexibility to adapt the process to their style, each performance review (in addition to the attribute ratings mentioned in Part I) would require the feedback to include each of the following:
  2. Include Player-Specific Key Statistics (e.g. pass completion rates or turnover rates). Nothing is more powerful than data. Being able to show a player how they compared on key attributes compared to their peers makes things much clearer. This became quite evident to me in my analysis of the 16/17 Camp forwards and defenders.
  3. Support with Player-Specific Video Clips  –  showing a player exactly what they do well and how/when they make mistakes provides “hard-to-argue” credibility to the stats and the coach’s feedback. This would likely use a video analytics system like Instat/Hudl so each player’s shifts could be coded.
  4. Prioritize Key Areas to Focus On – Darryl Belfry consistently talks about High Frequency – Low Success Rate Situations. Video and statistical analysis will surface these situations. A coach should then use them to focus on a limited number of these game patterns, prioritizing 3-5 situations/skills for a player to work on.

These four recommendations would require a significantly greater amount of time and resources than the current effort being done at the USA Hockey girls camps. There may not be time to aggregate everything during that week.  But the feedback session does not need to occur at the camp. It can be done a week or two after the camp via a video-call.  What matters most is that the players are getting their needs met as to where to focus and improve as a player.  Ideally, there would be someone in leadership who was solely responsible for player development and not directly associated with the selections for the U18 camp or team. I know it can be done, because I have seen first-hand more robust feedback processes on the boys side at both the USA Hockey and junior hockey levels.

Final Thoughts

The best organizations focus relentlessly on their customers. One of the biggest ways these organizations ensure they are meeting the needs of their customers is to ask for feedback – specifically overall satisfaction, with a question like "Would you recommend [product/service] to a friend or colleague?" followed by "Why?". In my few years interacting with USA Hockey, both as a coach and a parent, I have never been asked for my feedback on the programs I've been engaged with. In essence, USA Hockey has a monopoly on the national team programs, so it is understandable that it may not need to be as customer-centric as an Amazon or an Apple. But if the leadership of the USA Hockey female national camps wants to continuously improve the program, just like their players do, it would be great if they solicited their own feedback on areas where they can improve as an organization. Who knows... maybe getting the gift of feedback on themselves may translate to improved performance on the ice?

Feel free to send feedback on our posts or Champs App to feedback@champs.app

Categories
2023 Coaching Girls Hockey Player Development Women's Hockey

The USA Hockey 2023 Girls 16/17 Camp Feedback Process – Part I

I have a lot of passion about feedback when it comes to hockey player development, because I think it is probably the most important factor to improve player performance.  Darryl Belfry, who is regarded as one of the best player development coaches in the world, uses actual game analysis as the primary way to provide feedback on improvement areas for players.

As the governing body of hockey in the U.S., USA Hockey understands the importance of player feedback. At the USA Hockey 16/17 Girls Camp which took place in Oxford, OH this past June, feedback was highlighted in the parent meeting as a key component of the camp.  In Part I of this post about the USA Hockey Girls Camp feedback, I wanted to focus on understanding the three levels of feedback  utilized during and after the camp.  Part II of this topic will discuss my thoughts on how effective the feedback process has been.

1. On-Ice Feedback  – During Practice and Games

Just like with their regular teams, coaches were quite consistent in talking to players individually and in groups during practices to share specific, tactical ways to improve in a drill or situation. The same went for a player coming to the bench during one of the games after a shift: coaches would lean over and give advice on what adjustments could improve the player's effectiveness. These situations are quite comfortable for all the coaches at an event like this, since most were DI coaches or former DI players. As I mentioned in my previous post about player feedback, in-game comments are the easiest for a coach to communicate.

2. One-on-One Feedback with one of the Team Coaches

All teams had two head coaches. On about the fourth day of the week-long camp, each player had a 10-15 minute conversation with one of their coaches. It is my understanding that most players were asked to do a self-review in anticipation of the meeting. From talking to several parents, the coach-player conversation was then highly dependent on the coach. Some coaches were well-prepared and had video clips to show players as a way to communicate their feedback; some had simple, basic priorities for players to focus on; others relied on the player's self-evaluation as the primary source of the feedback conversation. Given the variance in methods, I suspect the feedback meeting process was not highly structured by the camp organizers.

3. Letter Grade and Player Development Performance Criteria

About four weeks after the end of the 16/17 Girls Camp, my daughter received by snail mail a form letter containing an evaluation that is supposed to serve as a benchmark for a player's performance at the camp. It consists of a letter grade and a rubric of "Player Development Performance Criteria". Here are the details.

At the top of the player evaluation sheet, each player was given a rating of A, B or C with the following explanation:

“A” grade = Excellent – ranks in the top 1/3 of players at camp

“B” grade = Good – ranks in the middle 1/3 of players at camp

“C” grade = Below average – ranks in the bottom 1/3 of players at camp.

The Player Development Performance Criteria had 5 possible selections (from best to worst):

  • Excellent
  • Very Good
  • Good
  • Fair
  • Poor

Each skater then had attributes rated within two categories, general and position-specific, with an "X" marked in one of those five boxes for each attribute. Here are those attributes:

General:

  • Makes Possession Plays (i.e. keep team on offense; limited turnovers)
  • Angling: pressure to take away time/space; dictate play with body/stick
  • Stick Positioning
  • Deception
  • Quick Transitions
  • Off-Puck Habits & Puck Support
  • Scoring Ability
  • Physicality
  • Athleticism
  • 200-Ft Player
  • Skating Ability (north/south; agility; speed)

Defenders:

  • DZone Execution First
  • Puck Retrievals
  • Good First Pass or Exit
  • Win Race Back to D-Side of Play/Net
  • Win Board Battles
  • Deter Offensive Opportunities
  • Scan to Make Exit Play; Fast Transition to Breakout
  • Work Well with D-Partner
  • Gap Control: (North/South & East/West)

Forwards:

  • Puck Retrievals & Ability to Stay Off the Wall
  • Ability to Leave Perimeter and Gain Inside Ice
  • Owning Space with Puck
  • Scanning/Awareness of Teammates & Opponents
  • Use Teammates to Make Plays
  • Zone Entry: Ability to create depth/layers/lanes
  • Create & Maintain Offense

I don't know the process used to aggregate the evaluators' feedback, but I assume they collected a populated rubric from all the evaluators for a position and then averaged the selections. (I hope they used some online tool to aggregate all of this, because there are lots of ways to simplify collecting this information.) I suppose this compiled data was used as the rating for each player's Development Performance Criteria. I would then assume the average across all Development Performance Criteria was calculated, and each player was force-ranked into one of the three tiers to receive the letter grade of A, B or C based on which third they ranked in.
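The aggregation I am guessing at above can be sketched in a few lines. To be clear, the numeric scale, the averaging and the force-ranking into thirds are all my assumptions; USA Hockey has not published its method:

```python
# Map the five rubric boxes to numbers, average each player's marks,
# then force-rank players into thirds for the A/B/C letter grade.
SCALE = {"Excellent": 5, "Very Good": 4, "Good": 3, "Fair": 2, "Poor": 1}

def player_score(marks: list[str]) -> float:
    """Average numeric value of all rubric selections for one player."""
    return sum(SCALE[m] for m in marks) / len(marks)

def letter_grades(scores: dict[str, float]) -> dict[str, str]:
    """Force-rank players into thirds: top third A, middle B, bottom C."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    return {name: ("A" if i < n / 3 else "B" if i < 2 * n / 3 else "C")
            for i, name in enumerate(ranked)}

# Hypothetical players and marks, purely for illustration.
scores = {"P1": player_score(["Excellent", "Very Good", "Good"]),
          "P2": player_score(["Good", "Good", "Fair"]),
          "P3": player_score(["Fair", "Poor", "Fair"])}
print(letter_grades(scores))  # {'P1': 'A', 'P2': 'B', 'P3': 'C'}
```

Note that force-ranking guarantees a third of players receive a "C" no matter how strong the camp is overall, which is worth keeping in mind when interpreting the letter grade.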

Other than the letter grade and the rubric box selections, no other personalized information was included in the feedback: no short paragraph summary from the coach or evaluators (like you would see in a student report card) to provide additional context.

It is important to note that the ratings are based on the criteria described above. If different criteria were used (which will be discussed in the next post), a player's rating might differ depending on whether those criteria were closer to or further from that player's capabilities.

In Part II on this topic I will share my perspective on the good, the bad and the ugly of this feedback process.

Read Part II Here

Categories
2023 Development Camp Hockey Tryouts Junior Hockey

What I Learned Attending My First Junior Hockey Main Camp

Last month, my 15-year-old son was invited to the main camp of an NAHL team in Minnesota. This was the follow-on from a Summer Tryout showcase in June, hosted by several NAHL teams, at which my son earned the invitation to the July main camp. Here are some details I learned from the camp:

  • The camp started with 8 teams of up to 22 players – each with 12 or 13 forwards, 6 or 7 D and 2 goalies
  • All the players at the camp were 2003-2007 birth years.  My son is a late 2007, so obviously, he was one of the youngest players at the camp.
  • Each team played 3 games consisting of two 25 minute periods with a running clock.
  • To keep things flowing, icings and most offsides were almost never called. And any puck which touched the netting and returned to the ice did not stop the play.  When the very odd penalty was called, a penalty shot was granted.
  • With 13 forwards, unless you started the game, a forward typically only got 6-8 shifts per game. This was because everything eats into the running time: goals, faceoffs, penalty shots etc. Most players were taking ~75-second shifts, so when you do the math with four lines, a forward only received 3-4 shifts per period. Not a lot of time to show what you can do.
  • After 3 games, the first player cuts took place, with the list of players making it to the next day posted on Instagram and Twitter. The announcement showed only the team colors and numbers of players who made the cut – no names were listed.
  • Out of the ~160 players who started the camp, 99 players made the first cut. My son was on the list, so he would play 2 more games the next day.
  • The next day the players were then assigned to one of 6 teams. Once again each team had up to 22 players (13F, 7D, 2 G).
  • What was confusing to me was that if only 99 players made the cut and the teams were reduced to 6, how could there be so many players on each of the 6 teams? That's when things got real. What I learned was that the first part of camp did not include all the players on the team's protected list. So about 30 players – made up of players from the past season, draft picks etc. – were then added to the rosters of the 6 teams.
  • While there were a few players with half-shields (usually you have to be over 18 to play with a visor) playing the first couple of days of the camp, that number more than doubled for the second round.  In addition, I was impressed by how many of the players also had moustaches to enhance their hockey player look.
  • The age, skill and size of the average player went up dramatically from the first round of games. For one of the games, my son’s linemate was a 2003 player committed to play DI hockey.   The difference in size and skill was obvious. Man vs. boy.
  • While my son played well in both games, including a solid assist to set up the DI player’s goal, he clearly did not have the size or speed of the top players on the ice.  As a result, he made a few mistakes turning over the puck along the boards or missing passes when under pressure.
  • Following the two second-round games, we once again looked online at the end of the day and reviewed who made the next set of cuts. 
  • 81 players (out of ~130) made it to the All-Star games, with a separate 20 players moved to something called the Young Guns Game at the end of the next day. My son was not selected for either game, so his tryout was done.
  • The next day the All Star games took place.  After they were done, 27 All Star cuts were assigned to the Young Guns Game.
  • We didn’t stick around for the last two days, so I can’t provide any details about the games (LiveBarn feed was blacked out after the first cut).
  • However, I do know that 44 players made the cut for the final All Star Game which included 6 goalies (couldn’t tell how many F vs. D).  Not sure if the team was finalized after the last game or if about 30 of the players then were invited to training camp for the final team cut.
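The shift math in the bullets above can be sanity-checked with a quick sketch. The 70% effective-clock figure below is my assumption to account for the stoppage-free time lost to goals, faceoffs and penalty shots, not a measured number:

```python
# Two 25-minute running-time periods, four forward lines, ~75-second shifts.
period_min, periods, lines, shift_sec = 25, 2, 4, 75

def shifts_per_forward(actual_play_fraction: float) -> float:
    """Shifts one forward gets, given the fraction of the clock spent playing."""
    playing_sec = period_min * 60 * periods * actual_play_fraction
    return playing_sec / lines / shift_sec

print(round(shifts_per_forward(1.0)))  # 10 shifts if the clock never stopped
print(round(shifts_per_forward(0.7)))  # 7 -> right in the 6-8 range observed
```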

 Some additional thoughts:

  • One real positive aspect of the camp was that the coaches made it clear that all players who were cut could reach out for feedback when the camp was over. My son had a phone call with one of the coaches and received helpful feedback (which was much more specific than what my daughter received from the USA Hockey camps).
  • This was a great learning experience for my son to see the level of play of the NAHL. The NAHL is known to be an older Tier II junior league with the average player age of about 19.5 years old. So seeing where my son needs to be in the next 2-3 years was an eye-opening moment for him and seems to be quite motivating to him.
  • This spring/summer my son received dozens of invitations to a variety of junior camps at every level (USHL, NAHL, Tier 3 etc.). I have heard that unless you were drafted by the team or know that a team has specific interest in you, most of these invites are just a way for teams to make money, and I felt I could see this at the camp. There probably wasn't a need to have so many players in the first round with so many players on each team. Just doing the math on the ~60 players who did not make the first cut @ $375/player, that is over $20K in the team's pockets, on top of the money they made at the 3 summer tryouts.
  • Each player needs to decide what is right for them, but it is very easy to waste a lot of money (and time) attending multiple events. You need to be realistic about your odds of making a team and self-aware about how close you are to the level of play needed.  We went to see what the level was and learn from the experience – and now we know. And for the next couple of years I don’t expect my son to be trying out for many teams until we think he is ready and there is a reasonable chance he could at least make it to the final round of cuts (if not make the team).
Categories
2023 Development Camp Girls Hockey Player Development Women's College Hockey Women's Hockey

Analyzing the USA Hockey Girls 16/17 Camp Forward Selections for the U18 Camp


This is the second analysis I have done of the selections for the USA Hockey Girls U18 Camp, which took place last week. The first was about the defenders picked to go to the U18 Camp. Now that the selections of the forwards from the U18 Camp to the Women's Festival were announced a few days ago, the analysis is even more interesting, because none of the top 3 point-getters from either the Girls 16/17 Camp or the U18 Camp were selected to advance to the next stage of the process.

WHAT?

Similar to the previous post, rather than engage in a subjective discussion on who was selected, I thought it might be helpful to collect some analytical data and metrics to understand how top players performed at the 16/17 camp and compare them to a couple of the players who weren’t selected.

WHY?

When you don’t select the top 3 point-getters from either the Girls 16/17 Camp or the U18 Camp, there are bound to be a lot of folks who wonder what the selection criteria are for making it to the next stage of USA Hockey. I don’t know the answer to that question. But I can analyze the video of each shift for several of the top players picked and not picked to see if there is an obvious difference between the two groups. The purpose of this post is not to say who did or did not deserve to be selected to the U18 Camp. Instead, it is to provide other players and parents with perspective and context on the types of metrics that demonstrate the level of play needed to be selected. And ideally, individual players do their own self-analysis to see how they compare.

HOW?

I watched and coded specific attributes for every shift in all 4 games for every player in this analysis using the USA Hockey TV footage. I collected more metrics than are listed below, but I feel the attributes shown provide the right amount and level of data to gain an understanding of the level of play for this position. Note: The live stream footage didn’t always focus on the area of the ice where the play was taking place, so it is very likely the odd play was not accounted for.

WHO?

Here is the list of the 13 players selected to go to the 18’s camp

Since I only had the time to watch 5 players, I watched 3 selected forwards plus 2 top players who weren’t selected. Those 3 forwards represented a mix of the forward selections. I am not identifying the names of any players because singling out any individual player is not my objective. For full transparency, I do know the parents of one of the players in this analysis.

SO WHAT?

Do I think the 5 selected were in the top 10 forwards at the camp? Almost certainly. Do I think there are 3-5 other players who could easily have been selected instead? Also, almost certainly. There is no algorithm to calculate and rank the top players. I don’t know the selection criteria, so whatever they may be (well-structured or not), at the end of the day what matters is results. As stated in the parents meeting, the results of the last two U18 World Championships were not what USA Hockey wanted – so we will see if the current process yields better results.

THE ANALYSIS

2023 USA Hockey Girls 16-17 Camp Analytics for Forwards Selected to Advance to the U18 Girls Camp

Note: Players 1-3 were selected to go to the U18 Girls Camp – Players 4 & 5 were not selected

Some notes on the tracked attributes:

  • Takeaways = a one-on-one situation where the player gains control of the puck from directly challenging the other player
  • Giveaways = full change of possession to the other team (e.g. a missed pass, dump in/out, rim or redirected puck)
  • OZone entries = skating across the blue line with full possession of the puck
  • Team Shots For/Against do not include shot attempts that did not reach the net. Only SOGs were included.
  • I am not including the point stats or PIMs for any player since they can already be found on the USA Hockey website
  • There were additional attributes I tracked, like faceoffs won, but they indirectly show up in other higher-order key metrics. Since not all the forwards played center, I didn’t include the faceoff attribute. But I did want to note that one player was very good at faceoffs while another was not. The one who won most of their faceoffs did see that reflected in other measurement areas, since many faceoff wins led to greater possession time.
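For anyone curious how this kind of shift-by-shift coding turns into per-player totals, here is a minimal sketch in Python. The event codes, player labels and shift logs below are all hypothetical – this is just one way to tally coded events, not the actual spreadsheet I used.

```python
from collections import Counter

# Hypothetical per-shift event codes, modeled on the attributes above:
# "TA" = takeaway, "GA" = giveaway, "OZE" = zone entry with full possession
shift_logs = {
    "Player 1": [["TA", "OZE"], ["GA"], ["OZE", "OZE"], []],
    "Player 2": [["GA", "GA"], ["TA"], [], ["OZE"]],
}

def summarize(shifts):
    """Tally coded events across all of one player's shifts."""
    totals = Counter(event for shift in shifts for event in shift)
    totals["Shifts"] = len(shifts)
    return dict(totals)

for player, shifts in shift_logs.items():
    print(player, summarize(shifts))
```

Coding each shift as its own list (rather than one running tally) is what makes it possible to go back later and segment events by situation.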

OTHER THOUGHTS

  • From all the players and games I’ve watched, it seems (and it’s only natural) that really good plays are rewarded far more than equivalent poor plays are punished (e.g. creating a “wow” scoring chance vs. giving up a “wow” scoring chance to the other team). Forwards tend not to surrender many scoring chances unless they are somewhat negligent defensively. So, it seems likely that creating offense is disproportionately weighted in player evaluation.
  • Not all players gave the same defensive effort throughout a game, whether due to fatigue or laziness. But over the course of four games, it was pretty clear who consistently tried to play a 200-foot game (vs. cheating a little defensively or taking some shortcuts).
  • Scouting and evaluating is not an exact science.  In my humble opinion, most of the scouts/coaches don’t watch any player enough to really get the full picture.  It is sampling data – and while it is directionally correct, when there are many players within a close band it is hard to discern who is absolutely the “best” player. And who you pick may vary when you are building a team for a short tournament and need different types of players. 
  • After watching over 20 hours of individual game footage, I can say this process is somewhat exhausting. It takes a lot of work to watch and tag each type of play. I can’t imagine being a scout trying to watch 10 skaters live on the ice throughout an entire game. At the same time, the insights are quite valuable. I hope that college scouts leverage Instat (if a club/prep team uses it) to watch players’ individual shifts and evaluate the full body of their work rather than just sampling one or two periods of a game during a tournament or showcase weekend. To me, it is much easier to focus on one player at a time than to watch multiple players in a game.
  • Note: We are still waiting on the written feedback and letter rating that we were told all players would receive. If you are a player or parent from the 16/17 Camp who has received this feedback, please reach out and let me know. Update: We did receive the USA Hockey feedback on July 27th – I will be writing up my thoughts on the feedback process in an upcoming post.

Analyzing the USA Hockey Girls 16/17 Camp Defense Selections for the U18 Camp

As I mentioned in my previous post about the USA Hockey Girls 16/17 Camp, there was a mix of perspectives on the selections for the 18’s camp.

WHAT?

Rather than engage in a subjective discussion on who was selected, I thought it might be helpful to collect some analytical data and metrics to understand how top players performed at the camp.

WHY?

The purpose of this post is not to say who did or did not deserve to be selected to the U18 Camp. Instead, it is to provide other players and parents with perspective and context on the types of metrics that demonstrate the level of play needed to be selected. And ideally, individual players do their own self-analysis to see how they compare.

HOW?

I watched and coded specific attributes for every shift in all 4 games for every player in this analysis using the USA Hockey TV footage. I collected more metrics than are listed below, but I feel the attributes shown provide the right amount and level of data to gain an understanding of the level of play for this position. Note: The live stream footage didn’t always focus on the area of the ice where the play was taking place, so it is very likely the odd play was not accounted for.

WHO?

Here is the list of the 13 players selected to go to the 18’s camp

For the D analysis, I included the 3 players selected plus another ‘top D’ player who was not selected. I am not identifying the names of any players because singling out any individual player is not my objective. However, I can say that I personally do not know any of the players (or their parents) included in this analysis.

SO WHAT?

Based on my analysis, I don’t have any issues with the D selections, since measuring defense is not an exact science. I am sure there were other players for whom an argument could be made that they should have been selected instead – but the differences are hard to discern in just 4 games, and I don’t expect the selection committee to be perfect when picking players based only on their game performance.

All selected players made several really good plays (both offensively and defensively) in their four games – many of which were ‘highlight worthy’. At the same time, these same players made multiple significant turnovers/mistakes which resulted in high-quality scoring chances for the other team. This goes to show that none of the D were anywhere close to perfect. But their overall consistency over 4 games is what you can see in the metrics.

THE ANALYSIS

Note: Players 1-3 were selected to go to the U18 Girls Camp – Player 4 was not selected

Some notes on the tracked attributes:

  • Offensive Shot Attempts do not mean the shot made it to the net – as mentioned in my previous post, I estimate almost 80% of all point shots were blocked or missed the net.
  • Turnover = full change of possession to the other team (e.g. a missed pass, dump in/out, rim or redirected puck)
  • I am not including the point stats or PIMs for any player since they can already be found on the USA Hockey website
  • Note: With only 4.1 goals per game combined between both teams, all the top players played strong defense and were not on the ice for many goals. This can be seen in their “on-ice goals for/goals against ratio” (which is different from the traditional +/- stat).
  • There were additional attributes I tracked, like offensive zone entries and good defensive plays. In addition, metrics like pass attempts or turnovers could be segmented further by situation – however, given the outcome-based measurements presented here, I feel they are a good representation of how each player played.
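As a side note, the “on-ice goals for/goals against ratio” mentioned above can be sketched as follows. The goal counts below are made up for illustration; the contrast with traditional +/- (a difference, which also excludes power-play goals) is the point.

```python
def on_ice_goal_ratio(goals_for, goals_against):
    """Goals scored vs. allowed while the player was on the ice.

    Unlike traditional +/- (a difference, GF - GA, which also excludes
    power-play goals), this is a simple ratio across all of the
    player's shifts.
    """
    if goals_against == 0:
        # Undefined ratio: treat a clean sheet as "infinitely good"
        return float("inf") if goals_for else 0.0
    return goals_for / goals_against

# Hypothetical numbers for illustration only:
print(on_ice_goal_ratio(6, 2))  # 3.0 (traditional +/- would show +4)
```

A ratio holds up better than a raw difference when comparing players with different amounts of ice time, which is why I prefer it over a plain +/-.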

Finally, yes, I did a similar analysis for my daughter’s games (for her eyes only). And we are using the results to prioritize her summer development plan.

NEXT ANALYSIS

I have already started analyzing the forwards who were picked for the U18 Camp. This is a little more complicated since there were 5 forwards selected. I will not be doing goalies because I don’t feel qualified to do so – and as mentioned previously, from what I’ve been told by goalie experts, a huge weight is given to one-on-one time spent with an evaluator when judging goalies.


More Thoughts on the 2023 USA Hockey 16/17 Girls Development Camp

This is the second post about the 2023 USA Hockey Girls 16/17 Development Camp

You can read the first post here

I wanted to get this out right after the camp, but didn’t have time before taking a week-long vacation. Here are some additional thoughts that I compiled during my time in Oxford:

  • The operational excellence of the camp was consistent the entire week – kudos to the organizers for such a well-run event, especially when compared to many other camps, showcases and tryouts I have seen on both the girls and boys sides of hockey.
  • Unlike at the previous camp I attended, I now appreciate all the different paths to hockey excellence there are in the U.S. I could see where all the players I had learned about over the last couple of years came from – hockey academies, top clubs, prep schools and Minnesota.
  • The last two days of games brought a whole slew of additional DI and DIII coaches to Oxford. I personally saw coaches from just about every school (over 30 DI teams) – however, there were some top programs where I didn’t see a representative (e.g. Wisconsin, UMD, Colgate, Northeastern), along with several NEWHA schools (note: they may have been there, but I just didn’t see them).
  • One DI coach did tell me that some of the players looked tired for the fourth game – while another thought there was better team play the last two games compared to the first two games.
  • 7 players who were at the 16/17 camp this year were at the 18s camp last year: 1 goalie, 2 defenders and 4 forwards. Two of them were selected from the 2022 16/17’s Camp to go to the 2022 18’s Camp – the other 5 went directly to 18s last year.
  • Here is the list of the 13 players selected to go to the 18’s camp
  • 5 of those 7 2022 18’s Camp players were selected to return to the 18’s camp this year from the 16/17s Camp
  • In general, I noticed a big difference between the average 2006 and the average 2007 player – the 2007s seemed to be weaker. That one extra year of development is noticeable not just in size, but in hockey IQ.
  • Interesting stat – the Girls 16/17 Camp averaged 4.1 goals per game (combined both teams) while the Boys 17 Camp averaged 10.0 goals/gm and the Boys 16 Camp averaged 8.6 goals per game. Significantly less scoring on the girls side.
  • Unofficially, I estimated that about 80% of point shots were blocked or never reached the net – a surprisingly low success rate for this level of play.
  • It seems that just watching games isn’t sufficient to judge players – while games are important, there are a lot of nuances you can pick up from practices that you can’t see on a live stream, and those likely factored into which players were selected for the 18s camp.
  • Depending on position and length of shifts, most players only had between 40 and 50 shifts to demonstrate their abilities over the course of 4 games. Which isn’t a lot, all things considered.
  • Based on talking with multiple parents and players there was certainly a mix of perspectives on the selections for 18’s camp.  I will hold off judgment on skaters until I spend more time reviewing video of the players selected in comparison to other top players who were not selected.  I do not feel qualified to analyze goalies, especially based on past conversations with expert goalie coaches – but I do know that you can’t just rely on game performance in goalie evaluations.
  • I can’t include everything I want to discuss in this post, so I am going to publish additional posts sourced from the camp including:
    • A candid conversation with a DI coach on their detailed recruiting process for their 2025 recruiting class
    • Applying some analytics to the players selected from the 16/17 camp for the 18’s Girls Camp
    • My thoughts on the 16/17 Camp feedback process – which is dependent on receiving the official player feedback report via snail mail expected sometime this week.



Summer Hockey Development Plans

How I helped create a summer training plan for my kids

Since both my kids returned from school, I have been very focused on helping them figure out what to work on this summer. Each of them has a big tryout to prepare for – in addition to continued development for next season. My kids are completely different players. One is a forward, the other a defender. One is above average in size, the other is slightly undersized. One is a lefty, the other a righty.

After re-watching 4 or 5 games for each kid from mid-to-late season, I was able to identify several key areas in which they had a pattern of underperforming. But then, since I am not really a hockey coach, I needed to figure out how they could improve their performance in those areas. Specifically, I followed the methodology I previously discussed of tracking high-frequency events and success rates, based on the teachings of Darryl Belfry.
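To make the high-frequency-events idea concrete, here is a rough sketch of how success rates could be compared against a benchmark player. The event names and counts are entirely invented; the mechanics – attempts, successes and rate gaps – are the point.

```python
# For each repeated game event, track (successes, attempts) and compare
# success rates against a benchmark player. All numbers are made up.

def success_rates(events):
    """Convert (successes, attempts) pairs into success-rate fractions."""
    return {name: successes / attempts
            for name, (successes, attempts) in events.items()}

my_kid = {"breakout pass": (11, 20), "o-zone entry": (6, 15)}
benchmark = {"breakout pass": (17, 20), "o-zone entry": (12, 15)}

# Gap between the benchmark player's rate and my kid's rate per event
gaps = {name: success_rates(benchmark)[name] - rate
        for name, rate in success_rates(my_kid).items()}

# The largest gap becomes the first priority in the development plan
priority = max(gaps, key=gaps.get)
print(priority, round(gaps[priority], 2))
```

Because these events happen dozens of times per game, even 3-5 games of video gives enough attempts for the rates to be meaningful.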

I am not sure we figured out the secret sauce, but I wanted to share my research methodology and how it translated into an action plan.

For each of my kids, I chose 2 or 3 players who I knew were clearly more successful in those key areas. All of them would be considered top players at the USA Hockey national level. As a result, finding historical video of those players on HockeyTV, LiveBarn or from the recent USA Hockey Nationals was not difficult. Once again, I watched 3-5 games for each benchmark player to see how they handled the same key situations as my son or daughter. What I learned was enlightening.

To provide one specific example, I watched video via HockeyTV of Caroline Harvey (Olympic medalist and recent rookie of the year at Wisconsin) way back during her time at Bishop Kearney Selects through to her games at the U18 USA Hockey Development camp in 2019. Seeing how she handled similar game situations provided excellent contrast to my daughter’s play.  The way KK could handle the puck and find time and space at that young age was truly impressive – and makes it very easy to understand why she is a generational talent.

For each player under analysis, patterns and insights emerge after 2 or 3 games. Each player is different, and I found there was at least one attribute for each player that made them special and worth emulating.

Note: this was not a one-day exercise watching all the games and collecting video snippets to review/ edit at a later time. It took several days to watch the video for each player.

I then spent time individually with my kids over the course of a few days to discuss the areas I recommended they focus on (most of which they already knew). This included showing them video of themselves not succeeding (which they did not enjoy) and then showing them clips of the benchmark players handling similar situations successfully. We are still early in the summer, but both kids have been working on these areas by themselves and with their skills coaches.

We shall see how effective this whole process is when we get to the fall, since I have no expectations that my kids will see immediate results.  But one of the key learnings for me about this whole exercise was not to depend on my kids’ team coaches for their development plans and how to implement them (as I have alluded to in a previous post about hockey development plans).


Observations from the 2023 USA Hockey Pacific District Camp

Earlier this month my daughter attended the USA Hockey Pacific District Camp for the third and final time (she’s aging out of the U18 events).  Now that the results have been posted, I am posting my thoughts on this year’s event. Feel free to read my previous summaries from the 2021 camp and 2022 camp to understand the three year experience.

Overall, operationally speaking, this was clearly the best run district camp of the three she attended.

Just like previous years, there were three practice/skills sessions and three games. The practice/skills sessions were well organized and structured – and in my opinion, allowed the evaluators to see how players performed both offensively and defensively beyond just the games.

More Teams

There were some significant changes from previous years. First, the number of teams for the 16/17 age group was increased from 4 to 6 (the 15’s age group had 4 teams, similar to last year). There are pros and cons to increasing the number of players invited to attend. However, on balance, as we try to grow the girls game on the west coast, I think it worked out just fine. The overall level of play may have been a little diluted, but the goodwill from attending the event works for me. Plus, the extra money it generated allowed more USA Hockey staff to attend from all over the country.

More Coaches

Unlike the last couple of years, where there seemed to be only 2-4 coaches watching from the stands while another 2 coached from the bench, this year there always seemed to be at least 6-8 coaches scouting from the roped-off coaches section in the stands. Another big change was that the list of coaches and their roles during the weekend was shared with all attendees via email. In the past, I had to work hard to identify who all the coaches were and decipher the roles they played. The day after camp ended, we were emailed the full list of coaches, where they were from and what role they played (evaluator, volunteer, USA Hockey staff) – which was awesome. No more guessing.

The only complaint I heard from several parents (via their daughters) was that some of the coaches seemed to be over-coaching on the ice. There were lots of times a coach would stop a drill and call everyone over, or give detailed feedback to a specific player. Feedback is good – I love player feedback – but at an event like Districts, players don’t want drill-related feedback from every coach they interact with. What players really want is feedback on how to improve their overall game.

Same Number of National Camp Spots

I am not sure what players and parents expected in terms of realistically making the USA Hockey National Camps, but the odds aren’t good for most players.  Here are the numbers of National invites (based on % of registrations of girls in the Pacific District):

Notes:

  1. Only one 2008 forward was selected to go straight to the 18s Camp (last year, 1 F and 1 D went straight to 18s)
  2. Goalies are selected at the national level and not dependent on the proportion of district registrations
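For what it’s worth, an allocation proportional to registrations like the one described above is straightforward to sketch. The district registration counts and total invite number below are entirely made up – USA Hockey’s actual figures and rounding rules may differ.

```python
def proportional_invites(registrations, total_invites):
    """Split a fixed number of invites across districts in proportion
    to each district's share of total registrations."""
    total = sum(registrations.values())
    return {district: round(total_invites * count / total)
            for district, count in registrations.items()}

# Hypothetical registration counts for illustration only:
registrations = {"Pacific": 900, "Minnesota": 6000, "New England": 3100}
print(proportional_invites(registrations, 200))
```

Note that per-district rounding can make the allocated total drift by a spot or two from the target, which may explain some of the odd invite counts smaller districts see.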

So hopefully most players – especially those who were invited from the alternate lists (or not even originally selected) – understood they were long shots to make it to National Camp and were just happy to go to Las Vegas.

Goalie Development

Another positive from the event: I talked to the district’s goalie coach, and she explained how they evaluate goalies, the process of providing goalies feedback, and how they track goalie development from year to year. I wish they would have done something similar for skaters – because in the 3 years we’ve gone, there has been no proactive mechanism for skaters to receive feedback from the event.

A few other points:

  • Games were two 32-minute running-time halves – 2 minutes more than last year
  • The refs were less noticeable this year compared to last year.  Which is a good thing.
  • The jerseys were 100 times nicer than in previous years (no embarrassing mismatched jerseys and socks like last year) – with a numbering scheme that made it clear who were 2006s and who were 2007s.
  • It would have been nice to also have the jersey #s included in the roster lists that were sent out, so parents didn’t need to figure out who the players were by themselves
  • Everyone had to travel to Vegas for the weekend, with many coming from out of district. I hope parents and players felt the total cost of the weekend was worth it. Unless you were driving from California, the weekend had to be super expensive.