Authors: Gaetan Martini, M.Sc., JF Brunelle, M.Sc., François Trudeau, Ph.D., & Jean Lemoyne, Ph.D.

Corresponding Author:
Jean Lemoyne, Ph.D.
Department of Human Kinetics [Sciences de l’activité physique]
Université du Québec à Trois-Rivières
3351, des Forges, Trois-Rivières (Québec) Canada G9A 5H7
jean.lemoyne@uqtr.ca

Gaetan Martini is a graduate student (master's degree in exercise science) who works in the field of fitness testing and sport training. Jean Lemoyne is a professor in the Department of Human Kinetics at Université du Québec à Trois-Rivières (Canada) and works in the field of quantitative research in sport sciences. JF Brunelle is a graduate student and physical preparation specialist who works with the UQTR varsity teams. François Trudeau is a professor at UQTR (Human Kinetics) and a certified exercise physiologist.

Measuring ice hockey skills in a repeated measures testing context: The effects of fatigue on skating efficiency, passing, agility and shooting

ABSTRACT
Purpose: Ice hockey testing traditionally consists of isolated, skill-specific tests performed in relatively artificial contexts. Global testing approaches should offer an improved assessment of players’ skills and of performance fluctuations during a hockey game. This study aims to measure ice hockey players’ skills and analyze their fluctuations via a protocol that reproduces the demands of a hockey game. Methods: Fifty-nine hockey players (14.6 ± 2.1 years) participated in the study. The protocol involved four repeated measures assessing five components: speed, acceleration, passing, agility, and shooting, separated by supervised 2-minute rest periods. Descriptive statistics and repeated-measures ANOVAs were used to analyze performance fluctuations. Results: Findings revealed that the best scores were obtained at the first and second repetitions. A significant decline in performance was observed for speed, acceleration, and shooting (p < .01). Conversely, participants seemed to adapt to the puck control and passing station, becoming faster without a decline in puck control or passing accuracy. Perceived exertion and recovery time increased during the protocol. Conclusions: In summary, performance was affected by fatigue from the third repetition of the testing protocol onward, an effect that should be considered when assessing players’ skills. This study demonstrated the feasibility of an on-ice testing protocol that evaluates players in a hockey-specific context. Applications in sport: This study demonstrated the feasibility of an “on-ice” testing protocol that represents a more realistic context for measuring players’ abilities. Such protocols allow coaches to evaluate the effects of fatigue on multiple determinants associated with performance in ice hockey.

Keywords: Ice Hockey, Testing Protocol, Fitness, Fatigue, Skills assessment

INTRODUCTION
Ice hockey is a fast-paced team sport characterized by multiple direction changes, resulting in numerous transitions during play (24). Weineck (46) maintains that each sport’s performance parameters are inseparable when sports organizations seek to help their athletes develop or perform at their best. Because of the broad diversity of tasks it involves, ice hockey is a physiologically, technically, and tactically demanding sport (2,3). In Canada, hockey skills (e.g., skating, shooting) and the related technical aspects account for between 35% (in 16-year-olds) and 85% (in 7- to 12-year-olds) of the national player development program (16,17). Ice hockey’s multidimensional demands largely explain why coaches and stakeholders often assess players on specific aspects of the game. In general, the purpose of ice hockey testing is to evaluate players’ current level of performance, potential, and developmental status. The results of on-ice testing can therefore guide stakeholders (coaches, managers, etc.) in selecting players and/or drafting prospects (7,11,43). Moreover, on-ice testing can also be useful for gauging the effects of off-ice training and is a helpful tool for conditioning coaches and trainers.

Over the last two decades, “on-ice” testing has attracted considerable attention in many fields of research. Before then, most hockey testing studies focused on exercise physiology and the anthropometric measures of hockey players (26,31,37,42). For example, Green and colleagues established relationships between on-ice performance and the physiological profile of elite collegiate players (11). Earlier research also focused on the transferability of physical attributes to on-ice performance. In this regard, Burr showed that results obtained from fitness testing could be an indicator of on-ice hockey performance potential (7). Furthermore, these indicators appeared to vary by player position. In the Burr study (7), for example, players who performed better in the vertical jump test were mainly those demonstrating good on-ice skating speed and acceleration. In 2015, Janot and colleagues (20) showed that some off-ice parameters were good indicators of skating performance among Division III collegiate players. However, some contradictions remain, especially regarding the transfer of off-ice physical attributes to on-ice performance. It has been shown that off-ice testing performance was not the best indicator of National Hockey League (NHL) draft order, which reveals a problem when attempting to establish relationships between off-ice testing results and on-ice potential (43). One reason explaining such a gap may reside in the specific context of on-ice performance. For example, testing a player’s ability to make effective passes in an isolated test procedure may produce a biased picture of his capabilities compared with a more realistic context (e.g., a game situation). In that sense, performing in isolated tests may not reflect the multi-task context of hockey games (43). Such conclusions invite researchers to establish fitness and skills standards that are closer to on-ice performance.

In many team sports such as hockey, one aspect of the specific game context is the ability to repeat on-ice sprints (3,8,39,45). Carey et al. (8) found that forward skating times increased progressively from the first repetition to the last, which appeared to indicate on-ice fatigue when repeated sprints were performed. As observed with physiological testing, however, a gap exists between on-ice tests and on-ice performance, in that test results may not be representative of a typical shift on the ice (28). More specifically, a repeated sprint protocol does not represent a typical full shift, because high-intensity sprints occur for only 5% of the total on-ice time of professional ice hockey players (3). More recently, Peterson et al. (29) compensated in part for the test/performance gap by creating a repeated shift ability test for ice hockey players that mimics more specific physical demands and reproduces the patterns of on-ice movements. These findings are promising and encourage the inclusion of repeated sprints in testing protocols.

Additional performance-related dimensions need to be taken into account, especially if coaches aim to produce a detailed picture of a player’s aptitudes in hockey. The technical aspects of hockey (agility, shooting, etc.), for example, must be considered with particular regard to identifying talent and assessing players’ developmental status in their sport. To this end, batteries of field tests for measuring puck handling and skating agility have been developed to facilitate performance assessment for coaches and hockey organizations (14-16,18,19,22,23,27). Despite the feasibility of such tests, however, few were developed through an exhaustive validation process, and little research has been done on the assessment of on-ice skills in hockey players. The Cornering S-turn agility test (12) and the on-ice pro agility test (27) were developed and validated as on-ice agility tests but are not widely used at high playing levels such as the NHL and other professional leagues (27). To the researchers’ knowledge, no previous research has focused on the evolution of agility in a repeated context.

A major limitation of on-ice tests, moreover, is that they tend to be done separately (e.g., one after the other), which may represent an “isolated” way of assessing hockey players’ attributes. Many experts therefore recommend testing sport-specific abilities more dynamically (29). Studies have already been conducted to create and validate field tests reproducing physiological and technical sport demands in repeated and non-repeated contexts in sports such as handball (44), rugby (1,36), basketball (35), and soccer (34). Accordingly, we believe that developing a protocol that takes into account the physiological and technical demands of ice hockey would help address this knowledge gap. More thorough testing (more repetitions, multi-task tests) is likely to provide more information about players’ abilities. Furthermore, many existing tests are based on empirical data alone, and few have employed rigorous validation procedures. In the researchers’ view, reproducing typical on-ice shift time and intensity and off-ice recovery is justified as a way to test the validity of a more dynamic testing protocol adapted to the demands of ice hockey (24). This procedure should offer appreciable information to coaches and stakeholders concerned with player assessment. In summary, the researchers believe that, in order to be more representative of a specific game context, on-ice testing protocols should: 1) include repeated sprints to induce a sufficient level of fatigue, and 2) assess multiple technical abilities to provide a multidimensional appreciation of players’ performance potential.

The purpose of this study is to implement an on-ice testing protocol to assess hockey players’ abilities in a more realistic context than traditional isolated testing procedures. This protocol aims to describe how hockey players perform during a testing protocol resembling a real-time game context. More specifically, this investigation has two objectives: to determine 1) whether the suggested protocol shows similarities with the demands of the game, using time and intensity as indicators, and 2) whether and how repeating the protocol affects performance. Indeed, the testing protocol aims to demonstrate how players’ abilities evolve in a context that simulates repeated efforts and replicates the demands of hockey games.

METHODS

Participants
Fifty-nine male hockey players (14.6 ± 2.1 years old) agreed to take part in the testing protocol. The project was approved by the institution’s research ethics board (CER-15-210-07.06). An information letter and consent form were sent to the parents of players under 14 years old. Participants were required to sign the consent form and return it to the research staff on testing day. Participants were enrolled in different minor hockey programs in the province of Quebec (Canada). Testing was done in four different sessions, but each participant’s data were obtained during a single hockey practice. As Table 1 shows, anthropometric measures and details on level of participation were collected before administration of the testing protocol. Participants belonged to different hockey categories (from Pee Wee [11-12 years old] to Junior [17-18 years old]). For feasibility reasons (e.g., staff availability), tests were administered on different days.

Table 1
Measures and instruments
The testing protocol, illustrated in Figure 1, was designed to assess four hockey skills: 1) skating speed, 2) puck control and passing efficiency, 3) skating agility, and 4) shooting accuracy. An additional component, rating of perceived exertion (RPE), was included in the testing procedure for the purpose of comparing the protocol with hockey shift demands (based on antecedent studies). The protocol consists of four stations. To achieve its full objectives, participants were required to complete each station continuously, for a total of four repetitions separated by 2-minute supervised rest periods. This scenario made it possible to assess the possible effect of fatigue on players’ abilities and skills. More detailed information on the procedures is given following the description of each measure.

Table 2

Skating speed and acceleration
As shown in Figure 1 (Station 1), skating abilities were assessed using the skating acceleration test developed by Bracko (4). Intermittent ice skating tests like these have shown very good validity and reliability in antecedent studies (6). A photoelectric cell and timing system (Brower Timing System, Matsport Training, 2010) were installed on the ice at the starting (0 m), intermediate (6.1 m), and finish (44.8 m) lines. Data were collected with the timing system, and time was recorded in seconds. For each repetition, two measures were recorded: total sprint time (over the full 44.8 m) and acceleration time (time for the first 6.1 m segment).
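As a worked illustration of how these two measures are derived from the gate splits, the following minimal sketch (in Python) computes acceleration time, total sprint time, and average skating speed for one trial; the example split times are invented for illustration and are not the study's data.

```python
# Minimal sketch of deriving the two speed measures from timing-gate splits.
# Gate positions follow the protocol (0 m start, 6.1 m intermediate, 44.8 m finish);
# the split times below are hypothetical example values.

INTERMEDIATE_GATE_M = 6.1
FINISH_GATE_M = 44.8

# Elapsed times (s) recorded when the skater crosses each gate.
time_at_intermediate_s = 1.45   # hypothetical split
time_at_finish_s = 6.80         # hypothetical split

acceleration_time_s = time_at_intermediate_s            # time over the first 6.1 m
total_sprint_time_s = time_at_finish_s                   # time over the full 44.8 m
avg_speed_m_per_s = FINISH_GATE_M / total_sprint_time_s  # about 6.6 m/s in this example

print(f"Acceleration time (0-6.1 m): {acceleration_time_s:.2f} s")
print(f"Total sprint time (0-44.8 m): {total_sprint_time_s:.2f} s")
print(f"Average speed: {avg_speed_m_per_s:.2f} m/s")
```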

Puck control and passing abilities
These two abilities (Figure 1, Station 2) were measured using an adapted version of the Hockey Canada protocol (15), which is part of the Hockey Canada battery of tests for assessing player development. In the puck control-passing test, participants are required to execute two tasks: first, skating with the puck and completing two tight turns around two cones; second, executing a pass towards a fixed target. Participants were required to complete two circuits for each repetition of the protocol. In summary, three measures were recorded at the puck control/passing station. First, the total time for the complete circuit (two circuits: puck handling + passes) was recorded by a coach. Second, the same coach rated each player’s puck control on a 5-point scale (1 = poor, 2 = fair, 3 = average, 4 = good, 5 = excellent). Finally, the coach scored passing accuracy for each passing situation (0 = missed, 1 = partially on target, 2 = on target), for a maximum of 4 points per repetition (2 × 2 points). All coaches were trained in the use of the skating and passing evaluation scales.
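For illustration only, the sketch below shows how a coach's score sheet for one repetition of this station could be tallied; the variable names and example values are hypothetical and are not taken from the Hockey Canada scoring form.

```python
# Hypothetical tally for one repetition of the puck control/passing station.
# Scales follow the description above: puck control rated 1-5, each pass scored 0-2.

circuit_time_s = 21.4        # total time for the two circuits, recorded by the coach
puck_control_rating = 4      # 1 = poor ... 5 = excellent
pass_scores = [2, 1]         # one score per pass (0 = missed, 1 = partial, 2 = on target)

passing_points = sum(pass_scores)   # maximum of 4 points per repetition (2 x 2 points)
print(f"Time: {circuit_time_s:.1f} s, puck control: {puck_control_rating}/5, "
      f"passing: {passing_points}/4")
```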

Skating agility
Agility was assessed using the International Ice Hockey Federation (IIHF) skating agility protocol (18). This protocol (Figure 1, Station 3) is frequently used by coaches at multiple levels and serves as a good basis for assessing players’ skating agility. Two measures were collected for skating agility. First, a coach responsible for this station timed the skating circuit. Second, the coach rated skating agility on a 5-point scale (1 = poor, 2 = fair, 3 = average, 4 = good, 5 = excellent). Criteria for good skating execution were distributed to coaches before testing.

Shooting accuracy
Shooting abilities (Figure 1, Station 4) were measured with the static shooting test developed by Hockey Canada (15). At this last station, participants took five shots towards the net, aiming at specific targets (the four corners). Each shot was scored according to three possible outcomes: 1) missed the net (0 points), 2) hit the net but not the target (1 point), and 3) hit the target (2 points). The coach recorded each player’s results for each repetition.

Rating of perceived exertion
Exertion was monitored using two methods. First, participants rated their perceived exertion after completing each circuit using the Borg CR-10 scale (5); this 10-point scale has shown good validity and reliability with similar populations (30). Second, heart rate was recorded during the protocol as an index of cardiovascular demand.

Figure 1

Statistical analyses

All statistical analyses were conducted with the Statistical Package for the Social Sciences (SPSS, IBM, version 23, Chicago, Illinois). First, the researchers verified the normality assumptions for each measure by conducting the Shapiro-Wilk test. In summary, each measured category respected normality assumptions (p > .05). Skewness and kurtosis values were also examined for each measure; all values were under 1.0, suggesting normal distributions (40). Missing data and outliers were negligible (except for heart-rate measures), representing less than 1% of the total data. Next, descriptive statistics were calculated for each measure. Descriptive statistics, presented in Table 2, represent participants’ level of achievement at each station and correspond to the first objective of this study.

To achieve this study’s second objective, we assessed the repetition effect on participants’ abilities by performing repeated-measures analyses. For continuous measures (e.g., speed, acceleration, and agility times), repeated-measures analyses of variance (RM-ANOVAs) with Bonferroni corrections were conducted. For the qualitative assessments (puck control and agility ratings), we conducted the non-parametric equivalent of the RM-ANOVA, the Friedman test. A probability value of p < .05 was used to determine significant effects (rejection of the null hypothesis).
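For readers who wish to reproduce this sequence of analyses outside SPSS, the following is a minimal sketch in Python (pandas, SciPy, statsmodels). The file name, column names, and data layout are hypothetical assumptions for illustration, not the study's actual data set.

```python
# Minimal sketch (not the authors' SPSS procedure) of the analysis sequence described above.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Long-format data: one row per player per repetition of the protocol.
# Hypothetical columns: player_id, repetition (1-4), sprint_time, accel_time,
# agility_time, puck_control_rating, shooting_score
df = pd.read_csv("protocol_results.csv")  # hypothetical file

# 1) Normality check (Shapiro-Wilk) for a given measure, per repetition
for rep, grp in df.groupby("repetition"):
    w, p = stats.shapiro(grp["sprint_time"])
    print(f"Repetition {rep}: W = {w:.3f}, p = {p:.3f}")

# 2) Repeated-measures ANOVA for a continuous measure (e.g., sprint time);
#    Bonferroni-corrected pairwise comparisons between repetitions would follow.
aov = AnovaRM(df, depvar="sprint_time", subject="player_id",
              within=["repetition"]).fit()
print(aov)

# 3) Friedman test for an ordinal rating (e.g., puck-control rating)
wide = df.pivot(index="player_id", columns="repetition",
                values="puck_control_rating")
chi2, p = stats.friedmanchisquare(*[wide[c] for c in wide.columns])
print(f"Friedman chi-square = {chi2:.2f}, p = {p:.3f}")
```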

RESULTS

Descriptive statistics
Descriptive statistics are presented in Table 3 and represent scores from the first repetition. In summary, participants took between 80 and 136 seconds to complete one repetition of the entire protocol, averaging about 1 minute 45 seconds (± 15 seconds) for the four stations. As expected, the speed-acceleration station was the shortest test, with an average of 6.8 seconds (± 0.6). The second station (puck control) was longer, at about 22 seconds (21.40 ± 3.68). Table 3 shows that participants spent an average of 1 minute in action, with about 45 seconds (± 14 seconds) in transitions (going from one station to another). Recovery time was well supervised, averaging 2 minutes per rest period, except for one case in which a participant was injured during his first repetition and required additional recovery time (which explains the 58-second standard deviation).

Table 3

Table 4 presents the results obtained at each station during the first assessment. At the second station, participants achieved an average passing score of 2.6 (with four attempts). The qualitative ratings (Stations 2 and 3) were similar at both stations, with average scores of about 3.0 on a 5-point scale (Mcontrol = 3.18; Mskating = 2.81). Shooting efficiency averaged 2.82 shots out of five attempts. At the end of the first circuit, perceived exertion had an average value of 6.55 points (on a 10-point scale).

Table 4

Repeated measures analyses
Table 5 presents the results of the repeated-measures analyses, which show that performance during the sprint test declined significantly over the course of the testing protocol (F values ranging from 5.2 to 13.6, p < .01). Post hoc analyses reveal that the decline occurred at the third and fourth repetitions. Interestingly, an inverse pattern was observed at Station 2. At the puck control station, participants took less time to complete the puck handling circuit (F = 3.04, p < .01) without decreasing their passing accuracy or puck control ability (both p > .05). These results suggest that participants adapted to the second station by becoming faster while remaining just as efficient. At Station 3, skating agility (time and overall skating agility) remained stable, with no significant changes between repetitions (p > .05). At Station 4, however, shooting accuracy declined (F = 3.02, p < .01), with a decrease in efficiency from the third repetition onward. Passing and skating agility (time and efficiency) remained stable throughout the protocol.

Table 5

In summary, the average time to complete each station remained stable, with the exception of the second station (F = 3.79, p < .01). Transition time between stations remained stable (F = 1.26, p > .05), and more recovery time was observed at the second repetition (because of the injured player). Post hoc analyses revealed no major differences between recovery periods 2, 3, and 4. The large increase observed at the second repetition is consistent with measures of perceived exertion, which increased significantly from one repetition to the next (χ² = 55.03, p < .001). Figure 2 illustrates the trajectory of change for each measure of the testing protocol.

Figure 2

DISCUSSION

In ice hockey, on-ice testing has attracted attention, particularly for the purpose of establishing relationships between testing and on-ice performance. Research has shown that isolated testing may not necessarily indicate a player’s on-ice performance potential. As mentioned earlier, one of the main challenges here is the development of valid testing procedures that more accurately reflect players’ potential to perform in a game-specific context. The goal of the present investigation was to develop an on-ice testing protocol that reproduces hockey game intensity while simultaneously allowing for the assessment of on-ice abilities. To this end, we developed a repeated-measures protocol that assesses four hockey-specific abilities: skating speed, puck control, skating agility, and shooting. This study makes it possible to analyze young hockey players’ performance in these four categories of abilities. In addition, our testing protocol allowed us to assess the impact of repeated testing on these hockey skills.

Does the testing protocol represent a real-game context?
Results show that the protocol used in this study had demands resembling those of a game context. In terms of exertion duration, the approximately 1-minute effort that players took to complete the entire circuit compares with what is observed during a full shift in a hockey game (4). In terms of intensity, our heart-rate data, viewed as an index of cardiovascular demand, show that the protocol’s effort is similar to that of a game context, consistent with measures obtained in other studies (38,39). Furthermore, the rating of perceived exertion was similar to values reported in previous studies (21). In this regard, we can assume that the testing protocol’s demands with regard to duration, intensity, and perceived effort are relatively similar to those of games. The researchers must be cautious in their interpretations, however, and take other objective measures into consideration. For example, play analysis (e.g., the number of direction changes and the presence of opposition) would be needed to confirm that the testing protocol accurately represents the demands of the game.

The protocol’s “repetition effects” on physical and technical attributes
With regard to repeated-measures testing, the results are partly consistent with those reported by Carey (8), which showed a decline in performance starting at the second repetition. In the current study, the researchers observed a decrease in speed, acceleration, and shooting, although the statistically significant decline appeared only from the third repetition. Hagg and colleagues (13) showed a decrease in skating efficiency among a cohort of female hockey players; interestingly, that decrease was also observed from the third repetition. Past research (10) suggests that such a decrease in performance is to be expected, in that the onset of fatigue (higher perceived exertion) was associated with lower speed and a significant decline in shooting accuracy. Surprisingly, performance at the second station (puck control and passing) improved during the testing protocol. Such an increase in performance suggests that players learned how to perform more efficiently at this station. Accordingly, players took less time to complete the puck control circuit, and their passing performance remained stable over time (i.e., non-significant changes over time). This adaptive reaction has been observed in other repeated-measures tests (41). In such cases, it is plausible that, for certain technical skills, more repetitions are needed before performance levels diminish. In contrast, other abilities such as skating agility remained stable during the testing protocol, with no significant changes in performance. Responses in motor abilities may vary owing to task complexity, as participants focus more attention on certain complex tasks (e.g., puck control) and consequently maintain (or improve) their abilities during the protocol. The decrease in shooting efficiency indicates that shooting accuracy was affected in a repeated-measures protocol. These findings confirm those of previous research conducted among basketball (25) and water polo (33) players.

Contribution of the study
The present study offers more in-depth knowledge of the potential of on-ice, context-specific testing protocols. First, it identifies a context that enables the assessment of many attributes at once. As a result, this assessment protocol is less time-consuming and allows players to be tested at multiple stages of their development. Thanks to the simulation of a specific game context, stakeholders can assess their players’ level in four categories of abilities in a repeated-effort context. Methodologically, this study proved helpful in a variety of ways. The size of the sample demonstrated the study’s feasibility and offered a good picture of young players’ abilities at different levels of competition. Nevertheless, further analyses are needed to establish category- and/or position-specific standards.

Limitations and future directions
Despite its contribution, this study has certain limitations. The first relates to methodology and involves the validation of the testing circuit. The researchers assumed the validity of each station by using existing tests for ice hockey; however, for reasons of feasibility and participant availability, the researchers conducted this protocol without assessing each station’s reliability. A test-retest reliability study, especially with different sub-populations (e.g., age groups, levels of expertise), should be conducted in future work. Another limitation concerns ice quality during the testing protocol. Ice quality tends to decline during such protocols and may affect the way participants respond to skating tests involving, for example, tight turns, acceleration, and changes of direction. Ideally, the ice should be resurfaced after each testing sequence (e.g., once a group has completed its four-repetition circuit) to facilitate skating responsiveness. This was not feasible, however, given that the tests were conducted in real-world conditions. A further limitation is players’ motivation to execute the testing protocol. This aspect should not be overlooked, because some players were enthusiastic about completing the circuit to the best of their ability, while others appeared less interested. A good warm-up session before testing, combined with clear explanations about the importance of performing as well as possible, is needed to address this issue. This study points to different possibilities for continuing this avenue of research. Further analyses should focus on the variables likely to influence participants’ responses to a testing protocol of this nature. As in the field of exercise physiology (9), standards could be established based on players’ positions (e.g., forward, defence). An examination of gender differences would also be relevant owing to the growth of women’s hockey. Another aspect to consider is comparisons across age groups and competition levels. Overall, a more specific on-ice testing protocol appears to be a promising approach for player selection and evaluation processes.

CONCLUSIONS
Developing sport-specific testing protocols remains a challenge for researchers. However, this study is the first to demonstrate that an on-ice protocol makes it possible to assess multiple aspects of players’ abilities relative to performance. Repeated-measures testing showed that players’ performance tends to decline over time, although not for all attributes. Coaches may find that a repeated simulated hockey shift is a good protocol for specifically evaluating players’ on-ice physical performance and skills. Additionally, these findings could help coaches make decisions about on-ice training, games, and strength and conditioning. Using this specific protocol, they could measure players’ on-ice response to fatigue, that is, whether or not they are capable of maintaining physical capabilities such as acceleration, power, and speed. Coaches could also learn whether fatigue negatively impacts skills such as skating, puck control, passing, and shooting accuracy. Accordingly, they could select players and change line configurations during key moments of the game by sending onto the ice players who are capable of sustaining good overall abilities despite the onset of fatigue. Finally, testing results could allow coaches to adjust training to players’ weaknesses, whether on the ice or in strength and conditioning, in order to improve sport-specific fitness and reduce the risk of injury.

APPLICATIONS IN SPORT
In ice hockey, on-ice testing is traditionally used in isolated contexts, which do not always represent what is observed in game situations. Developing a testing protocol that simulates game-specific demands has interesting potential for evaluating how players react in more realistic conditions. The administration of a “repeated measures” testing protocol allows coaches to assess their players’ abilities in a different way. In this study, results showed that three abilities tended to decrease from the third stage of assessment: 1) skating speed, 2) acceleration, and 3) shooting performance. Such results are promising for coaches who are interested in evaluating their players’ abilities in contexts where fatigue occurs. They are also useful for team trainers, who could design training programs informed by on-ice testing results.

ACKNOWLEDGMENTS
We wish to thank all coaches, schools, and athletes for their participation in our study and in the testing protocols. No funding was obtained for this study. The authors confirm having no conflict of interest regarding this research.

REFERENCES

  1. Austin, D. J., Gabbett, T. J., & Jenkins, D. G. (2013). Reliability and sensitivity of a repeated high-intensity exercise performance test for rugby league and rugby union. Journal of Strength and Conditioning Research, 27(4), 1128-1135. doi:10.1519/JSC.0b013e31825fe941
  2. Bishop, D., Spencer, M., Duffield, R., & Lawrence, S. (2001). The validity of a repeated sprint ability test. Journal of Science and Medicine in Sport, 4(1), 19-29.
  3. Bracko, M. R. (2001). On-ice performance characteristics of elite and non-elite women’s ice hockey players. Journal of Science and Medicine in Sport, 15(1), 42-
  4. Bracko, M. R., & Fellingham, G. W. (2001). Comparison of physical performance characteristics of female and male ice hockey players. Pediatric Exercise Science, 13(1), 26-34.
  5. Borg, G. (1998). Borg’s perceived exertion and pain scales. Human Kinetics.
  6. Buchheit, M., Spencer, M., & Ahmaidi, S. (2010). Reliability, usefulness, and validity of a repeated sprint and jump ability test. International Journal Sports Physiology and Performance, 5(1), 3-17.
  7. Burr, J. F., Jamnik, R. K., Baker, J., Macpherson, A., Gledhill, N., & McGuire, E. J. (2008). Relationship of physical fitness test results and hockey playing potential in elite level ice hockey players. Journal of Strength and Conditioning Research, 22(5), 1535-1543. doi:10.1519/JSC.0b013e318181ac20
  8. Carey, D. G., Drake, M. M., Pliego, G. J., & Raymond, R. L. (2007). Do hockey players need aerobic fitness? Relation between VO2max and fatigue during high-intensity intermittent ice skating. Journal of Strength and Conditioning Research, 21(3), 963-966. doi:10.1519/r-18881.1
  9. Cox, M. H., Miles, D. S., Verde, T. J., & Rhodes, E. C. (1995). Applied physiology of ice hockey. Sports Medicine, 19(3), 184-201.
  10. Girard, O., Mendez-Villanueva, A., & Bishop, D. (2011). Repeated-sprint ability – part I: factors contributing to fatigue. Sports Medicine, 41(8), 673-694. doi: 10.2165/11590550000000000-00000
  11. Green, M. R., Pivarnik, J. M., Carrier, D. P., & Womack, C. J. (2006). Relationship between physiological profiles and on-ice performance of a National Collegiate Athletic Association division I hockey team. Journal of Strength and Conditioning Research, 20(1), 43-46.
  12. Greer, N., Blatherwick, J., Serfass, R., & Picconato, W. (1992). The effects of a hockey specific training program on the performance of bantam players. Canadian Journal of Applied Sports Sciences, 17(1), 65-69.
  13. Hagg, K., Wu, T., & Gervais, P. (2008). The effects of fatigue on skating mechanics in ice hockey. Journal of Biomechanics, 40 (S2), S761.
  14. Hermiston, R. T., Gratto, J., & Teno, T. (1979). Three hockey skills tests as predictors of hockey playing ability. Canadian Journal of Applied Sports Sciences, 4(1), 95-97.
  15. Hockey Canada. (2010). National Skills Standards & Testing Program.
  16. Hockey Canada. (2010). Plan de Hockey Canada pour le développement à long terme du joueur [Hockey Canada’s guidelines for long term athlete development].
  17. Hockey Québec. (2008). Plan de développement de l’excellence en hockey sur glace 2009-2013, exigences du sport de haut niveau [Developing excellence in ice hockey 2009-2013: high-level sport requirements].
  18. International Ice Hockey Federation. (2008). Skills Challenge Manual.
  19. International Ice Hockey Federation. (2014). Youth Olympic Games Skills Challenge 2016: Tests Protocol and Operations Manual.
  20. Janot, J. M., Beltz, N. M., & Dalleck, L. D. (2015). Multiple off-ice performance variables predict on-ice skating performance in male and female division III ice hockey players. Journal of Sports Science and Medicine, 14(3), 522-529.
  21. Lachaume, C. M., Trudeau, F., & Lemoyne, J. (2017). Energy expenditure by elite midget male ice hockey players in small-sided games. International Journal of Sports Science & Coaching, 12(4), 504-513.
  22. Leblanc, G. (2012). L’impact d’un entraînement pliométrique sur l’accomplissement d’un parcours représentatif d’une présence sur glace au hockey. [Outcome of a plyometric training program for ice hockey], Université du Québec à Montréal.
  23. Lee, C., Lee, S., & Yoo, J. (2014). The effect of a complex training program on skating abilities in ice hockey players. Journal of Physical Therapy Science, 26(4), 533-537. doi:10.1589/jpts.26.533
  24. Léger, L. (1979). Physiological aspects of ice hockey. In J. Almstedt (Ed.), Proceedings: 1978 National Coaches Certification Program Level 5 Seminar, University of Montreal, June 1978 (pp. 109-141). Ottawa: Canadian Amateur Hockey Association.
  25. Lyons, M., Al-Nakeeb, Y., & Nevill, A. (2006). The impact of moderate and high intensity total body fatigue on passing accuracy in expert and novice basketball players. Journal of Sports Science and Medicine, 5(2), 215-227.
  26. Montgomery, D. L. (2006). Physiological profile of professional hockey players – a longitudinal comparison. Applied Physiology, Nutrition, and Metabolism, 31(3), 181-185. doi: 10.1139/h06-012
  27. Nightingale, S., Miller, S., & Turner, A. (2013). The usefulness and reliability of fitness testing protocols for ice hockey players: a literature review. Journal of Strength and Conditioning Research, 27(6), 1742-1748. doi: 10.1519/JSC.0b013e3182736948
  28. Noonan, B. C. (2010). Intragame blood-lactate values during ice hockey and their relationships to commonly used hockey testing protocols. Journal of Strength and Conditioning Research, 24(9), 2290-2295. doi:10.1519/JSC.0b013e3181e99c4a
  29. Peterson, B. J., Fitzgerald, J. S., Dietz, C. C., Ziegler, K. S., Ingraham, S. J., Baker, S. E., & Snyder, E. M. (2015). Aerobic capacity is associated with improved repeated shift performance in hockey. Journal of Strength and Conditioning Research, 29(6), 1465-1472. doi:10.1519/jsc.0000000000000786
  30. Pfeiffer, K. A., Pivarnik, J. M., Womack, C. J., Reeves, M. J., & Malina, R. M. (2002). Reliability and validity of the Borg and OMNI rating of perceived exertion scales in adolescent girls. Medicine and Science in Sports and Exercise, 34(12), 2057-2061. doi: 10.1249/01.mss.0000039302.54267.bf
  31. Quinney, H. A., Dewart, R., Game, A., Snydmiller, G., Warburton, D., & Bell, G. (2008). A 26 year physiological description of a National Hockey League team. Applied Physiology, Nutrition, and Metabolism, 33(4), 753-760. doi: 10.1139/h08-051
  32. Rostgaard, T., Iaia, F. M., Simonsen, D. S., & Bangsbo, J. (2008). A test to evaluate the physical impact on technical performance in soccer. Journal of Strength and Conditioning Research, 22(1), 283-292.
  33. Royal, K. A., Farrow, D., Mujika, I., Halson, S. L., Pyne, D., & Abernethy, B. (2006). The effects of fatigue on decision making and shooting skill performance in water polo players. Journal of Sports Sciences, 24(8), 807-815.
  34. Russell, M., Benton, D., & Kingsley, M. (2010). Reliability and construct validity of soccer skills tests that measure passing, shooting, and dribbling. Journal of Sports Sciences, 28(13), 1399-1408. doi:10.1080/02640414.2010.511247
  35. Scanlan, A. T., Dascombe, B. J., & Reaburn, P. R. (2012). The construct and longitudinal validity of the basketball exercise simulation test. Journal of Strength and Conditioning Research, 26(2), 523-530. doi:10.1519/JSC.0b013e318220dfc0
  36. Singh, T. K., Guelfi, K. J., Landers, G., Dawson, B., & Bishop, D. (2010). Reliability of a contact and non-contact simulated team game circuit. Journal of Sports Science and Medicine, 9(4), 638-642.
  37. Spiering, B. A., Wilson, M. H., Judelson, D. A., & Rundell, K. W. (2003). Evaluation of cardiovascular demands of game play and practice in women’s ice hockey. Journal of Strength and Conditioning Research, 17(2), 329-333.
  38. Stanula, A., & Roczniok, R. (2014). Game intensity analysis of elite adolescent ice hockey players. Journal of Human Kinetics, 44, 211-221. doi: 10.2478/hukin-2014-0126
  39. Stanula, A., Roczniok, R., Maszczyk, A., Pietraszewski, P., & Zajac, A. (2014). The role of aerobic capacity in high-intensity intermittent efforts in ice-hockey. Biology of Sport, 31(3), 193-199. doi:10.5604/20831862.1111437
  40. Tabachnick, B. G., & Fidell, L. S. (2006). Using multivariate statistics. Boston, MA: Allyn & Bacon/Pearson Education.
  41. Tomac, Z., Hraski, Z., & Sporis, G. (2012). The assessment of preschool children’s motor skills after familiarization with motor tests. Journal of Strength and Conditioning Research, 26(7), 1792-1798. doi: 10.1519/JSC.0b013e318237ea3b
  42. Twist, P., & Rhodes, T. (1993). Exercise physiology: A Physiological Analysis of Ice Hockey Positions. Strength & Conditioning Journal, 15(6), 44-46.
  43. Vescovi, J. D., Murray, T. M., Fiala, K. A., & VanHeest, J. L. (2006). Off-ice performance and draft status of elite ice hockey players. International Journal of Sports Physiology and Performance, 1(3), 207-221.
  44. Wagner, H., Orwat, M., Hinz, M., Pfusterschmied, J., Bacharach, D. W., von Duvillard, S. P., & Muller, E. (2016). Testing Game-Based Performance in Team-Handball. Journal of Strength and Conditioning Research, 30(10), 2794-2801. doi:10.1519/jsc.0000000000000580
  45. Watson, R. C., & Sargeant, T. L. (1986). Laboratory and on-ice test comparisons of anaerobic power of ice hockey players. Canadian Journal of Applied Sports Sciences, 11(4), 218-224.
  46. Weineck, J. (1992). Biologie du sport. Vigot Editeurs.