Olympic Edition 2011

### The Second Annual Olympic Edition

**International Olympic Academy: 11th Joint International Session for Presidents or Directors of National Olympic Academies and Officials of National Olympic Committees**

#### Table of Contents

1. [Foreword – The Sport Journal](#forward)
2. [Introduction to the Vision, Mission and History of the International Olympic Academy](/article/introduction-vision-mission-and-history-international-olympic-academy)
3. [Opening Remarks of the 11th Joint International Session for Directors of National Olympic Academies by Mr. Isidoros Kouvelos](/article/ioa-president-s-opening-remarks-11th-joint-international-session-directors-national-olympic-)
4. [Medicine and the Olympic Games of Antiquity by Dr. Spyros Retsas](/article/medicine-and-olympic-games-antiquity)
5. [The Importance of New Forms of Technology in the Dissemination of Humanistic Ideas by Dr. T.J. Rosandich](/article/importance-new-forms-technology-dissemination-humanistic-ideas)
6. [The Digital Revolution Impact to Olympic Education by Dr. Axel Horn](http://thesportjournal.org/article/digital-revolution-impact-olympic-education)
7. [Interdisciplinary Approach of the Teaching of Olympic Principles to the Students by Dr. A. M. Najeeb](/article/interdisciplinary-approach-teaching-olympic-principles-students)
8. [Teaching the Olympic Values within the Educational System by Dr. Yohan Blondel](/article/teaching-olympic-values-within-educational-system)
9. [Youth Olympic Games – From Vision to Success by Mr. Ng Ser Miang](/article/youth-olympic-games-vision-success)
10. [The Role of Olympic Education in Today’s Sport World by Dr. Margaret Talbot](/files/olympic-edition/2011/The_Role_of_Olympic_Education_in_Today_s_Sport_World_by_Dr._Margaret_Talbot.pdf)
11. [Two United States Olympic Committee Olympism Programs](/article/two-united-states-olympic-committee-olympism-programs-team-usa-ambassador-program-and-olympi)
12. [IOA Closing Remarks on Behalf of the Lecturers by Dr. T.J. Rosandich](/article/closing-remarks-behalf-lecturers)
13. [IOA President’s Closing Remarks on the 11th Joint International Session for Directors of National Olympic Academies by Isidoros Kouvelos](/article/ioa-president-s-closing-remarks-11th-joint-international-session-directors-national-olympic-)


#### Foreword

This second annual special Olympic Edition from the United States Sports Academy’s _The Sport Journal_ is dedicated to the International Olympic Academy (IOA) and its worldwide programs.

In May, the International Olympic Academy held its 11th Joint International Session for Presidents or Directors of National Olympic Academies and Officials of National Olympic Committees in Ancient Olympia, the birthplace of the Olympics. Dignitaries from around the world gave presentations on issues vital to “Olympism,” with a special focus on youth and the future of Olympism in an ever-changing world.

The seven presentations at the conference spanned some 3,000 years of human existence, from medicine in the ancient Olympic Games to reaching youth with the message of Olympism in today’s Information Age. A common theme was the role of education in spreading the Olympic ideals and values of fair play, respect, meritocracy and peace.

Students at the IOA use _The Sport Journal_ as reference material for their work more than any other journal. With more than 500,000 unique visitors per year, _The Sport Journal_ is the most read sport journal in the world.

These pieces are based on live presentations by experts from a variety of academic disciplines. Because they are being introduced here to a wide, general audience, _The Sport Journal_ has relaxed its standard rule requiring entries to adhere to American Psychological Association style.

We hope you enjoy reading the second annual Olympic Edition and learn valuable insights from the presentations by the Olympic scholars at the May 2011 session in Greece.

Duwayne Escobedo
_The Sport Journal_ Editor
United States Sports Academy


Olympic Edition 2010

### International Olympic Academy: 10th Joint International Session for Presidents or Directors of National Olympic Academies and Officials of National Olympic Committees

#### Table of Contents

1. [President’s Foreword – Dr. Thomas P. Rosandich](#forward)
2. [Introduction to the International Olympic Academy – Anne Kent Rush, Editor](/article/introduction-international-olympic-academy)
3. [National Olympic Academies – National Olympic Committees Parallel Paths, Intertwined Paths – Mr. Isidoros Kouvelos](/article/national-olympic-academies-national-olympic-committees-parallel-paths-intertwined-paths)
4. [The National Olympic Committee: Its Role and Position at the Dawn of the 21st Century – Mr. Giannis Papadogiannakis](/article/national-olympic-committee-its-role-and-position-dawn-21st-century)
5. [The Place and Role of Olympism in Higher Education – Prof. Dr. Antonin Rychtecky](/article/place-and-role-olympism-higher-education)
6. [The Institutional Framework for the Development of Olympic Education and the Role of the National Olympic Academy – Mr. Alexandre Mestre](/article/institutional-framework-development-olympic-education-and-role-national-olympic-academy)
7. [How to Spread and Develop Joint International Programs about Olympic Education: Cultural and Communication Problems – Mr. Henry Tandau](/article/how-spread-and-develop-joint-international-programs-about-olympic-education-cultural-and-com)
8. [The Position of the Athlete in the Social Structure of Ancient Greece – Prof. Mark Golden](/article/position-athlete-social-structure-ancient-greece)
9. [The Use of Sport Art for the Development of Olympic Education: Passing the Visual Torch – Dr. Thomas P. Rosandich](/article/use-sport-art-development-olympic-education-passing-visual-torch)
10. [Closing Address and Olympic Anthem – Mr. Isidoros Kouvelos](http://thesportjournal.org/article/closing-address)
11. [IOA Master’s Degree Program Specifications](/article/international-olympic-academy-masters-degree-program-specifications)
12. [Olympic Values Education Programme (OVEP) Progress Report: 2005-2010](/article/olympic-values-education-programme-ovep-progress-report-2005-2010)


#### President’s Foreword

This special issue of the Academy’s _Sport Journal_ is dedicated to the International Olympic Academy (IOA) and its worldwide programs.

This past May, I delivered a presentation in Greece at the International Olympic Academy for the 10th Joint International Session for Presidents or Directors of National Olympic Academies and Officials of National Olympic Committees. My presentation topic was the use of Olympic posters as a reflection of the role of sport art in Olympic culture. Dignitaries from around the world gave presentations on the issues vital to education in Olympism. Special emphasis was placed on challenges in collaboration among the National Olympic Academies, the National Olympic Committees, and the IOA.

Located in historic Olympia, south of Athens on the Peloponnese Peninsula, the IOA functions as an international Academic Centre for Olympic Studies and is an exceptional new resource for students around the globe. Operated jointly by the International Olympic Committee (IOC) and the Greek government, the IOA offers a wide variety of research studies and educational programs aimed at spreading the vision of Olympism.

As President, founder and CEO of the United States Sports Academy, and as a member of the International Olympic Committee’s Commission on Culture and Education, I share the IOA’s vision of Olympism. During my visit, I met with Isidoros Kouvelos, President of the IOA; with Professor Konstantinos Georgiadis, IOA Honorary Dean; and with Professor Dionyssis Gangas, IOA Director. We discussed the IOA, its projects, and the impressive new master’s degree program: Olympic Studies, Olympic Education, and Organization and Management of Olympic Events.

Students at the IOA use _The Sport Journal_ as reference material for their work more than any other journal. With more than 500,000 unique visitors per year, _The Sport Journal_ is the most read sport journal in the world.

Because these pieces are based on live presentations by experts from a variety of academic disciplines and are introduced here to a wide, general audience, _The Sport Journal_ has relaxed its standard rule requiring entries to adhere to American Psychological Association style. This _Olympic Edition_ offers the presentations given by Olympic scholars at the May 2010 session in Greece.

Dr. Thomas P. Rosandich
President and CEO
United States Sports Academy


Do static-sport athletes and dynamic-sport athletes differ in their visual focused attention?

### Abstract

The goal of this study was to evaluate current attention tests in sport psychology for their practical use in applied sport psychology. Current findings from the literature suggest that measures of visual focused attention may show different performances depending on sport type and test conditions (33). We predicted differences between static- and dynamic-sport athletes (17) when visual focused attention is tested with random (unstructured) versus fixed (structured) visual search in two experimental conditions (quiet environment versus auditory distraction). We analyzed 130 nationally competing athletes from different sports using two measures of visual focused attention: the structured d2 test and the unstructured concentration grid task. Compared to static-sport athletes, dynamic-sport athletes had better visual search scores in the concentration grid task in the condition with auditory distraction. These findings suggest that the results of attention tests should be differentially interpreted if different sport types and different test conditions are considered.

**Key words:** d2 test, concentration grid task, auditory distraction

### Introduction

The study reported here was motivated by recent calls within the applied field of sport psychology for a broad diagnostic framework in the domain of talent selection (7,35), as well as by the ongoing evaluation of the techniques used by practicing sport psychologists against professional standards (14).

An increasing number of researchers have argued that psychological variables often remain unnoticed within talent identification models (1). However, among a range of other physical and technical variables, psychological variables have been identified as significant predictors of success (18,27,34). For instance, attention is seen as one of the most important psychological skills underlying athletic success because the ability to exert mental effort effectively is vital for optimal athletic performance (12,22,27).

In cognitive psychology, attention is seen as a multidimensional construct. According to different taxonomies of attention, at least three distinct dimensions of attention have been identified (21,28,39). The first is _selectivity_. It includes selective attention as well as divided attention. The second dimension of attention refers to the aspect of _intensity_, which can include alertness and sustained attention. The third dimension is _capacity_ and refers to the fact that controlled processing is limited in the amount of information that can be processed at one time.

Individuals’ attentional performance in one or more of the aforementioned dimensions can be assessed in several ways (3, for an overview see 39). The selectivity aspect can, for instance, be approached with tasks involving either focused or divided attention. In focused attention tasks there are usually irrelevant stimuli, which must be ignored. In divided attention tasks, all stimuli are relevant, but may come from different sources and require different responses (39). Intensity requirements can be approached with tasks involving different degrees of difficulty, or with tasks that have to be carried out over longer periods of time. Finally, dual-task procedures, memory span tests, or other processing tasks are used to approach the capacity aspect (26). Practicing sport psychologists most often use standardized tests, which are easily administered in a paper-pencil form and therefore are easy to use in the field.

However, several authors (38) as well as diagnosticians in youth talent diagnostic centers in Germany have expressed a number of subjective impressions concerning the performance of athletes on attention tests (e.g., the influence of sport type, test context, or expertise level) that are insufficiently reflected in the existing test norms. Therefore, the goal of the present study was to examine the influence of two essential factors (sport type and environmental context) on athletes’ performance in two different attention tests.

Boutcher’s multilevel approach (3) integrates relevant aspects of research and theory on attention from different perspectives. In his framework, internal as well as external factors, such as enduring dispositions, demands of the task, and environmental factors, interact with attentional processes during performance. These factors are thought to initially influence the level of physiological arousal of the individual, which in turn influences controlled and automatic processing. When performing a task, the individual uses controlled processing, automatic processing, or both, depending on the nature and the demands of the task. An optimal attentional state is achieved by attaining the exact balance between automatic and controlled processing that a particular task requires (3).

A sudden external distraction (e.g., auditory noise) is expected to hamper performance because it may disrupt the current attentional state by pushing the individual to a level of arousal at which an imbalance between controlled and automatic processing occurs. However, individual differences may exist regarding the effect of internal or external distractions on attentional state. For instance, a gymnast normally performs his or her competition routine in a quiet environment, whereas during a basketball game the player is confronted with auditory noise. Unexpected auditory distractions may disrupt the attentional state of the gymnast but not that of the basketball player, who is accustomed to them.

There has been extensive research on different aspects of attentional performance in athletes. For instance, researchers have examined attentional differences between athletes and non-athletes (5,20,23), between athletes at different expertise levels (8), as well as with regard to other factors, such as athlete type, sport type and gender (17,19,24,33), by using a variety of attentional tasks. Athletes are able to distribute their attention more effectively over multiple locations and are better able to switch their attention rapidly among locations than non-athletes (25). Furthermore, attentional performance seems to vary with the kind and amount of training provided by a sports environment, so that athletes trained in more visually dynamic sports show better attentional control than athletes trained in less visually dynamic sports (24).

When using specific tests to assess attention performance, one should expect differences in test performance between athletes who vary in one or more of the aforementioned factors. In this context, Lum et al. highlight the need to examine athletes’ visual attention by using a variety of visual attention tasks (17, see also 20). Furthermore, existing test norms should account for the aforementioned differences to provide athletes with reliable feedback on their individual attention performance.

For instance, to evaluate the visual focused attention performance of athletes, two common tests are used in the field of applied sport psychology: the d2 test and the concentration grid test (3,4). Visual focused attention is usually operationalized as visual search, so that target stimuli have to be found in a field of distractor stimuli (39). For instance, in the d2 test, participants need to select “d” letters with two dashes above or below them in an array of “d” and “p” letters with zero, one, or two dashes over or under each letter. The structure of reading letters from left to right provides an environment in which relevant stimuli need to be selected and irrelevant stimuli need to be ignored. The gaze moves through the visual array not randomly but in a structured fashion. In contrast, in the concentration grid task, participants see a block of randomly distributed numbers, in which they need to search for numbers in sequence, such as number 01, then 02, 03, and so on. The concentration grid task is often administered as a training exercise in the field of applied sport psychology, and it has been proposed that it works by developing the athlete’s ability to scan a visual array for relevant information and to ignore irrelevant stimuli (11).
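
To make the two stimulus structures concrete, the following is a minimal Python sketch of d2-style material as described above. The encoding of each item as a letter plus a dash count is an illustrative assumption, not the published test form.

```python
import random

def make_d2_line(n_items=47, seed=None):
    """Generate one hypothetical d2-style line as (letter, dash_count) tuples."""
    rng = random.Random(seed)
    line = [(rng.choice("dp"), rng.randint(0, 2)) for _ in range(n_items)]
    # Targets are 'd' letters carrying exactly two dashes; all other items are distractors.
    targets = [i for i, (letter, dashes) in enumerate(line) if letter == "d" and dashes == 2]
    return line, targets

line, targets = make_d2_line(seed=1)
print(f"{len(targets)} targets among {len(line)} items")
```

Reading such a line from left to right corresponds to the fixed (structured) search of the d2 test; the concentration grid, by contrast, requires an unstructured scan of the whole array.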

Given the different demands of these two tasks and the empirical evidence so far, one may speculate that athletes who have experience performing visual searches for relevant cues and making decisions in dynamic environments (which is typical for team sport athletes), will do better on the concentration grid test than on the d2 test (29). Athletes from individual sports who are exposed to a mostly static environment with one or a small number of stimuli should do better on the d2 test than on the concentration grid test.

Maxeiner, for instance, compared 30 gymnasts and 30 tennis players in their performance on the d2 test and on a reaction time task in which they were asked to press a pedal with their foot as soon as a square appeared on a computer monitor (19). Participants were tested under either a single-task condition, such that only the d2 test or the reaction time task had to be performed, or a multiple-task condition, in which both the d2 test and the reaction time task had to be carried out simultaneously. Reaction times showed a significantly stronger increase under the multiple-task condition for the gymnasts (about 28%), whereas no differences between gymnasts and tennis players were found for single-task conditions. The author concluded from this result that tennis players have a better ability to distribute attention than gymnasts. However, the total number of items worked on the d2 test as well as the error rates did not differ between gymnasts and tennis players in either the single-task or multiple-task condition.

Tenenbaum, Benedick, and Bar-Eli conducted a similar study and found opposing results (33). The authors compared 252 young athletes from different sports disciplines in their d2-test performance. All athletes performed the d2 test in a quiet classroom with no distractions. Results indicate that the number of d’s the subjects crossed out (quantitative capacity) differed significantly by type of sport in females. High quantitative capacity scores in the d2 test were found for female athletes from sports such as tennis or volleyball, but not for female athletes from gymnastics. A similar pattern of results was found in male athletes, although only as a trend toward significance (p = .06). The authors found an additional effect for type of sport on error rate. The largest error rates were found in tennis and volleyball players, whereas the smallest error rates were found in track and field athletes. The authors concluded that concentration is individual and sport-type dependent and stated that “Concentration should be further investigated with relation to motor performance” (p. 311).

Maxeiner and Tenenbaum et al. found opposing results in athletes from different sport domains in the d2 test (19,33). First, the authors assessed different parameters of the d2 test. Maxeiner quantified the total number of items worked on the d2 test, whereas Tenenbaum et al. quantified the number of d’s the subjects crossed out. The number of items worked on the d2 test is a reliable criterion for working speed (4), whereas the number of crossed d’s is related to both working speed and working accuracy. Assessing different parameters in the d2 test could lead to different results, thereby masking possible differences between participants from different sport domains. Following the suggestions of Brickenkamp, the practitioner should assess the concentration-performance score (the number of marked d’s minus the number of signs incorrectly marked) in the first instance, because this value is resistant to tampering, such that neither the skipping of test parts nor the random marking of items increases the value (4).

Furthermore, Tenenbaum et al. had participants from tennis, fencing, volleyball, team handball, track and field, and gymnastics, indicating an unequal distribution of participants with regard to other criteria, such as the kind of training provided by a sports environment (33). As mentioned above, attentional performance seems to vary with the kind and amount of training provided by a sports environment (24); the question thus arises whether athletes should be classified according to the kind of training provided by their sports environment, rather than by sport discipline per se, when assessing their attentional performance.

Greenlees, Thelwell, and Holder examined the performance of 28 male collegiate soccer players in the concentration grid exercise (13,15). The players were assigned to either a 9-week concentration grid training or a control condition. During three test sessions the athletes were asked to complete a battery of concentration tasks, including the aforementioned concentration grid test. The results showed a significant main effect for training condition but not for test session, indicating that the concentration training group was superior to the control group but did not exhibit any improvement during the 9-week training interval. However, Greenlees et al. assessed only soccer players with a playing experience of 10.45 ± 2.31 years, which indicates that they already possessed substantial experience in performing visual searches for relevant cues in dynamic environments (13). This could at least in part explain why the participants of the concentration training group did not improve their performance on the concentration grid task as compared to the participants of the control group. Additionally, the two groups were not homogeneous in their concentration grid performance at the study onset, which may in part explain the main effect for training condition. The findings of Greenlees et al. highlight the need for further research on the concentration grid test, especially examining the extent to which the task reflects sport-specific concentration skills, and therefore support the need for ongoing evaluation of this technique in diagnostics and intervention.

Taken together, we can identify two main factors that need to be considered when assessing athletes’ visual focused attention. First, attention tests should be applied broadly and should be sensitive to the athlete’s experience in different types of sports. This means, in particular, recognizing that different sport environments (static vs. dynamic) encourage different visual search and decision strategies (fixed or structured vs. random or unstructured) and that the same test does not necessarily capture both types of strategies. Second, the environmental context (with or without distraction) can increase or decrease performance.

We adapted the dichotomy of Lum et al. and hypothesized that static-sport athletes and dynamic-sport athletes would not differ in d2 scores but would differ in concentration grid scores due to their different perceptual experiences (17). This finding would not only help to clarify previous results (19,33) but would extend them to different concentration tasks (d2 test vs. concentration grid) following the conclusions of Greenlees et al. as well as Tenenbaum et al. (13,33). We furthermore hypothesized that auditory distraction would have a detrimental effect on performance in both the d2 test and the concentration grid test because it may disrupt the current attentional state (3). We therefore compared performances in the d2 test and the concentration grid test with and without auditory distraction.

### Method

#### Participants

A sample of 130 athletes (students of Sport Science, German Sport University) was recruited to participate in the study (n = 44 women, mean age = 22 years, and n = 86 men, mean age = 22 years). Ages ranged from 19 to 33 years, with a mean age of 22 years (SD = 2.4 years). Of these, 66 students (n = 15 women and n = 51 men) competed in 6 different sports with a dynamic visual environment (e.g., soccer, volleyball) and 64 (n = 29 women and n = 35 men) competed in another 6 different sports with a mostly static visual environment (e.g., track and field athletics, gymnastics). All students had been performing their sport for at least 7 years, with 19.2% (n = 25) of them reporting national experience (German championships or national league) and 11.5% (n = 15) also reporting international experience. All participants were informed about the purpose and the procedures of the study and gave their written consent prior to the experiment. Participants reported having no prior experience with either the d2 test or the concentration grid test.

We recruited an additional sample of n = 25 students of sport science in order to evaluate the reliability of the d2 test and the concentration grid test and to estimate the validity of the concentration grid test. This was necessary because, first, we applied modified versions of the original tests and second, there were no reliability or validity statistics available in the current literature for the concentration grid test.

#### Tasks and Apparatus

##### d2 Test of Visual Focused Attention

The d2 test was used to assess visual focused attention (4,39). It is seen as a reliable and valid instrument, most commonly being used in the fields of cognitive, clinical, and sport psychology. In the standardized version of this task, 14 lines consisting of 47 letters each are presented to the participant. The letters can be a “p” or a “d” with zero, one, or two small dashes above or below it. The task is to process all items (letters) of a line in a sequential order and to mark every “d” with two dashes above or below. All other letters are to be left unmarked.

The visual search pattern in the d2 test is guided by the structure of the stimulus field (fixed visual search). To avoid ceiling effects, there is a temporal restriction of 15 seconds to process each line. After 15 seconds there is a verbal instruction to proceed to the next line. Norms are available for age groups between 9 and 60 years. Reliability coefficients of the test range from r = .84 to r = .98 (4).

In the present study, 7 lines of the d2 test were completed under each experimental condition, with each line consisting of 47 letters. This test reduction was applied for practical reasons, particularly to match the working time of the concentration grid task. Prior to the study, we analyzed d2-test results of 7 lines (Version A) and 14 lines (Version B) in a test–retest design with a temporal delay of 1 week. The results indicate a significant product–moment correlation between the two versions of the test in a sample of 25 students of sport science (r = .80; p < .05). We therefore considered the use of 7 instead of 14 lines adequate for the purposes of this study. From the performance of each participant in the d2 test, two parameters were obtained: a concentration-performance score and the error rate. The concentration-performance score is the number of d letters the subject marked minus the number of signs (dashes) incorrectly marked. The error rate is the number of signs incorrectly marked plus the number of correct signs missed.
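
The two parameters can be read as a small scoring routine; the item representation below is a hypothetical encoding for illustration only, but the arithmetic follows the definitions just given.

```python
def d2_scores(items):
    """Score a hypothetical list of d2 items.

    Each item is a dict with boolean keys:
      'is_target' -- a 'd' carrying exactly two dashes
      'marked'    -- crossed out by the participant
    """
    hits = sum(1 for it in items if it["is_target"] and it["marked"])             # d's correctly marked
    commissions = sum(1 for it in items if not it["is_target"] and it["marked"])  # signs incorrectly marked
    omissions = sum(1 for it in items if it["is_target"] and not it["marked"])    # correct signs missed
    concentration_performance = hits - commissions
    error_rate = commissions + omissions
    return concentration_performance, error_rate

example = [{"is_target": True, "marked": True},
           {"is_target": False, "marked": True},
           {"is_target": True, "marked": False}]
print(d2_scores(example))  # (0, 2): one hit minus one commission; one commission plus one omission
```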

##### Concentration Grid Task

Two versions of the concentration grid test were used as a second measure of visual focused attention and, in particular, of visual search (15,21). They were modified from the concentration grid exercise described by Harris and Harris (15). The first version (CG1) used in this study consisted of 7 horizontal and 7 vertical squares arranged in a grid of 49 squares altogether. A unique two-digit number (from 00 to 49) was placed randomly in the center of each square. The second version (CG2) of the concentration grid was identical to the first except for a different placement of the numbers. To ensure comparability, the relative distance from each number to the following number was the same in the two grids. We also examined the reliability of the concentration grid task. In a test–retest design with a temporal delay of 1 week, a significant product-moment correlation of r = .79 (p < .05) was found in a sample of 25 students of sport science.
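
A minimal sketch of such a grid follows. The text specifies 49 squares and numbers “from 00 to 49”; because only 49 cells are available, the sketch simply fills them with 00–48, which is an assumption on our part.

```python
import random

def make_concentration_grid(size=7, seed=None):
    """Build a size x size grid of unique, randomly placed two-digit numbers."""
    rng = random.Random(seed)
    numbers = [f"{n:02d}" for n in range(size * size)]  # 49 cells filled with 00..48 (assumption)
    rng.shuffle(numbers)
    return [numbers[row * size:(row + 1) * size] for row in range(size)]

for row in make_concentration_grid(seed=2):
    print(" ".join(row))
```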

In the concentration grid task the participants were instructed to mark as many consecutive numbers (starting from 00) as possible within a 1-min period under each experimental condition. The resultant number of correctly processed items was used for further data analysis. In comparison to the d2 test, the participants’ visual search pattern in the concentration grid is not entirely guided by the structure of the stimulus field; instead, the participant is advised to scan the grid (random visual search). We calculated the product-moment correlation between the concentration grid scores and the d2 test results in the aforementioned sample of 25 students of sport science to estimate the construct validity of the concentration grid. The analysis revealed a non-significant product-moment correlation of r = .10 (p = .62), indicating that the concentration grid test captures a different aspect of visual focused attention than the d2 test.

#### Procedures

A trained research assistant introduced the experimental tasks to each individually tested participant. The participant was given a practice trial of 20 seconds for the concentration grid exercise (an altered version of the original CG1) and a practice trial of two lines for the d2 test to become familiarized with the two experimental tasks. The participant had to perform each of the two tasks under two different experimental conditions, that is, in different environmental contexts (for a total of four experimental phases: the d2 test and the concentration grid task under normal and auditory distraction conditions, respectively). In one condition no sensory distractions were present; the participant completed the tasks in the quiet laboratory environment. In the other condition an auditory distraction was present; the participant wore headphones that enclosed the whole ear. A mixture of distracting, sport-specific environmental sounds was played back at 90 dB. We used ambient sound recordings of the audience and the players from the last 3 minutes of two first-division basketball matches in which both teams played head to head until the end of the match. We compiled the sound recordings to fit the two 1-min periods of the auditory distraction condition (d2 test and concentration grid task) in such a way that the played-back recording comprised the audience’s and the players’ sounds of three offense and three defense situations. In all tasks the participant sat at a worktable with a head–table distance of 40 cm. The test order was counterbalanced across participants, and the experimental tasks required approximately 20 minutes to complete.
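
The study states only that the test order was counterbalanced across participants. One possible implementation, shown here as a sketch rather than the authors’ actual scheme, rotates the 24 possible orders of the four phases across participant IDs.

```python
from itertools import permutations

# Four experimental phases: two tasks crossed with two environmental contexts.
PHASES = ["d2 / quiet", "d2 / distraction", "grid / quiet", "grid / distraction"]
ORDERS = list(permutations(PHASES))  # all 24 possible phase orders

def order_for(participant_id: int):
    """Assign phase orders in rotation across participants."""
    return ORDERS[participant_id % len(ORDERS)]

print(order_for(0))
print(order_for(1))
```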

### Results

A significance criterion of α = .05 was established for all results reported (9). Prior to testing the main hypothesis, moderating effects of age, sex, and experimental sequence were assessed. We conducted separate analyses of variance on the dependent variables, first, with sex as a categorical factor (male versus female), second, with age as a continuous predictor, and third, with experimental sequence as a categorical predictor (auditory distraction following no distraction versus no distraction following auditory distraction). There were no significant effects of sex, age, or experimental sequence on any of the dependent variables (all p > .05).

A correlation analysis indicated that there was no significant product–moment correlation between the concentration-performance score of the d2 test and the number of correctly processed items in the concentration grid task (r = -.01; p = .68), nor between the concentration-performance score and the error rate in the d2 test (r = -.02; p = .47). To assess differences in the dependent variables, we conducted 2 × 2 (Environmental Context × Sport Type) univariate analyses of variance (ANOVAs) with condition being the repeated measure. Post hoc analyses were carried out using the Tukey HSD post hoc test. Cohen’s f was calculated as an effect size for all analyzed F values higher than 1 (6). Additionally, we conducted single sample t-tests to compare our study sample to the age matched normative sample. This was done for each participant’s d2 test performance (concentration-performance score and error rates) but not for the concentration grid task, because norms were available only for the d2 test. Cohen’s d was calculated as an effect size for all analyzed t values higher than 1.
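
The effect sizes reported below can be recovered from the test statistics with standard conversion formulas; the sketch uses one common convention (Cohen’s f via partial eta squared, Cohen’s d as t divided by the square root of the sample size), which we assume matches the authors’ computation because it reproduces the reported values to rounding.

```python
import math

def cohens_f_from_F(F, df_effect, df_error):
    """Cohen's f via partial eta squared: eta_p^2 = F * df1 / (F * df1 + df2)."""
    eta_p2 = (F * df_effect) / (F * df_effect + df_error)
    return math.sqrt(eta_p2 / (1 - eta_p2))

def cohens_d_single_sample(t, n):
    """Cohen's d for a single-sample t-test: d = |t| / sqrt(n)."""
    return abs(t) / math.sqrt(n)

# Two values reported later in the Results:
print(round(cohens_f_from_F(66.02, 1, 128), 2))    # 0.72 (environmental context, d2 test)
print(round(cohens_d_single_sample(1.81, 66), 2))  # 0.22 (dynamic-sport athletes, normal condition)
```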

#### d2 Test of Visual Focused Attention

Descriptive statistics for the concentration-performance scores and the error rate of the d2 test are shown in Table 1. First, we assumed that d2 scores would not differ between the two groups reflecting static-sport athletes and dynamic-sport athletes. A 2 × 2 (Sport Type × Environmental Context) ANOVA with repeated measures on the second factor was conducted, taking the concentration-performance score as the dependent variable. The results showed that the two groups did not differ in their concentration-performance scores, F(1, 128) = .004, p = .94, achieved power = .94. Our second assumption was that auditory distraction would have a detrimental effect on concentration performance. To our surprise, the ANOVA revealed a significant main effect for environmental context, F(1, 128) = 66.02, p < .05, Cohen’s f = 0.72, reflecting higher concentration-performance scores for the auditory distraction condition for both dynamic-sport and static-sport athletes (see Table 1). The effect size indicates a large effect (6). Furthermore there was no significant interaction effect for Sport Type × Environmental Context, F(1, 128) = .01, p = .76, achieved power = .98.

To determine if participants from our study sample differed from the general population in concentration performance, we calculated single sample t-tests. The results show that in the normal condition, neither static-sport athletes, t(63) = 1.56, p = .12, Cohen’s d = 0.19, nor dynamic-sport athletes, t(65) = 1.81, p = .07, Cohen’s d = 0.22, differed in their concentration performance from the normative sample’s mean. However, in the auditory distraction condition both groups differed significantly from the normative sample’s mean (static-sport athletes, t(63) = 3.17, p = .002, Cohen’s d = 0.39; dynamic-sport athletes, t(65) = 3.37, p = .001, Cohen’s d = 0.42).

Second, a 2 × 2 (Sport Type × Environmental Context) ANOVA with repeated measures on the second factor was conducted, taking the error rate in the d2 test as the dependent variable. There was no significant main effect for either sport type, F(1, 128) = 3.71, p = .06, Cohen’s f = 0.17, achieved power = .61, or environmental context, F(1, 128) = 1.50, p = .22, Cohen’s f = 0.11, achieved power = .75. In addition, the interaction effect Sport Type × Environmental Context showed no statistical significance, F(1, 128) = 2.02, p = .16, Cohen’s f = 0.13, achieved power = .95. Dynamic-sport athletes did not make more mistakes on the d2 test than static-sport athletes in either the normal or the auditory distraction condition.

To determine if participants from our study sample differed from the general population in error rate, we calculated single sample t-tests. The results show that in the normal condition, dynamic-sport athletes, t(65) = -2.88, p = .005, Cohen’s d = 0.35, but not static-sport athletes, t(63) = -1.41, p = .16, Cohen’s d = 0.17, made on average fewer mistakes than the participants from the normative sample. The same pattern of results was found for participant’s error rates in the auditory distraction condition (static-sport athletes, t(63) = 0.36, p = .71, Cohen’s d = 0.05; dynamic-sport athletes, t(65) = -3.17, p = .002, Cohen’s d = 0.39).

#### Concentration Grid Task

We assumed that concentration grid scores would differ between the two groups of static-sport athletes and dynamic-sport athletes. The second assumption was that auditory distraction would have a detrimental effect on concentration performance. A 2 × 2 (Sport Type × Environmental Context) ANOVA with repeated measures on the second factor was conducted, taking the concentration grid score as the dependent variable. The ANOVA revealed no significant main effects for either sport type, F(1, 128) = 1.40, p = .24, Cohen’s f = 0.11, or environmental context, F(1, 128) = 0.27, p = .60. Given a test power greater than .90, we consider these two null findings reliable. To our surprise, the interaction effect Environmental Context × Sport Type showed statistical significance, F(1, 128) = 4.54, p = .04, Cohen’s f = 0.19. Post hoc analysis revealed that participants in the dynamic-sport group scored higher in the concentration grid task under the auditory distraction condition, whereas participants in the static-sport group scored lower under the auditory distraction condition, compared to the normal condition (see Figure 1).

### Discussion

The goal of this study was to evaluate two attention tests used in sport psychology in terms of their application to athletes trained in more visually dynamic sports compared to athletes trained in less visually dynamic sports, with regard to different environmental contexts. Visual focused attention was examined with random (concentration grid task) versus fixed (d2 test) visual search in a quiet environment and under auditory distraction (4,15).

The results extend current findings on the attention performance of athletes with regard to sport type, environmental context, and task dependency. Dynamic-sport athletes did not differ in their concentration performance from static-sport athletes in either the d2 test or the concentration grid task under quiet laboratory conditions. This result confirms our first hypothesis with regard to the d2 test and supports the findings of Maxeiner (19). We assume that the different perceptual experience of dynamic-sport athletes does not account for their visual search performance in the d2 test. On the one hand, this implies a fairly stable underlying ability to focus attention in simple tasks when a fixed (structured) visual search is a constraint of the task. On the other hand, it can be speculated that attention abilities manifest themselves in a sport-specific way on a more strategic level, when basic (attention) abilities are integrated into different skills that are not assessed by the d2 test.

Our second hypothesis was that auditory distraction would have a detrimental effect on attention performance in both the d2 test and the concentration grid task. To our surprise the results of the d2 test indicate higher concentration performance scores for the auditory distraction condition for dynamic-sport athletes as well as static-sport athletes. The scores were not only higher when compared between both experimental conditions but also when compared with the corresponding normative sample of the d2 test. This finding supports the assumptions of Tenenbaum et al. and Wilson, Peper, and Schmid, that visual search performance in unstructured contexts is task dependent, especially under auditory distraction conditions (33,38).

From the viewpoint of Boutcher’s multilevel approach to attention, it seems possible that in the auditory distraction condition the participants’ attentional states were optimized (3). This optimization helped the participants achieve higher scores in the relatively simple d2 test, regardless of their sport type. However, whether the supposed optimization was due to changes in arousal level, changes in controlled or automatic processing, or both, cannot be concluded from our results. In addition, the results of the concentration grid task (where an unstructured visual search is an inherent component of the task) show that participants in the dynamic-sport group scored higher in the auditory distraction condition in comparison to the participants in the static-sport group. Changes in arousal level and therefore in attentional state are known to influence visual control (16,32). It is plausible that an increased number and/or amplitude of saccades when scanning the concentration grid can lead to missing the actual target or finding it later than under normal conditions. This could explain the decrease in performance in the concentration grid task for static-sport athletes, because they are normally not trained to deal with such a situation in their sport. To further examine gaze behavior when performing different attention tests, eye-tracking methodology should be integrated into the experimental design.

The increase of the concentration grid scores of the dynamic-sport athletes in the auditory distraction condition could also be explained by differences in information processing. Dynamic-sport athletes seem to be able to allocate their attention capacity to more crucial aspects of the task (37). When scanning the concentration grid they could, for instance, pre-cue remaining numbers in specific areas of the grid in advance, in order to find these numbers faster at a later point in time. However, this aspect is open for further investigation. We assume that dynamic-sport athletes benefit from their sport-specific perceptual experience especially in the concentration grid task under auditory distraction conditions.

We are aware of some critical issues in our design that need to be taken into account in further experiments, and want to highlight three specific aspects. First, the differentiation of dynamic- versus static-sport athletes could be more closely specified. This could be done by examining athletes from different sport disciplines that have different sport-specific structures (e.g., coactive vs. interactive sports). One can, for instance, hypothesize that athletes in coactive sports such as bowling or rowing may differ in their attention ability from athletes of interactive sports such as basketball or soccer due to different task demands. Subsequent analyses could also focus on different team positions, especially in interactive sports. For instance, it is likely that a goalkeeper differs in concentration ability from a playmaker (30,31).

Second, the type of distraction could be more differentiated. Athletes have to deal with different distractions in competition such as comments from the coach and other athletes, or different forms of either expected or unexpected noise. These distractions could have different effects on attention performance. One could, for example, examine the impact on attention performance of different distractions with different structures, such as visual versus auditory distraction with a sport-specific structure versus no structure. One can hypothesize that structured distractions of a sport-specific nature would have no impact on concentration performance at all, because athletes are normally habituated to such distractions. In our study we speculated that the impact of the auditory distraction on the attentional state of the athletes would be to enhance their performance in the d2 test. To control this aspect, measurements of arousal level (e.g., heart rate or galvanic skin response) should be integrated into further studies.

Third, we adopted the concentration grid test as a measure of visual focused attention, because visual focused attention is usually operationalized as visual search (39). Research suggests a close link between working memory capacities and the selectivity dimension of attention (10). We acknowledge that when performing the concentration grid test, a participant could potentially optimize his or her visual search by selectively memorizing the position of stimuli that have to be found after preceding stimuli have been marked. However, participants were not instructed to memorize the position of the stimuli but rather to actively scan the grid and mark as many consecutive numbers (starting from 00) as possible within a 1-min period. Subsequent studies could compare participants’ performance in working memory tests (10), as well as in other tests of visual attention (39), with their concentration grid test scores to evaluate whether the concentration grid is more a measure of visual focused attention or of working memory.

### Conclusions

The findings of the current study suggest that the results of attention tests should be differentially interpreted if different sport types and different test conditions are considered in the field of applied sport psychology or applied sport science. Their predictive power for sport-specific attention skills, however, may only be seen with regard to different factors such as sport type, environmental context, and task.

### Applications in Sport

There are some practical consequences and implications of this study. First, non-specific concentration tests only seem to be able to differentiate between athletes from more visually dynamic sports and athletes from more visually static sports when they mimic a sport-specific environmental context together with sport-specific demands of the task. Therefore, one may need more specific tests for specific sports to diagnose not only fundamental aspects of attention, but also attention abilities on a more strategic level (2). These tests should then be integrated into a systematic talent diagnosis with test norms for specific sports (7). In talent diagnostics, however, psychological variables often remain unnoticed (1), even though they have been identified as significant predictors of success (27). They could serve as an intrapersonal catalyst in the developmental process of talented youngsters (35). However, their impact on performance may change throughout the developmental process of the individual. When administering attention tests, this development needs to be taken into account. It is, for instance, questionable whether young gymnasts can be compared to young soccer players in their ability to focus attention, because of the different attentional demands of the two sports. Second, it would be very useful to conduct longitudinal studies or to combine the analysis of test performance with the analysis of performance criteria (33). A final issue that should be addressed is the impact of specific interventions on attention performance, especially if attention training is used that is similar in structure to the concentration test itself (13,38).

### Acknowledgments

The author thanks Mr. Konstantinos Velentzas for assistance with data collection and Mrs. Lisa Gartz for her critical and helpful comments on the manuscript.

### References

1. Abbott, A., & Collins, D. (2004). Eliminating the dichotomy between theory and practice in talent identification and development: considering the role of psychology. Journal of Sports Sciences, 22(5), 395-408.
2. Abernethy, B. (2001). Attention. In R. N. Singer, H. A. Hausenblas & C. M. Janelle (Eds.), Handbook of Sport Psychology (2nd ed., pp. 53-85). New York: John Wiley & Sons Inc.
3. Boutcher, S. H. (2008). Attentional processes and sport performance. In T. S. Horn (Ed.), Advances in Sport Psychology (3rd ed., pp. 325-338). Champaign, IL: Human Kinetics.
4. Brickenkamp, R. (1994). Test d2. Aufmerksamkeits-Belastungs-Test. Göttingen: Hogrefe.
5. Castiello, U., & Umiltà, C. (1992). Orienting of attention in volleyball players. International Journal of Sport Psychology, 23, 303-310.
6. Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences (2nd ed.). USA: Lawrence Erlbaum Associates.
7. Durand-Bush, N., & Salmela, J. H. (2001). The development of talent in sport. In R. N. Singer, H. A. Hausenblas & C. M. Janelle (Eds.), Handbook of Sport Psychology (2nd ed., pp. 269-289). New York: John Wiley & Sons Inc.
8. Enns, J., & Richards, J. (1997). Visual attentional orienting in developing hockey players. Journal of Experimental Child Psychology, 64, 255-275.
9. Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175-191.
10. Fukuda, K., & Vogel, E. K. (2009). Human variation in overriding attentional capture. The Journal of Neuroscience, 29(27), 8726-8733.
11. Gill, D. L. (2000). Psychological Dynamics of Sport and Exercise (2nd Ed.). Champaign, IL: Human Kinetics.
12. Gould, D., Dieffenbach, K., & Moffett, A. (2002). Psychological characteristics and their development in olympic champions. Journal of Applied Sport Psychology, 14, 172-204.
13. Greenlees, I., Thelwell, R., & Holder, T. (2006). Examining the efficacy of the concentration grid exercise as a concentration enhancement exercise. Psychology of Sport and Exercise, 7, 29-39.
14. Hardy, L., Jones, J. G., & Gould, D. (1996). Understanding psychological preparation for sport: Theory and practice of elite performers. Chichester, UK: Wiley.
15. Harris, D. V., & Harris, B. L. (1984). The Athlete’s Guide to Sport Psychology: Mental Skills for Physical People. New York: Leisure Press.
16. Janelle, C. M. (2002). Anxiety, arousal and visual attention: A mechanistic account of performance variability. Journal of Sports Sciences, 20(3), 237-251.
17. Lum, J., Enns, J., & Pratt, J. (2002). Visual orienting in college athletes: Explorations of athlete type and gender. Research Quarterly for Exercise and Sport, 73(2), 156-167.
18. Mahoney, M. J., Tyler, J. G., & Perkins, T. S. (1987). Psychological skills and exceptional athletic performance. The Sport Psychologist, 1, 181-199.
19. Maxeiner, J. (1987). Concentration and distribution of attention in sport. International Journal of Sport Psychology, 18, 247-255.
20. McAuliffe, J. (2004). Differences in attentional set between athletes and nonathletes. Journal of General Psychology, 131(4), 426-437.
21. Moran, A. P. (1996). The Psychology of Concentration in Sport Performers: A Cognitive Analysis. Hove: Psychology Press.
22. Moran, A. P. (2009). Attention in sport. In S. Mellalieu & S. Hanton (Eds.), Advances in Applied Sport Psychology: A Review (pp. 195-220). London: Routledge.
23. Nougier, V., Ripoll, H., & Stein, J.-F. (1989). Orienting of attention in highly skilled athletes. International Journal of Sport Psychology, 20, 205-223.
24. Nougier, V., Rossi, B., Alain, C., & Taddei, F. (1996). Evidence of strategic effects in the modulation of orienting attention. Ergonomics, 9, 1119-1133.
25. Nougier, V., Stein, J.-F., & Azemar, G. (1992). Covert orienting to central visual cues and sport practice relations in the development of visual attention. Journal of Experimental Child Psychology, 54, 315-333.
26. Oberauer, K., Süß, H.-M., Schulze, R., Wilhelm, O., & Wittmann, W. W. (2000). Working memory capacity — facets of a cognitive ability construct. Personality and Individual Differences, 29(6), 1017-1045.
27. Orlick, T., & Partington, J. (1988). Mental links to excellence. The Sport Psychologist, 2, 105-130.
28. Posner, M. I., & Rothbart, M. K. (2007). Research on attention networks as a model for the integration of psychological science. Annual Review of Psychology, 58, 1-23.
29. Raab, M., & Johnson, J. G. (2007). Expertise-based differences in search and option-generation strategies. Journal of Experimental Psychology: Applied, 13(3), 158-170.
30. Raviv, S., & Nable, N. (1988). Field dependence/independence and concentration as psychological characteristics of basketball players. Perceptual and Motor Skills, 66(3), 831-836.
31. Sharma, V., Khan, H. A., & Butchiramaiah, C. (1986). A comparative study of reaction time and concentration among recreational and competitive volleyball players. SNIPES Journal, 9(4), 40-46.
32. Tenenbaum, G. (2003). Expert athletes: an integrated approach to decision making. In J. L. Starkes & K. A. Ericsson (Eds.), Expert Performance in Sports: Advances in Research on Sport Expertise (pp. 191-218). Champaign: Human Kinetics.
33. Tenenbaum, G., Benedick, A. A., & Bar-Eli, M. (1988). Quantity, consistency, and error-rate of athletes’ mental concentration. International Journal of Sport Psychology, 19, 311-319.
34. Thomas, P. R., & Over, R. (1994). Psychological and psychomotor skills associated with performance in golf. The Sport Psychologist, 8, 73-86.
35. Vaeyens, R., Lenoir, M., Williams, A. M., & Philippaerts, R. M. (2008). Talent identification and development programmes in sport. Current models and future directions. Sports Medicine, 38(9), 703-714.
36. van Zomeren, A. H., & Brouwer, W. H. (1994). Clinical Neuropsychology of Attention. Oxford: Oxford University Press.
37. Weber, N., & Brewer, N. (2003). Expert memory: The interaction of stimulus structure, attention, and expertise. Applied Cognitive Psychology, 17, 295-308.
38. Wilson, V. E., Peper, E., & Schmid, A. (2006). Strategies for training concentration. In J. M. Williams (Ed.), Applied Sport Psychology: Personal Growth to Peak Performance (5th ed., pp. 404-424). New York: McGraw-Hill.
39. van Zomeren, A. H., & Brouwer, W. H. (1994). Clinical Neuropsychology of Attention. Oxford: Oxford University Press.

### Tables and Figures

#### Table 1
Means (M) and standard deviations (SD) for the concentration-performance scores and the error rate of the d2 test with regard to environmental context and sport type (n = 130). The terms static and dynamic refer to the visual environment in which the athletes from different types of sport usually perform.

| | Normal: M | Normal: SD | Auditory distraction: M | Auditory distraction: SD |
|---|---|---|---|---|
| **Concentration-performance score** | | | | |
| Static sports | 137.28* | 69.26 | 153.05*+ | 73.93 |
| Dynamic sports | 138.59* | 66.34 | 153.23*+ | 71.05 |
| **Error rate** | | | | |
| Static sports | 11.21 | 8.79 | 13.26 | 10.72 |
| Dynamic sports | 9.52+ | 9.16 | 9.36+ | 8.72 |

* p < .05 (according to Tukey HSD post hoc test).
+ p < .05 (according to single sample t-test between the study sample and the corresponding normative sample, cf., 4).

#### Figure 1
![Mean concentration grid performance as a function of sport type and environmental context](/files/volume-14/415/figure1.jpg)
Mean concentration grid performance as a function of sport type and environmental context (error bars represent the standard error of the mean; * = significant difference at p < .05 between experimental and control group according to Tukey HSD post hoc analysis).

### Corresponding Author

Dr. Thomas Heinen
German Sport University Cologne
Institute of Psychology
Am Sportpark Müngersdorf 6
50933 Cologne
GERMANY
Tel. +49 221 4982 – 5710
Fax. +49 221 4982 – 8320
Email: <t.heinen@dshs-koeln.de>

### Author’s Affiliation and Position
German Sport University Cologne, Institute of Psychology


Imagery Use and Sport-Related Injury Rehabilitation

Submitted by Matthew L. Symonds1* and Amanda S. Deml2*

1* Associate Professor, Department of Health and Human Services, Northwest Missouri State University

2* Intramural Sports Coordinator, University of Oregon

Amanda Deml is the Intramural Sports Coordinator at the University of Oregon. She earned both her BS and MS Ed degrees from Northwest Missouri State University in Maryville, Missouri. Matthew Symonds is an Associate Professor in the Department of Health and Human Services at Northwest Missouri State University and also serves as Department Chair.

ABSTRACT

This study sought to investigate mental imagery use among college athletes during the rehabilitation process, specifically examining the use of three functions of imagery: motivational, cognitive, and healing. The Athletic Injury Imagery Questionnaire-2 (AIIQ-2) was administered to varsity athletes representing 12 varsity sports at a public, regional, Masters I institution in the Midwestern United States. From the convenience sample, survey respondents included 61 males and 82 females. The study examined imagery use (a) by sport and gender of current varsity athletes at the institution, and (b) between groups of respondents self-reporting as injured or uninjured. Results indicated that motivational imagery was more commonly employed than cognitive and healing imagery in the rehabilitation process. In addition, males used each function of imagery more than females. Furthermore, differences among sports existed concerning cognitive and healing imagery. No significant differences between injured and uninjured athletes in imagery use were found. The results of this study provide insight and additional perspective on imagery use in the rehabilitation process. We recommend that athletes, coaches, and athletic training personnel develop and implement imagery practices to improve athletic performance and the effectiveness of the injury rehabilitation process.

Key words: imagery, injury, rehabilitation



A New Test of the Moneyball Hypothesis

### Abstract

It is our intention to show that Major League Baseball (MLB) general managers, caught in tradition, reward hitters in a manner that does not reflect the relative importance of two measures of producing offense: on-base percentage and slugging percentage. In particular, slugging is overcompensated relative to its contribution to scoring runs. This causes an inefficiency in run production, as runs (and wins) could be produced at a lower cost. We first estimate a team run production model to determine the run production weights of team on-base percentage and team slugging. Next we estimate a player salary model to determine the individual salary weights given to these same two statistics. By tying these two sets of results together we find that slugging is overcompensated relative to on-base percentage, i.e., sluggers are paid more than they are worth in terms of contributing to team runs. These results suggest that, if run production is the objective, teams acquiring talent for their rosters should pay more attention to players with high on-base percentage and less to players with high slugging percentage.

**Key words:** Moneyball, strategy, quantitative analysis, economics

### Introduction

It is our intention to show that Major League Baseball (MLB) general managers did not immediately embrace the new statistical methods for choosing players and strategies revealed in Michael Lewis’s 2003 book _Moneyball_. In particular, we will show that three years after _Moneyball_’s publication, a player’s on-base percentage was still undercompensated relative to slugging in its contribution to scoring runs. This contradicts a study by two economists (3) who claim Moneyball’s innovations were diffused throughout MLB only one season after the book’s publication.

#### Background

In the 2003 publication of _Moneyball_, Michael Lewis (4) describes the journey of a small-market team, the Oakland Athletics, and their unorthodox general manager, Billy Beane. This team was remarkable in its ability to attain high winning percentages in the American League despite the low payroll that comes with the territory of being a small-market team. Lewis followed the team to discover how it managed to utilize its resources more efficiently than any other MLB team. Moneyball practice included the use of statistical analysis for acquiring players and for evaluating strategies in a way that was allegedly not recognized prior to 2003 by baseball players, coaches, managers, and fans. Central to this statistical analysis is determining the relative importance of on-base percentage versus slugging percentage. By buying more of the undervalued input, on-base percentage, Billy Beane could put together a roster of hitters that would lead the team to more wins on the field while still meeting its modest payroll. Although there are many other aspects of Moneyball techniques discussed in the book (e.g., scouting, drafting players, and game strategy), in this paper we focus on whether a team can increase its on-field performance for a given budget by sacrificing some more expensive slugging performance for more, but less expensive, on-base performance. This is what we will call the Moneyball test: efficiency in the use of resources requires the equality of productivity per dollar for on-base percentage versus slugging percentage.

Hakes and Sauer (3) were the first researchers to use regression analysis to demonstrate at the MLB level just what Beane and Lewis had suggested: 1) slugging and on-base percentage (more so than batting average) are extremely predictive in producing wins for a team, and 2) players before the current Moneyball era (beginning around 2003) were not paid in relation to the contribution of these performances. In particular, on-base percentage was underpaid relative to its value. They used four statistics to predict team wins: own-team on-base percentage, opposing-team on-base percentage, own-team slugging percentage, and opposing-team slugging percentage. The regression coefficients for team on-base percentage and slugging percentage assign the weight each factor has in determining team wins. A second regression for player salaries assigns a dollar value to each unit of a hitter’s on-base percentage (OBP) and slugging percentage (SLUG). The following statistics were used in the player salary equation: OBP, SLUG, fielding position, arbitration and free agent status, and years of MLB experience. They estimated salary models for each of the four MLB seasons prior to the release of _Moneyball_, and for the first season after. The regression coefficients of OBP and SLUG assign the weight each factor has on player salary. By comparing the salary costs of OBP versus SLUG with the effect each factor has on wins, the authors determined whether teams were undervaluing OBP relative to SLUG. Their results showed that in the years before the _Moneyball_ book, managers and owners undervalued on-base percentage in comparison to slugging. In other words, a team could improve its winning percentage by trading some SLUG inputs for an equivalent spending on OBP inputs. However, for the year after the publication of the _Moneyball_ book, Hakes and Sauer report that on-base percentage was suddenly no longer under-compensated. A team could no longer exploit the higher win productivity per dollar of OBP because the ratio of win productivity to cost was now the same for both OBP and SLUG. They concluded that this aspect of Moneyball analysis had diffused throughout MLB.

The speed of this diffusion is surprising, and it does raise questions as to their methodology. For example, what if this test of the Moneyball hypothesis is misdirected? Hitters are paid to produce runs, not wins. A mis-specified statistical model can lead to erroneous conclusions. In this paper we propose a more direct test of the Moneyball hypothesis: comparing the run productivity per dollar of cost for both OBP and SLUG factors. In other words, will an equivalent dollar swap for a small increment of slugging percentage in return for a small increment of on-base percentage lead to the same increase in runs scored? If this is not the case, then a team can exploit this difference and score more runs for the same team payroll by acquiring more units of OBP in place of SLUG units. On the other hand, if the ratios are equal, MLB is in equilibrium with respect to the run productivity for the last additional units of OBP and SLUG.

### Methods

This study differs from Hakes and Sauer in three ways: 1) the focus is on run production rather than win production, 2) the designated hitter difference between the National League and the American League will be controlled, and 3) more recent data from the MLB website is used.

#### Team Run Production Model

An MLB general manager should attempt to gain the most effective combination of the on-base and slugging attributes given the amount of money the MLB team is able to spend. This will maximize the team’s run production subject to its budget constraint. The run production model on a team basis will be of the form:

RPS_it = β1 + β2 OBP_it + β3 SLG_it + β4 NL_i + e_it

– RPS_it = **number of runs produced by team i in season t.** This is the total number of runs scored by each team over the 162 games in a season. If fewer than 162 games are played, the number is adjusted to the equivalent of a 162-game season.
– OBP_it = **on-base percentage of team i in season t.** This is found by taking the total number of times the hitters reached base (or hit a home run) on a hit, walk, or hit batsman and dividing this by the number of plate appearances (including walks and hit batsmen) for the season. This proportion is then multiplied by 1,000 in order to make it easier to interpret. For example, a team that reached base 350 times per one thousand plate appearances would have a 350 “on-base percentage.”
– SLG_it = **slugging percentage of team i in season t.** This is the number of bases (single, double, triple, or home run) that a team achieves in a season divided by the number of at bats (excluding walks and hit batsmen). This proportion is multiplied by 1,000 as well. For example, a team that achieved 175 singles, 40 doubles, 5 triples, and 35 home runs per 1,000 at bats would have 410 bases per 1,000 at bats and therefore a 410 “slugging percentage.”
– NL_i = **dummy variable = 1 if team i is in the National League, 0 otherwise.** The American League and National League do not have exactly the same set of game rules. One difference is the American League’s Designated Hitter rule, which allows a non-fielding hitter to bat for the pitcher.
– e_it = **random error for team i in season t.** This component allows for the fact that runs produced cannot be perfectly predicted using the above variables.
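To make the estimation step concrete, the following is a minimal sketch (not the authors’ code) of how Model 1 could be fit by ordinary least squares in Python, assuming a pandas DataFrame `teams` with one row per team-season; the column names (`runs`, `games`, `times_on_base`, `plate_appearances`, `total_bases`, `at_bats`, `league`) are illustrative assumptions rather than the authors’ actual data layout.

```python
# Minimal sketch of the team run production regression (Model 1).
# Assumes a hypothetical DataFrame `teams` with one row per team-season and
# columns: runs, games, times_on_base, plate_appearances, total_bases,
# at_bats, league ("NL" or "AL").
import pandas as pd
import statsmodels.formula.api as smf

def fit_run_production(teams: pd.DataFrame):
    df = teams.copy()
    # Scale runs to a full 162-game season when fewer games were played.
    df["RPS"] = df["runs"] * 162 / df["games"]
    # On-base and slugging percentages, multiplied by 1,000 as in the paper.
    df["OBP"] = 1000 * df["times_on_base"] / df["plate_appearances"]
    df["SLG"] = 1000 * df["total_bases"] / df["at_bats"]
    df["NL"] = (df["league"] == "NL").astype(int)
    # RPS = b1 + b2*OBP + b3*SLG + b4*NL + e, estimated by ordinary least squares.
    return smf.ols("RPS ~ OBP + SLG + NL", data=df).fit()

# Example usage:
# results = fit_run_production(teams)
# print(results.params, results.rsquared_adj)
```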

#### Player Salary Model

The second regression shows how much team management rewards individual players for each of the two hitting statistics, on-base percentage and slugging percentage. Position dummies were employed, but only the catcher and shortstop dummies showed statistically significant increases in pay due to their contributions to fielding; the other position dummies were dropped. The other factor included is player experience, measured by lifetime MLB game appearances. The experience factor appears in quadratic form to allow for diminishing returns toward the end of the player’s career. This model follows the economic literature on salary models starting with Mincer (7):

M_j = β1 + β2 G_j + β3 G_j² + β4 OBP_j + β5 SLG_j + β6 CT_j + β7 SS_j + β8 NL_j + e_j

– M_j = **salary of player j.** 2006 MLB salary in thousands of dollars.
– G_j = **MLB career games played by player j.** This measures the improvement in a player due to experience.
– G_j² = **MLB career games squared.** In conjunction with G, a negative coefficient on G² allows for a diminishing rate of improvement as more experience is accumulated, and even permits a decline in performance at the end of a player’s career.
– OBP_j = **on-base percentage of the player.** This is compiled as an average of the 3 MLB seasons prior to the beginning of the season in which the player’s salary takes effect (2003-2005).
– SLG_j = **slugging percentage of the player.** This is compiled as an average of the 3 MLB seasons prior to the beginning of the season in which the player’s salary takes effect (2003-2005).
– CT_j = **dummy variable = 1 if the player is a catcher, 0 otherwise.** This variable is included to see if any special value is attributed to this fielding skill position.
– SS_j = **dummy variable = 1 if the player is a shortstop, 0 otherwise.** This variable is included to see if any special value is attributed to this fielding skill position.
– NL_j = **dummy variable = 1 if player j is in the National League, 0 otherwise.**
– e_j = **random error.** This component allows for the fact that player salaries cannot be perfectly predicted using the above variables.
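A comparable sketch for the salary regression follows, again with hypothetical column names (`salary_thousands`, `career_games`, `obp`, `slg`, `position`, `league`) standing in for whatever the actual data source provides.

```python
# Minimal sketch of the player salary regression (Model 4).
# Assumes a hypothetical DataFrame `players` with columns: salary_thousands
# (2006 salary in $1,000s), career_games, obp, slg (both scaled by 1,000 and
# averaged over 2003-2005), position, league.
import pandas as pd
import statsmodels.formula.api as smf

def fit_salary_model(players: pd.DataFrame):
    df = players.copy()
    df["G"] = df["career_games"]
    df["G2"] = df["G"] ** 2                            # quadratic experience term
    df["CT"] = (df["position"] == "C").astype(int)     # catcher dummy
    df["SS"] = (df["position"] == "SS").astype(int)    # shortstop dummy
    df["NL"] = (df["league"] == "NL").astype(int)      # National League dummy
    # M = b1 + b2*G + b3*G^2 + b4*OBP + b5*SLG + b6*CT + b7*SS + b8*NL + e
    return smf.ols("salary_thousands ~ G + G2 + obp + slg + CT + SS + NL",
                   data=df).fit()
```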

#### Sample Selection

For the team run production model, five seasons of data (2002-2006) are collected for each of the MLB teams, for a total sample size of 150 observations. Descriptive statistics for five years of 16 National League teams and 14 American League teams are given in Table 1. The mean runs scored per team during this period is 765 per season, or 4.7 per game. The standard deviation is 76 runs, meaning the typical difference in runs per season from one team to the next is 76, or about 0.5 runs per game. Of particular note are the means and standard deviations of on-base percentage and slugging percentage. The mean team OBP is 334, with a typical difference from one team to another of 12. For SLUG the mean is 423 and the standard deviation is 23.5.

Batting statistics from players are averaged over the last three MLB seasons in order to match recent performance and salary more closely. To be included in the salary regression, a player must have played in at least two of the last three MLB seasons (2003-2005), with at least 100 games in each of those seasons. Another important restriction was that all players in the sample needed to have played at least six seasons at the Major League level. Before six seasons, MLB players are unable to become free agents, an important determinant of their salary. As free agents, players are permitted to seek employment from any team, commonly resulting in competitive bidding for the player’s services and a free market determination of wages. With this we have our sample of 154 hitters (free agent eligible starting players). The 2006 salaries of players and their three-year MLB performance averages (prior to 2006) are given in Table 2. The highest salary in the sample is $25,681,000 and the lowest is $400,000. The mean salary is $6.2 million with a standard deviation of $4.89 million from one player to the next. The mean OBP for the players is 347, with a typical difference of 34 from one player to the next. The average SLUG is 450 with a standard deviation of 65.5.
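As a purely illustrative sketch, the selection rules above could be applied with a filter along the following lines; the DataFrame `batting` and its columns (`player_id`, `season`, `games`, `mlb_seasons`) are assumed names, not the authors’ data.

```python
# Illustrative filter for the sample-selection rules described in the text.
import pandas as pd

def select_free_agent_hitters(batting: pd.DataFrame) -> pd.Index:
    recent = batting[batting["season"].isin([2003, 2004, 2005])]
    # Count 2003-2005 seasons with at least 100 games; require two or more.
    seasons_100 = (recent[recent["games"] >= 100]
                   .groupby("player_id")["season"].nunique())
    enough_play = seasons_100[seasons_100 >= 2].index
    # Require at least six MLB seasons, i.e., free-agency eligibility.
    career = batting.groupby("player_id")["mlb_seasons"].max()
    eligible = career[career >= 6].index
    return enough_play.intersection(eligible)
```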

### Results and Discussion

#### Team Run Production Model

Applying ordinary least squares, the following team runs regression was estimated for the five seasons:

RPS = -908 + 2.85 OBP + 1.74 SLG – 23.0 NL + e

Table 3 shows further statistical details for the above equation (Model 1) and for other versions of the run production model. Model 1 is the one used in the Moneyball hypothesis test, and it explains 92 percent of the variance in team runs scored. This verifies that team OBP and SLUG are extremely predictive of team runs scored. It should also be noted that the runs scored equation fits better than Hakes and Sauer’s winning equation. Model 2 drops the dummy for the National League and Model 3 adds interaction terms of NL with OBP and SLG. The differences from the first model are small. This sensitivity analysis confirms that Model 1 is the most appropriate.

We will now interpret each slope coefficient in Model 1, holding the other included factors constant. A 10-unit increase in team OBP (e.g., going from 330 to 340) brings an additional 10(2.85) = 28.5 team runs per season, on average. A 10-unit increase in SLUG brings 10(1.74) = 17.4 more runs, on average. Each regression coefficient, including the one for NL, is statistically significant at the 1% level. This identifies the relative importance of each hitting factor: for an incremental 10-unit change, getting on base more frequently has a bigger impact on scoring runs than getting more bases per hit. What is needed now is a determination of what these factors cost a team in salary.

#### Player Salary Model

Applying ordinary least squares, the following player salary regression was estimated for the 154 starting free agent players in 2006:

SAL = -30164 + 10.28 G – 0.00321 G² + 37.05 OBP + 36.98 SLG + 1748.1 CT + 2024.87 SS – 876.96 NL + e

Table 4 shows further statistical details for the above equation (Model 4) and for other versions of the player salary model. Model 4 contains the estimated coefficients used in the subsequent test of the Moneyball hypothesis. This model explains 55% of the variance in salaries, roughly the same as the salary equations of Hakes and Sauer. In Model 5 the NL dummy is removed, and in Model 6 the position dummies are removed. There were only small changes in the remaining coefficients compared to Model 4. This sensitivity analysis confirms that Model 4 is the most appropriate.

We will now interpret each slope coefficient of Model 4, holding the other included factors constant. A 10-unit increase in a player’s OBP increases 2006 salary on average by 37.05(10) = 370.5 thousand dollars ($370,500), and a 10-unit increase in a player’s SLUG increases salary on average by 36.98(10) = 369.8 thousand dollars ($369,800). The coefficients for G and G² show that experience increases salary at a decreasing rate. Both catchers and shortstops earn higher salaries, holding OBP and SLUG constant, than the other fielding positions. The experience and hitting coefficients are statistically significant at the 1% level, the position dummies at the 5% level, and the NL dummy at the 10% level.

#### The Moneyball Hypothesis

According to the _Moneyball_ book, small-market teams like the Oakland Athletics can compete against larger-market teams if they can acquire run production factors that provide more runs per dollar spent. This occurs when OBP is undervalued relative to SLUG. To see if this is the case in 2006 we will compare the two main models (Models 1 and 4). A 10-unit increase in team OBP brings an additional 28.5 runs, and a 10-unit increase in team SLUG yields an additional 17.4 runs. The salary equation reveals that a 10-unit increase in individual OBP costs $370,500, and a 10-unit increase in individual SLUG costs $369,800. For essentially the same increase in team salary (at the player level), an increase in OBP brings in 11.1 more runs than SLUG. This means that teams can achieve higher run production at essentially the same cost by swapping 10 units of SLUG for 10 units of OBP. The ratio of run production to cost favors OBP. The Moneyball hypothesis that slugging percentage is overvalued relative to on-base percentage remains in effect three seasons after the _Moneyball_ book.
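The comparison amounts to runs produced per dollar of salary; the short calculation below simply restates that arithmetic using the coefficients reported above.

```python
# Runs-per-dollar comparison from the reported coefficients
# (Model 1 for runs, Model 4 for salary in thousands of dollars).
runs_per_10_obp = 10 * 2.85    # 28.5 additional runs per season
runs_per_10_slg = 10 * 1.74    # 17.4 additional runs per season
cost_per_10_obp = 10 * 37.05   # $370.5 thousand of salary
cost_per_10_slg = 10 * 36.98   # $369.8 thousand of salary

obp_runs_per_million = runs_per_10_obp / (cost_per_10_obp / 1000)  # ~76.9
slg_runs_per_million = runs_per_10_slg / (cost_per_10_slg / 1000)  # ~47.1
print(f"OBP buys {obp_runs_per_million:.1f} runs per $1M of salary; "
      f"SLUG buys {slg_runs_per_million:.1f}")
```

At roughly 77 runs per million dollars for OBP versus about 47 for SLUG, the ratio clearly favors OBP, which is the inefficiency the Moneyball hypothesis predicts.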

Why did our results differ from Hakes and Sauer, who argue that slugging was no longer overvalued one season after the _Moneyball_ book? We repeat our differences in methodology here: 1) using a run production model instead of a winning production model because players are paid to produce runs, not wins; 2) including a variable to differentiate the National League from the American League; and 3) using more recent data.

### Conclusions

In this paper we propose a new test of the Moneyball hypothesis using team run production in place of team wins. We clearly show that in producing runs baseball managers continue to overpay for slugging versus on-base percentage. In the 2006 MLB season, for the same payroll, a team could generate more runs by trading some SLUG for OBP. The question is, why don’t general managers recognize these results in their roster and payroll decisions? We propose several possible reasons:

1. Only small revenue market teams need to be efficient in their labor decisions.
2. Sluggers are paid for more than just their ability to score runs.
3. Moneyball techniques will take time before all teams adopt them.

Each of these answers will now be discussed. Large-revenue market teams are profligate partly in response to the pressure they feel from the fan base to produce a winner at whatever cost. Acquiring well-known free agents at high cost, rather than bargain free agents who are not recognized by home fans, seems a safe way to operate, even if it cuts into profits. These well-known players tend to be the sluggers. The second reason for slugger overcompensation is that sluggers are crowd-pleasers, and it may be more profitable (higher gate attendance and television viewership) to have more home run hitters. This study does not attempt to measure this alternative hypothesis. Finally, Hakes and Sauer believed that equilibrium between OBP and SLUG in the player market occurred just one year after the _Moneyball_ book was published, but it is doubtful such an innovation could spread throughout MLB so quickly.

> “Given the A’s success, why hasn’t a scientific approach come to dominate baseball? The answer, of course, is the existence of a deeply entrenched way of thinking….Generally accepted practices have been developed over one-and-a-half centuries, practices that are based on experience rather than analytical rigor.” (1, p. 80)

The behavioral patterns in MLB change slowly. For example, it took twelve years after Jackie Robinson joined the Brooklyn Dodgers before every team in MLB had African-American players on its roster, despite the large pool of talent in the Negro Leagues. The slow pace of diffusion can also be seen in the more recent arrival of Asian players in MLB. And more to the point, batting average still receives more attention than on-base percentage in the evaluation of talent.

Finally, the adoption of Moneyball is not limited to baseball. General managers in hockey (6), basketball (8), football (5), and soccer (2) are beginning to see the same advantages in using statistical analysis to supplement or replace conventional wisdom in making decisions on personnel and strategy. Despite the Oakland Athletics’ more recent lackluster performance, Moneyball is here to stay.

### Applications in Sport

The increased use of quantitative analysis in the coaching and management of sports teams allows college and professional teams to make decisions based more on data-driven results than on tradition alone. “Moneyball” is the term often used to convey this decision-making approach, particularly when money, if allocated efficiently, can improve on-field performance (scoring, wins) on a limited budget.

The advantage of adopting Moneyball techniques before rival teams do may be short-lived, because widespread adoption eliminates opportunities (e.g., acquisition of under-rated players) that other teams have not yet recognized. But this study shows that the diffusion of Moneyball techniques is taking place slowly, creating advantages for managers who are open to this approach.

### References

1. Boyd, E. A. (2004). Math works in the real world: (You just have to prove it again and again). Operations Research/Management Science, 31(6), 81.
2. Carlisle, J. (2008). Beane brings moneyball approach to MLS. ESPNsoccer. Retrieved from <http://soccernet.espn.go.com/columns/story?id=495270&cc=5901>
3. Hakes, J. K., and R. D. Sauer (2006). An economic evaluation of the moneyball hypothesis. Journal of Economic Perspectives, 20, 173-185.
4. Lewis, M. (2003). Moneyball: the art of winning an unfair game. New York: W.W. Norton & Company.
5. Lewis, M. (2008) The blind side. New York: W.W. Norton & Company.
6. Mason, D. S. and W. M. Foster (2007). Putting moneyball on ice? International Journal of Sport Finance, 2, 206-213.
7. Mincer, J. (1974). Schooling, experience, and earnings. New York: Columbia University Press.
8. Ostfield, A. J. (2006). The moneyball approach: basketball and the business side of sport. Human Resource Management, 45, 36-38.

### Tables

#### Table 1. Descriptive Statistics for the Team Run Production Sample

| Statistic | RPS | OBP | SLG | NL |
|---|---|---|---|---|
| Mean | 765.04 | 332.927 | 423.27 | 0.53 |
| Median | 760.34 | 332.000 | 423.00 | 1 |
| Standard Deviation | 76.43 | 12.168 | 23.52 | 0.50 |
| Range | 387.00 | 63.000 | 123.00 | 1 |
| Minimum | 574.00 | 300.000 | 368.00 | 0 |
| Maximum | 961.00 | 363.000 | 491.00 | 1 |
| Count | 150 | 150 | 150 | 150 |

#### Table 2. Descriptive Statistics for the Player Salary Sample

| Statistic | G | OBP | SLG | NL | CT | SS |
|---|---|---|---|---|---|---|
| Mean | 1146.1 | 347.3 | 450.0 | 0.552 | 0.130 | 0.12 |
| Median | 1070.5 | 346.5 | 446.5 | 1 | 0 | 0 |
| Standard Deviation | 462.1 | 34.0 | 65.5 | 0.499 | 0.337 | 0.322 |
| Range | 2345.0 | 237.9 | 432.0 | 1 | 1 | 1 |
| Minimum | 385.0 | 276.1 | 310.7 | 0 | 0 | 0 |
| Maximum | 2730.0 | 514.0 | 742.7 | 1 | 1 | 1 |
| Count | 154 | 154 | 154 | 154 | 154 | 154 |

#### Table 3. Coefficients for the Team Run Production Models

| Variable | Model 1 Coefficient | Model 1 t Stat | Model 2 Coefficient | Model 2 t Stat | Model 3 Coefficient | Model 3 t Stat |
|---|---|---|---|---|---|---|
| Intercept | -908.00*** | -17.16 | -941.72*** | -18.46 | -861.67*** | -13.73 |
| OBP | 2.85*** | 11.21 | 2.69*** | 13.22 | 2.86*** | 10.26 |
| SLG | 1.74*** | 15.42 | 1.92*** | 15.30 | 1.62*** | 10.37 |
| NL | -23.00*** | -6.26 | | | -134.3* | -1.34 |
| (NL)(OBP) | | | | | 0.275 | 1.06 |
| (NL)(SLG) | | | | | 0.241 | 0.06 |
| Adj. R-Squared | 0.921 | | 0.900 | | 0.923 | |
| F | 568.9 | | 661.6 | | 343.3 | |

*** .01 level ** .05 level * .10 level

#### Table 4. Coefficients for the Player Salary Models

| Variable | Model 4 Coefficient | Model 4 t Stat | Model 5 Coefficient | Model 5 t Stat | Model 6 Coefficient | Model 6 t Stat |
|---|---|---|---|---|---|---|
| Intercept | -30164*** | -9.38 | -30952*** | -9.67 | -27182.6*** | -8.73 |
| G | 10.28*** | 4.21 | 10.24*** | 4.18 | 9.75*** | 3.95 |
| G² | -0.00321*** | -3.67 | -0.00323*** | -3.68 | -0.00304*** | -3.42 |
| OBP | 37.05*** | 3.32 | 38.08*** | 3.39 | 35.30*** | 3.10 |
| SLG | 36.98*** | 6.47 | 37.01*** | 6.43 | 33.58*** | 5.88 |
| CT | 1748.10** | 2.14 | 1798.21** | 2.19 | | |
| SS | 2024.87** | 2.34 | 2048.73** | 2.35 | | |
| NL | -876.96* | -1.65 | | | -929.14* | -1.71 |
| Adj. R-Squared | 0.557 | | 0.552 | | 0.532 | |
| F | 28.48 | | 32.39 | | 44.44 | |

*** .01 level ** .05 level * .10 level

### Corresponding Author

#### Thomas H. Bruggink, Ph.D.
Department of Economics
Lafayette College
Easton PA 18042
<bruggint@lafayette.edu>
610-330-5305

### All Authors

#### Anthony Farrar
Brinker Capital
Berwyn, PA

#### Thomas H. Bruggink
Lafayette College
Easton, PA
