OT: Mental Analytics and long-term implications for recruiting.

Submitted by Blue_sophie on

SB Nation has a fascinating article about a new technology for tracking athletes' mental capabilities. The article in question is specifically about split-second decision-making in baseball, but obviously this would have bearing on all sports. The researchers (a company called deCervo) basically use video games to determine how quickly athletes can make decisions in response to visual stimuli. The article has a link to a short YouTube video that explains the technology pretty succinctly.

We have long discussed athletes' split-second decision-making in terms of "vision" (a batter's ability to "see" a pitch, "court-vision" in basketball, or a running back's ability to "see" an emerging hole in the line). However, vision is a murky and subjective term that causes pointless heated debate amongst fans and coaches (see Trent Richardson's NFL career), so I applaud the effort to make "vision" more analytically rigorous.

At the same time, I wonder whether analytics can go too far in quantifying the capabilities of athletes. Could we one day see standardized tests for prospective student-athletes comparable to the SAT subject tests?

restive neb

July 15th, 2015 at 6:03 PM ^

My issue with this is that training can dramatically improve performance on these tests, and it would be difficult to know whether someone's score came before or after practice. If you compare the score of someone who is well prepared with that of someone who isn't, you could reach the wrong conclusion about their ultimate ability.

Bodogblog

July 15th, 2015 at 6:14 PM ^

Asimov was right, they are going to test the shit out of us, and slot us into jobs for life. Remember to keep insisting that your scores were wrong...

Noleverine

July 15th, 2015 at 8:40 PM ^

Hi guys, it's me again, your resident sport psychology person. This is a really exciting topic and direction in the field recently. In our work with athletes, we have been doing a lot of visuoperceptual training. At IMG Academy, they even have a whole curriculum based on visual training, including things like a Dynavision. We did some work with athletes using this technology to help train vision, reaction time, etc. One of the cool things is it can help train response inhibition as well.

The way it works is that a light flashes and you have to hit it. It gets more difficult when you introduce a secondary task (math problems that flash on the screen) to occupy primary attentional focus, forcing you to rely on peripheral vision to catch the lights. You can even use different colored lights and tell the athlete "only hit the red, ignore the green," which incorporates an inhibitory response.
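This isn't the actual Dynavision software, just a minimal console sketch (in Python, with made-up trial counts and timings) of the same go/no-go idea: respond as fast as you can to the "go" color and withhold the response on the "no-go" color. The real board adds the peripheral-vision element and the secondary math task, which a console toy obviously can't reproduce.

```python
import random
import time

GO, NO_GO = "RED", "GREEN"

def run_trials(n_trials=10, go_probability=0.7):
    """Console go/no-go task: hit RED as fast as possible, hold on GREEN."""
    go_times, false_alarms, misses = [], 0, 0
    for _ in range(n_trials):
        time.sleep(random.uniform(1.0, 3.0))          # unpredictable stimulus onset
        stimulus = GO if random.random() < go_probability else NO_GO
        start = time.monotonic()
        answer = input(f"{stimulus}! Type 'h' + Enter to hit, just Enter to hold: ").strip().lower()
        reaction = time.monotonic() - start
        if stimulus == GO and answer == "h":
            go_times.append(reaction)                 # correct hit, keep the reaction time
        elif stimulus == GO:
            misses += 1                               # failed to respond to the go stimulus
        elif answer == "h":
            false_alarms += 1                         # failed inhibition on the no-go stimulus
    return go_times, false_alarms, misses

if __name__ == "__main__":
    times, fa, miss = run_trials()
    if times:
        print(f"Mean RED reaction time: {sum(times) / len(times):.3f} s")
    print(f"Failed inhibitions on GREEN: {fa}, missed REDs: {miss}")
```

The score that matters for response inhibition is the false-alarm count: how often you hit the light you were told to ignore.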

There is also the Neurotracker, often used in concussion research, which can help train vision as well. A number of balls move around on a 3D screen, and you have to follow them. Kind of like the ball-and-cups shuffle game at the carnival. It seems simple, but when depth is introduced it can be really difficult. It can help you train to keep track of a number of different moving parts (such as a quarterback needing to attend to everything unfolding in front of them).

There has also been a great deal of research on the role of videogames in developing visual and attentional skills. Really interesting stuff. Feel free to ask if you'd like to learn more.

Noleverine

July 15th, 2015 at 8:46 PM ^

That's just visual and perceptual training. We also use research to determine where the most information-rich sources are, and train athletes where to look.

For example, in tennis, novice players take longer than experts to respond to an incoming serve. That's because novices gather their information about where a serve is going from late cues: the end of the serving motion and the early ball flight. Experts, on the other hand, gather their information earlier, from arm and hand position before contact is even made. This allows them to figure out where the ball is going much sooner than novices. So we can teach players to focus on those early cues in order to train them to see the game more like an expert.

As a side note, experts are LESS confident in their responses (i.e., where the serve is going) until much later than novices, but they are also correct significantly more often. It goes to show that the more you know, the more you realize you don't know.

Blue_sophie

July 15th, 2015 at 9:02 PM ^

Super interesting, Noleverine. I think perceptual training is valuable, but I also think there is a problem when testing is used as something deterministic rather than diagnostic. Using technology to enhance performance makes sense, but high-stakes testing has been problematic in academic settings and may prove similarly problematic in an athletic context.

Noleverine

July 15th, 2015 at 9:40 PM ^

I agree entirely. It's similar to the use of personality testing to determine likelihood of success in sports. There is definitely a correlation between perceptual ability and success, but, as the saying goes, correlation is not causation. By relying too heavily on testing, we can miss out on a lot of important attributes an athlete brings that may not show up in these tests. And since these skills can be trained and improved, it's not as if they are something you either have or you don't.

Noleverine

July 16th, 2015 at 2:26 AM ^

Thanks. I just got my masters (is there an apostrophe?) in Sport Psychology, so I spent some time studying this topic. My advisor was really big on cognitive aspects of sport performance. I'd love to talk more about it.

Shameless self-promotion, since you asked: I am actually starting a blog about sport and exercise psychology, where I will talk about things similar to this topic. I'm almost ready to launch, but I wanna make sure I do it right. We should be up and running soon.

If you do the twitters, you can find me @AthletesMindSF, or you can email me at AnAthletesMindSF -at- gmail. 

Ghost of Fritz…

July 15th, 2015 at 10:12 PM ^

I see this more as a training tool than as a testing device.

I could see how this might be used to help most anyone improve their split-second decision-making and reaction time. But as a way to evaluate and predict? It's a very blunt instrument.

Whether it is a sport, business, academics, or whatever, testing trait or capability X as a metric and predictor of high capacity for a complex, embedded, real-life activity usually comes with very high false positive and false negative rates.

Tests are least inaccurate when used for weeding out those who have a very low probability of success in a particular real-life activity. But we often make the mistake of using tests as distinguishing predictors even among those who score on the very good end of the bell curve. So a university will want a kid in the 93rd percentile over a kid in the 90th percentile, a difference that has very little relevance to real-life activities and outcomes.

We use tests this way because they are relatively cheap and easy to administer, and because we are intellectually lazy enough to mistake small differences in test results as predictive of things that actually matter. And we conveniently ignore the very high false positive and false negative rates in terms of predicting real-world performance.
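To put some toy numbers on that point (these are made-up assumptions, not results from any study): suppose a lab test correlates r = 0.4 with actual game performance, which would already be a strong result for a single measure, and a program uses a 90th-percentile test cutoff. A quick simulation shows that most athletes above the cutoff are not top-10% game performers, most top-10% performers fall below the cutoff, and a kid from the 90th-percentile band out-plays the average 93rd-percentile kid close to half the time.

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 0.4, 100_000                            # assumed test/game correlation, sample size

test = rng.standard_normal(n)
# Game performance = the part the test captures + everything the test can't see.
game = r * test + np.sqrt(1 - r**2) * rng.standard_normal(n)

test_top = test >= np.percentile(test, 90)     # "passed" the screening test
game_top = game >= np.percentile(game, 90)     # actually a top-10% game performer

false_positive = np.mean(~game_top[test_top])  # passed the test, not elite in games
false_negative = np.mean(~test_top[game_top])  # elite in games, screened out by the test
print(f"Passed the test but not top-10% in games: {false_positive:.0%}")
print(f"Top-10% in games but below the test cutoff: {false_negative:.0%}")

# The 93rd-vs-90th-percentile comparison: how often does a kid in the
# 90th-percentile band out-play the average 93rd-percentile kid?
band_90 = (test >= np.percentile(test, 90)) & (test < np.percentile(test, 91))
band_93 = (test >= np.percentile(test, 93)) & (test < np.percentile(test, 94))
print(f"90th-percentile kid beats the average 93rd-percentile kid: "
      f"{np.mean(game[band_90] > game[band_93].mean()):.0%}")
```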

In other words, if you want to evaluate someone for a complex, embedded, real-life activity (say, for example, football), test them at the activity. Don't measure video game reaction times. Hold satellite camps and watch prospects play football.