Science and Practice

This was posted on the Supertraining forum by Steven Plisk. I would be interested in the opinions of the vast group of minds here, including Charlie, Derek, and others!

Colleagues,

I just posted an article entitled “Science & Professional Practice: Unfinished Business” on our blog page. It addresses some intriguing questions - along with unpublished answers - from a terrific series of roundtable articles that was discontinued 2 years ago:
http://excelsiorsports.blogspot.com/2009/09/science-professional-practice.html

In my experience, “experience” plays a much more important role than exposure to the results of lab-based experiments. I think Charlie has said, “I know what works. It’s the job of the scientists to confirm why it works.” He’s not waiting for scientists to tell him what works. He already knows - based on his experience - what to do with his athletes.

Here are a couple of scenarios to ponder:

Scenario 1 - Method 1 has been proven by science, but you don’t see results in your own practice.

Scenario 2 - Method 2 has not been proven by science, but you consistently get results when applying it in your practice.

Scenario 3 - Method 3 has been proven by science and by your personal experience in practice.

Scenario 4 - Method 4 doesn’t work in a lab and doesn’t work in practice, but its promoters have really good marketing and advertising to pump it up and sell more product.

Obviously you would use Method 3 as a coach. But what about Methods 1 and 2? I assume most of you would go with 2 over 1. So what does that tell us?

Of course, Method 4 is what we see flooding the internet and guru-type training (as well as the supplement industry). It’s a moron-magnet, though.

I think we all employ basic scientific principles when we undertake our training. Whether or not we guide our training based on the results of lab experiments is quite a different issue.

When I get my quarterly Strength and Conditioning Research Journal, I basically flip through to see what has been studied. In most cases, the results confirm what I already know, or they are inconclusive. I often find myself looking at a study and saying, “Well, duuuhhhh!!!”

Some examples include:

  • Elite level soccer players had faster sprint times than non-elite level soccer players. Duuuhhhh!!!

  • Depriving endurance athletes of sleep actually hurt their ability to compete at endurance events. Duuhhhhh!!!

  • Pre-fatiguing athletes with endurance work negatively affected their vertical jump. Duhhhhhhhhhh!!!

I still believe that studies can be manipulated to elicit the results the researchers are hoping for, so you have to examine the methodology of a study to make sure it is sound. You also have to look at dozens of studies on the same topic to see what the general trend is. I think there are enough studies on creatine now to say that the compound actually works. The question is, does it work in the way you want it to?

A few points to keep in mind with studies:
1: In order for a study to isolate one item, only that item can be varied. In the real world, you would never vary one training element without making compensatory adjustments elsewhere (I hope, anyway!).
2: The duration of the study shapes the results. Let’s say you do a study on the latest flavor du jour - ‘muscle confusion’. A short-term study might show success; a long-term study would show failure.
3: The degree of variability of the elements determines success. Limited variability can avoid a premature plateau, but significant variability will create a plateau in short order. Will studies show this? I’m not sure.

Very good points!

Great input from two of the best! I am proud to say I know them personally!

Thank you, Charlie and Number Two! :smiley: