Neural Networks: Possibly the Most Important Training Tech You’ve Never Heard of—with Alan Couzens

We live in an era of data overload, so knowing how to interpret that data is key. Alan Couzens talks with us about how neural networks might be the answer.

Neural networks and artificial intelligence.
Photo: Shutterstock

Long gone are the days of going out for a workout and having nothing to report to your coach except how it felt. We now live in an era of information overload: we track heart rate, power, speed, distance, body temperature, HRV, sleep, blood glucose, TSS, and more. It feels like every month there’s a new metric to monitor—and coaches can sometimes struggle to keep up with them all.

In this week’s show, we’re joined by coach and exercise physiologist Alan Couzens as we talk about data and how to interpret it in a meaningful way. Couzens is way ahead of the curve on this one, pioneering the use of neural networks for training. For the uninitiated, neural networks are a sophisticated form of artificial intelligence that learns the way the human brain does. By taking in large amounts of data, they can learn what that data means and provide interpretations. Those interpretations, based on thousands of data points, can be simple, accurate, and highly useful—especially, for Couzens, when it comes to coaching athletes to their peak performance.

RELATED: What Are Neural Networks and How Can They Help Your Training?

Joining Couzens on this episode, we have two highly respected coaches: Ryan Bolton, the owner of Bolton Endurance Sports Training, and Lauren Vallee, the owner of Valiant Endurance. We also have two-time cyclocross U.S. national champion Stephen Hyde.

So, dump your data into a neural network and let’s see if it recommends we make you fast!

References

Couzens, A. (2018). Why Neural Networks are better than the old Banister/TSS model at predicting athletic performance. Retrieved October 25, 2022, from https://alancouzens.com/blog/Banister_v_Neural_Network.html

Jobson, S. A., Passfield, L., Atkinson, G., Barton, G., & Scarf, P. (2009). The Analysis and Utilization of Cycling Training Data. Sports Medicine, 39(10), 833–844. Retrieved from https://doi.org/10.2165/11317840-000000000-00000

Kumyaito, N., Yupapin, P., & Tamee, K. (2018). Planning a sports training program using Adaptive Particle Swarm Optimization with emphasis on physiological constraints. BMC Research Notes, 11(1), 9. Retrieved from https://doi.org/10.1186/s13104-017-3120-9

Taha, T., & Thomas, S. G. (2003). Systems Modelling of the Relationship Between Training and Performance. Sports Medicine, 33(14), 1061–1073. Retrieved from https://doi.org/10.2165/00007256-200333140-00003

Wallace, L. K., Slattery, K. M., & Coutts, A. J. (2014). A comparison of methods for quantifying training load: relationships between modelled and actual training responses. European Journal of Applied Physiology, 114(1), 11–20. Retrieved from https://doi.org/10.1007/s00421-013-2745-1

Wallace, L. K., Slattery, K. M., Impellizzeri, F. M., & Coutts, A. J. (2014). Establishing the Criterion Validity and Reliability of Common Methods for Quantifying Training Load. Journal of Strength and Conditioning Research, 28(8), 2330–2337. Retrieved from https://doi.org/10.1519/jsc.0000000000000416

Episode Transcript

Rob Pickels  00:04

Hello, and welcome to Fast Talk, your source for the science of endurance performance. I’m your host, Rob Pickels, here with Coach Connor. Long gone are the days of going out for a workout and having nothing to report to your coach except how it felt. We now live in an era of information overload. We track heart rate, power, speed, distance, body temperature, heart rate variability, sleep, blood glucose, TSS, and a variety of other acronyms. It feels like every month there’s a new metric, and coaches struggle to keep up with them all.

Rob Pickels  00:35

The challenge that faces us is no longer getting enough data; it’s how to interpret that data in a meaningful way. This is where neural networks come in. They are a sophisticated form of artificial intelligence that learns the way the human brain does. By taking in large amounts of data, they can learn what the data means and provide interpretations. Those interpretations, based on thousands of data points, can be simple, accurate, and highly useful. Pioneering the use of neural networks for training is top coach and mad scientist Alan Couzens, who is leading the way in using science to effectively coach athletes to their peak performance. It was his quest to find better ways to use the data that led him to explore neural networks.

Rob Pickels  01:18

Joining Couzens on this episode, we have two highly respected coaches: Ryan Bolton, the owner of Bolton Endurance Sports Training, and Lauren Vallee, the owner of Valiant Endurance. We also have two-time cyclocross U.S. national champion Stephen Hyde. So dump your data into a neural network, and let’s see if it recommends that we make you fast.

Trevor Connor  01:42

As a cycling coach, it’s really easy, even tempting, to focus on the workouts and the training plans. After all, this is the bread and butter of being a coach, but there’s much more that affects an athlete’s performance. New this week from Fast Talk Labs: module eight of The Craft of Coaching with Joe Friel unpacks the black box of sports psychology, tapping a diverse group of experts from around the world, including Dr. Andy Kirkland, Julie Emerman, Rob Griffiths, and Jeff Troesch. By applying the biopsychosocial model to endurance sports performance, these experts show better ways to consider an athlete’s stress, how to engage and motivate athletes, and how to help athletes build confidence, resilience, motivation, and enjoyment of their sport. See what’s new in endurance coaching at fasttalklabs.com. Well, Alan, welcome to the show. We’re excited to have you. You’re kind of a brother-in-arms in that we all love talking about the science of training here, and I’m actually a little embarrassed it’s taken us this long to get you on the show. But I’m personally really excited. We’ve got, what would you say, a science nerd’s delight of an episode today?

Rob Pickels  02:57

I think so. But let’s make it delightful for everybody.

Trevor Connor  03:01

You’re never any fun.

Rob Pickels  03:03

Always the buzzkill. We don’t want the fun to be just for the science geeks; we want everybody invited to the party. You know, Alan, definitely thank you for being here. I love following you on Twitter; you do a great job of sharing information and engaging with people there. So I think it’s a real treat that we’re able to engage with you here and talk about neural networks. Tell us a little bit about yourself. How did you end up working in this field?

Alan Couzens  03:26

Yeah, it’s been a pretty circuitous path to where I am now. I’m Australian, so I started my studies a little over 30 years ago now, back in Sydney, Australia, and did a sport science degree. Through the master’s program, I was able to get a gig working with the Australian swim team at the Australian Institute of Sport. They were gearing up for the ’96 Olympic Games, and that was my first introduction to, I guess, applied sport science, really seeing how sport science can be integrated at the highest performance levels. I was with those guys all the way through to the Games, and that was awesome. After that I worked in a few swim programs in Australia, then decided a move might be nice, so I ended up in the U.S. I was in Florida for a while and kind of accidentally got more involved in triathlon through one of the places I was working at the time. So I figured I’d better get certified and all that good stuff in the U.S., and during the certification, one of the presenters was a guy by the name of Gear Fisher. This was right around the time that TrainingPeaks was just getting started, so it was a little bit of a pitch, I guess, on his part to these new triathlon coaches that we should start getting a little more tech savvy with the approaches we were using. It’s been really interesting to see how the whole paradigm has shifted since then. Remote coaching and coaching with software have really become the standard; I think it’s how most triathlon coaches these days make a living, spending a lot of time in front of a computer. But it wasn’t long after I started getting into that and learning about it that there were questions I had, and things I wanted to do, that weren’t really being answered by the software.

I’d always wanted to learn to code, so I just started picking up some random stuff and writing scripts that plugged into TrainingPeaks and into some of the other software I was using. I’ve been doing that in various iterations ever since, and through that developed an interest in the predictive side of things. We have all of this yummy data; what can we do with it in terms of understanding athletes better, and in terms of predicting athletes’ performances based on the data? So yeah, I guess that’s what I’m still doing to this day.

Trevor Connor  06:30

I actually found this really fascinating when I was reading your articles on the topic. It gets at something we’ve addressed on the show without really having an answer, which is that a lot of these models we use, like TSS and CTL, require you, or a coach, or somebody to say, well, if this happens, then that. And it’s very hard to program all the various if-thens. What if you don’t get a good night’s sleep? What happens if you argue with your spouse? All these things can affect your training, but you can’t really build them into a model. So the issue you have with something like a CTL model is that it’s actually quite simple; it can’t factor in all those variables. We’ve brought up the fact that it has some use, but don’t think that if it says you’re at 110, your fitness is going to be exactly this. It’s just not that accurate. What I found really fascinating about neural networks, and I know you’re going to dive into this, is that it’s basically training a computer to learn the way a human mind learns. You can put in all these inputs and all these different factors, have it figure out how they’re all weighted and how they all impact your training, and come up with a much better model of where you’re actually at. That’s a, hopefully fairly accurate, one-to-two-minute description.

Alan Couzens  07:55

Yeah, absolutely. I think there’s one thing that separates machine learning from our traditional approach to coming up with an answer to a question. In the traditional approach, you have your inputs and you have your rules, or your formula, and you get an output based upon whatever your formula or rules of thumb are at the time. Whereas with machine learning, we have the inputs, and we have outputs that we know, and then we ask the computer to come up with its own formula. An example I like to use, and hopefully one that resonates with the listeners: if we wanted to get the power output of a cyclist riding at a certain speed up a hill, we could use physics, right? We know the speed, we know the incline, we know gravity; we know all of the various inputs we need to come up with a physics equation that could tell us what the power of the cyclist is likely to be. That would be one approach. Another approach would be to get a group of cyclists of all different sizes and shapes, on different bikes, to ride up the same hill, and get their power output at the top. Then we could feed that information to a computer and have it come up with the physics rules itself, have it come up with the relationships that matter: if a rider is this much heavier, what is that going to do to their power, those sorts of things. So they’re really two totally different approaches to figuring out the same thing.
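To make the contrast concrete, here is a toy sketch (our illustration, not Couzens’s code) putting the two approaches side by side: a physics-first estimate of climbing power, and a data-first fit that recovers the same relationship purely from observed riders. The physics is deliberately simplified, ignoring rolling and air resistance.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def physics_power(mass_kg, speed_ms, grade):
    """Rules-first approach: climbing power from known physics
    (simplified: no rolling or air resistance)."""
    return mass_kg * G * speed_ms * math.sin(math.atan(grade))

# Data-first approach: observe riders of different masses climbing the
# same hill at the same speed, and let the machine find the relationship.
speed, grade = 3.0, 0.08  # 3 m/s up an 8% grade
riders = [(m, physics_power(m, speed, grade)) for m in range(55, 96, 5)]

# One-parameter least squares: power ~ w * mass (closed form)
w = sum(m * p for m, p in riders) / sum(m * m for m, p in riders)

predicted = w * 70
actual = physics_power(70, speed, grade)
print(f"learned watts-per-kg coefficient: {w:.3f}")
print(f"70 kg rider: learned {predicted:.1f} W vs. physics {actual:.1f} W")
```

Because the toy data is exactly linear in mass, the fitted coefficient lands on the physics answer; real ride data would be noisy, which is where more flexible models earn their keep.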

Trevor Connor  09:42

Coach Ryan Bolton talked with us about this issue that we live in an age of information overload but have limited ability to interpret it. Let’s hear what he has to say.

Rob Pickels  09:51

Ryan, with the training and tracking tools that are available to athletes, is there something missing? Is there room for technology to help you, as a coach, do a better job of knowing exactly how that athlete is feeling and exactly what you should be writing, ultimately, to coach the athlete better? Is there room for technology?

Ryan Bolton  10:12

Yeah, I mean, there’s a ton of room for technology when it comes to coaching athletes, being a better coach for athletes, and helping athletes become more successful. I do think we’re in this information age right now where we’re getting so much, and some of it is helpful and some of it’s not. I think the hardest part is actually finding what is useful, what’s effective, and then, if it is, how to use it properly. There’s a lot of talk around different monitors, et cetera. I think some of them are highly effective, and other ones aren’t; you know, maybe fads and everything. So just as a coach, digesting what is out there, what can be used, its efficacy, and how to use it properly becomes difficult. I mean, you could spend eight hours a day on one athlete with a lot of this stuff. So as a coach, it’s also about monitoring what is most effective, how to best use it, and how to be most efficient with it, while not forgetting the key coaching component: teaching an athlete to listen to their body and not rely on the technology as much. So there’s kind of an art, and a balance, with it as well.

Rob Pickels  11:26

And that’s really one of the difficulties with the TSS model we’re working with now, right? We’re trying to define exactly what that equation is, just like we are with our equations of motion. But it’s also being done, Trevor, as you alluded to, in a very, very simple manner. We’re taking exactly two factors, time and intensity, and multiplying them together. Just like that equation of motion can get really complex with aerodynamics and weight and rolling resistance, those two factors alone are probably never going to give us the full picture of what’s happening inside the athlete, so that a coach can actually make a decision that ultimately helps that athlete reach a better performance, which is the end goal, right? It’s not about the metrics; it’s about using those metrics.

Trevor Connor  12:15

And Alan, you made a really good point in one of your articles that since it’s just time and intensity, it creates a linear equation. It basically says the more time or the more intensity you do, the fitter you’re going to be, and it never says that at a certain point, this is going to cook you, this is going to kill you. So I want to throw that back to you. As a coach, what made you finally say, TSS and CTL are just not cutting it for me; they’re not doing what I need them to do for my athletes?

Alan Couzens  12:44

Yeah, I think it was exactly what you said: the TSS approach really boils down to a more-is-always-better approach. It basically says that if you throw more load at the athlete, their performance is going to improve. And beyond that, as you also said, it suggests that performance is going to improve linearly. So if I go from a CTL of 50 to 60, I should theoretically improve the same amount as if I go from a CTL of 150 to 160. And we all know that’s not the case. It really doesn’t take too long dealing with this, and trying to use it as a performance predictor, before you realize there are some things that just don’t work in that linear model. So yeah, we definitely needed to do better. Initially, that was just a case of trying some different types of models: trying some exponential fits, some logarithmic models, and basically trying to incorporate some of these things we know about the way the body works. We know there are diminishing returns; we know you don’t get the same improvement if you’re already super-duper fit as if you’re a novice who’s just starting a program off the couch. So I think that was really the impetus to start looking for models that can capture some of these patterns we know exist in the real world.
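For reference, the linearity Alan describes falls straight out of the usual CTL calculation, an exponentially weighted moving average of daily TSS. This is an illustrative sketch of the common 42-day formulation, not any vendor’s exact implementation:

```python
def ctl_series(daily_tss, time_constant=42):
    """Chronic training load: exponentially weighted average of daily TSS."""
    ctl, out = 0.0, []
    for tss in daily_tss:
        ctl += (tss - ctl) / time_constant
        out.append(ctl)
    return out

# The model is strictly linear in load: double every day's TSS and CTL
# doubles too, implying twice the "fitness," with no ceiling or reversal.
base = ctl_series([60] * 180)
doubled = ctl_series([120] * 180)
print(f"CTL after 180 days at 60 TSS/day:  {base[-1]:.1f}")
print(f"CTL after 180 days at 120 TSS/day: {doubled[-1]:.1f}")
```

Nothing in this formula can express diminishing returns or an athlete who gets worse under more load; that has to come from a more flexible model.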

Rob Pickels  14:21

I do want to say there is merit and value to TSS. I almost view it like training by heart rate, right? Some people will say, oh, heart rate doesn’t give you the full picture, and you have to use power and all of these things. But heart rate was really great for a lot of people for a very long time, and TSS was also very effective. But, you know, Alan, as you’re saying, we’re learning more about the body, research has come out, we have more understanding. And so maybe we’re at this inflection point where it’s time the model grows, just like our knowledge.

Alan Couzens  14:51

Yeah, I mean, in the beginning they just didn’t have the tools. When we talk about Banister, this was back in the early nineties, and they really didn’t have the algorithms, and they didn’t have the computing power, to deal with a lot more variables. So it was a matter of necessity that we had to wrap a whole bunch of stuff up into this one input measure. But obviously, now the situation is very different. We have all of these fantastic algorithms that are really sophisticated and can accommodate a whole lot of variables we can throw into the equation. We just haven’t really made the leap to using them in this context of predicting sports performance.

Trevor Connor  15:40

So that’s what I want to ask you. You came to the realization that CTL, while there might be some value to it, has a lot of shortcomings. What made you then jump to, and maybe this is why your nickname is The Mad Scientist, saying, let’s use the most sophisticated AI system in existence to solve this problem?

Alan Couzens  16:00

Yeah, I mean, it certainly wasn’t the initial thing. Initially, I was just looking for anything that really fit the data. And as seems to happen with me, I start on one thing, then buy a book, and then buy another book, and just go further and further down the rabbit hole, thinking about the way some of these new super-duper algorithms could be applied to this domain. But I think neural networks do have a lot of advantages in that they can really describe any relationship. We talked about diminishing returns being one thing we might see as coaches: we start an athlete with a certain load ramp and they improve a certain amount, but then that amount changes with another load ramp. Another pattern we might see is that for a lot of athletes, when we hit a certain load, it’s not just diminishing; it actually reverses. So we want a model that can also do sort of an inverted U, where there’s this sweet spot of load for a particular athlete, and if we go beyond that, oftentimes the performance actually gets worse, if the athlete doesn’t have enough space within their life or is just capped out in terms of their adaptive ability at that point. So I think we do need a complex model that can describe a lot of different relationships, depending upon the particular athlete and the particular level we encounter.
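To see why a network can express Alan’s inverted U while a linear model can’t, here is a hand-built miniature: two ReLU units with opposite signs already bend a straight line into a tent shape. The weights, and the sweet spot at a load of 100, are invented purely for illustration:

```python
def relu(x):
    """Standard neural-network activation: max(0, x)."""
    return max(0.0, x)

def tiny_net(load):
    """One hidden layer, two ReLU units, hand-picked weights.
    Output rises with load up to a sweet spot (load = 100), then falls."""
    h1 = relu(load)          # keeps rising with load
    h2 = relu(load - 100)    # switches on only past the sweet spot
    return 1.0 * h1 - 2.0 * h2  # net effect: up, then down

curve = [(load, tiny_net(load)) for load in (50, 100, 150)]
print(curve)  # performance peaks at the middle load
```

A linear model of the same input can only go up or only go down; add hidden units like these, trained rather than hand-set, and the network can carve out a sweet spot per athlete.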

Rob Pickels  17:38

And Alan, correct me if I’m wrong: something that almost makes this a really great and elegant solution is that with a neural network, you don’t have to know everything, right? If you define some inputs, then it’s the machine learning that ultimately creates the algorithm, or the equation, in different words, as opposed to you yourself having to figure out every little component that goes into it: is this important, is this not important? That’s what the iterative process of the neural network automatically does.

Alan Couzens  18:13

Yeah, exactly. And that’s certainly the way the more complex networks are heading. You really don’t need to do much initial data processing; you just feed it as much data as you can possibly give it, and a complex network can really drill down and see which parts of that data are important to the end result, in our case, athletic performance. In the beginning, we didn’t have a lot of data. My first power meter, I think, was an SRM, way back when, and we had these single data files that you really couldn’t do much with in terms of drilling down and getting long-term patterns. But nowadays we do. Pretty much every athlete has a whole lot of data sitting in their TrainingPeaks account, or on their computer in one way or another, and it’s really a case of making use of that data to determine what’s important for that athlete.
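One common way to check which inputs a trained model actually relies on is permutation importance: shuffle one input across the dataset and measure how much the predictions degrade. The stand-in “trained model” and the feature names below are invented for illustration; they are not from Couzens’s system.

```python
import random

def model(features):
    """Stand-in trained model: only 'hours' and 'sleep' actually matter."""
    return (2.0 * features["hours"] + 1.0 * features["sleep"]
            + 0.0 * features["shoe_size"])

random.seed(2)
athletes = [{"hours": random.uniform(5, 15),
             "sleep": random.uniform(6, 9),
             "shoe_size": random.uniform(38, 47)} for _ in range(200)]
truth = [model(a) for a in athletes]

def importance(name):
    """Permutation importance: shuffle one input and measure how much the
    predictions degrade. Irrelevant inputs barely move the needle."""
    shuffled = [dict(a) for a in athletes]
    values = [a[name] for a in shuffled]
    random.shuffle(values)
    for a, v in zip(shuffled, values):
        a[name] = v
    return sum(abs(model(a) - t) for a, t in zip(shuffled, truth)) / len(truth)

for name in ("hours", "sleep", "shoe_size"):
    print(name, round(importance(name), 2))
```

Shuffling shoe size changes nothing, so its importance is zero; shuffling training hours wrecks the predictions, so it scores highest, matching the model’s hidden weights.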

Trevor Connor  19:24

Alan, you raised the point that we’re actually living in an era of data overload. We collect so much data on each athlete that for a coach to take that data, read it, and try to make sense of it would be an all-day job, and that’s on a single athlete. It’s actually hard for the coach to look at all of it and come to any conclusions about the athlete. And what you’re saying is that these neural networks have this amazing ability to process all that information, all that data, and say, here’s what it means, down to simple things like: you should stay in bed today, or you’re actually doing really well, you could train harder today. It gets it down to those simple, actionable recommendations. This information overload we have right now is frustrating many coaches, as we discussed with Lauren Vallee. Let’s hear from her.

Lauren Vallee  20:16

The coach-athlete experience, at the heart of it, is a relationship that I don’t think AI could deconstruct. And so I don’t think there is a metric that can go into my athlete’s body and physiology and say, this is what Robbie needs this week. My job as a coach is to take the verbal information and the nonverbal information, meaning body language, tone of voice, energy level, that you can just feel being in a room with someone, or being on video with someone, or talking to somebody on the phone. There’s all sorts of information that we as humans can gather. I love coaching by perceived exertion, because it forces not only the athlete but me to be in communication in a way that, if I’m just asking, you know, how many minutes did you spend in zone two, how well did you follow the prescription, that’s very sterile to me. Number one, it doesn’t develop the athlete’s sense of themselves in an embodied way, and how to deal with tough days and easy days. So for me, yeah, I would prefer that we got rid of all data and just really focused on perceived exertion and kept things simple. Because, again, science is awesome, and it underpins all of the coaching work that I do, but a metric is not always going to dictate what I’m going to do for an athlete, what that athlete needs on the day.

Rob Pickels  21:41

Well, I want to share a non-endemic example of how a neural network works that really made sense for me, because I am not very well versed in this topic, right? I think Trevor and I both had homework to do before this, and so I was perusing the IBM website, and this is the example they gave: let’s think of using a neural network to decide if we want to have pizza for dinner. There are some basic inputs that the neural network is trying to take into account, Trevor, as you said before, in the process that you go through.

Trevor Connor  22:15

Stop, you always want to have pizza for dinner. That’s an easy one. Well, that worked for that one.

Rob Pickels  22:22

Well, there you go. That’s what Trevor’s algorithm always tells him; his weighting is different. But what questions do you ask yourself? Is ordering going to save me time? Is ordering going to be a healthy alternative for me? Is it going to save me money? These are the things we decide every single time we make this decision, and then the output comes from that: well, we’re not going to make dinner tonight; we’re going to get all the toppings, which is not healthy, because pepperoni is bad for you; but we’re only going to get one slice, so it’s going to be cheap. All of these decisions. And there are a hundred million different decisions or inputs we could make in here, right? That’s where the complexity comes from. But what’s really interesting about this neural network thing is that we can decide which is more or less important, and that’s where the nonlinear side of things comes from. With our chronic training load, we said, hey, it’s intensity times volume; it’s very linear; one doesn’t count more than the other. But in the neural network, we can say, you know what, saving time is by far the most important thing to me, and eating salty foods is the second most. And so, yeah, the pizza wins out every time, because it’s always going to satisfy the criteria that are important to us. That’s really where the power of this comes from: from a few simple inputs and a little bit of weighting, we can ultimately make really complex decisions, just like the human brain does. We can now feed that into a machine and let it ultimately do more computations than we’d ever be able to do ourselves.
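Rob’s pizza decision maps almost directly onto a single artificial neuron: weighted inputs compared against a threshold. The weights below are made up to echo his “saving time matters most” example:

```python
def pizza_neuron(saves_time, is_healthy, saves_money):
    """A single neuron: weighted sum of 0/1 inputs vs. a threshold.
    The weights encode how much each question matters to you."""
    weights = {"saves_time": 3.0, "is_healthy": 1.0, "saves_money": 2.0}
    score = (weights["saves_time"] * saves_time
             + weights["is_healthy"] * is_healthy
             + weights["saves_money"] * saves_money)
    return score >= 3.0  # order pizza if the weighted vote clears the bar

print(pizza_neuron(saves_time=1, is_healthy=0, saves_money=0))  # True
print(pizza_neuron(saves_time=0, is_healthy=1, saves_money=0))  # False
```

A full network is just many of these neurons stacked in layers, with the weights learned from data instead of set by hand.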

Alan Couzens  23:53

Yeah, I think that’s really the key: with a neural network, the number of factors it can weigh is so far beyond what we can do consciously. A lot of, I guess, old-school coaching is based on these gut feels, patterns we might have seen before that register with us; we see one again and it does something to our nervous system, so we roll with that. A neural network does a very similar thing. It’s just able to do those things much more systematically, and to weigh each of the variables in a way that truly leads to the best output based on the inputs you give it.

Trevor Connor  24:48

Let’s hear from ex-pro Stephen Hyde, who talked with us about this issue of having lots of great data but limited ability to interpret it.

Stephen Hyde  24:56

I think that what we have right now in terms of tools, in terms of interface, works really well at getting some kind of direction and communication across. But does it tell the whole story? I don’t think so. There’s so much left off the table when we’re just looking at metrics, whether that’s, you know, TSS or CTL, or if we’re looking at HRV: acronym, acronym, charts across the board. I think we can look at those things in isolation and see obvious trends, but we’re taking the human out of it all, and we’re looking at it all under this very, very tight microscope. In that way, I think we’re always going to miss the target. I have not found a tool that quantifies all of the numbers that come from all of the gadgets we have hooked up, together with how an athlete is feeling and what’s going to happen next with them. If we had some wild ability to predict the future, predict where someone’s headspace is going to be, then I think we’d really be onto something. But for right now, communication is my best tool. It’s asking questions. I’m like, yeah, that workout was great, but how are you feeling? Did you have a positive outcome from that? And where do you think this training should go? Yeah, you didn’t sleep very well, but do you feel like that’s going to lose this race for you? Or you slept great; do you feel like you’re going to win the race? For now, there’s a lot left to be desired.

Rob Pickels  26:32

It’s November: the air is crisp, the leaves are falling, and I get to take a break from riding my bike. Now is a great time of year to rest and reflect on the past season. Visit Fast Talk Labs and take a look at our pathways on recovery and data analysis. These two in-depth guides can help you get the most from your offseason. See more at fasttalklabs.com/pathways.

Rob Pickels  27:01

You know, Alan, I’d love to steer this conversation toward you as someone who has actually been doing this, right, and take it out of the theoretical and really talk about some of the practical applications. How have you specifically implemented this with some of your athletes?

Alan Couzens  27:19

Yeah. So, going back to that whole process of making decisions, and hopefully making better decisions than what other coaches are making, I do utilize these neural networks to run predictions. So rather than making the call in my head, are we going to do an interval workout today because my gut says this athlete’s ready for an interval workout, I have these neural networks set up that actually run these algorithms forward to predict what will happen if we do this. And they predict it in two ways. One is, will it make the athlete better? So, is the predicted performance likely to be better? And the other is, will it make the athlete excessively fatigued, or at risk of injury or illness? So I’m constantly running these two networks against each other to come up with, okay, what looks like a good decision today, something that will make the athlete better without exceeding this given fatigue threshold that I’ve given the network.
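Alan’s two-networks-in-opposition setup can be sketched as a constrained search. The two predictor functions here are trivial stand-ins for his trained networks, with made-up curves (diminishing returns for the gain, faster-growing fatigue):

```python
def predicted_gain(workout_load):
    """Stand-in for the performance network: diminishing returns."""
    return workout_load ** 0.5

def predicted_fatigue(workout_load):
    """Stand-in for the fatigue/injury-risk network: grows faster than gain."""
    return 0.02 * workout_load ** 1.5

def pick_todays_workout(candidates, fatigue_cap):
    """Choose the candidate load with the best predicted gain whose
    predicted fatigue stays under the coach's threshold."""
    safe = [w for w in candidates if predicted_fatigue(w) <= fatigue_cap]
    return max(safe, key=predicted_gain) if safe else 0  # 0 = rest day

loads = [0, 40, 80, 120, 160]  # candidate workouts in TSS-like units
print(pick_todays_workout(loads, fatigue_cap=20.0))
```

The structure is the point: one model scores the upside of each candidate workout, the other vetoes anything that crosses the coach’s fatigue threshold, and the decision falls out of running them against each other.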

Rob Pickels  28:35

And Alan, if I remember right from your prolific tweeting and your writing, these models were trained with historical data from these specific athletes, right? They’ve been in this situation before, they’ve had various training loads and inputs, and then you were able to see, yes, their performance improved, or it didn’t improve. When you create this model, is it all athletes training one model that then gets applied to, again, all athletes? Or is it that Rob Pickels as an athlete has his model based on his historical data, Trevor Connor as an athlete has his model, and so on and so forth?

Alan Couzens  29:12

One of the neat things about neural networks is that you can incorporate both of those things, because the networks have layers; they have multiple levels to them. You can have a group model that’s been trained on a whole bunch of people for the first couple of layers of the network, which gives a general sense of what’s likely to happen. And then for the final layer, or the second-to-last layer, you can have data that’s specific to the athlete. So depending on how much data we have to work with, we could have a model that is primarily coming from the group, if it’s an athlete who doesn’t have a lot of data, or we can have a model that’s very tailored to an athlete, if it’s someone who comes with, you know, 10 years of TrainingPeaks files and that sort of thing. It’s one of the really cool aspects of neural networks, this transfer learning, as it’s called, where you can tweak group models to better serve individuals.
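A bare-bones sketch of the transfer-learning idea Alan describes: treat the group model as frozen and fit only a small athlete-specific correction on that individual’s history. The group model, the numbers, and the one-parameter “final layer” here are all illustrative inventions, far simpler than a real network:

```python
def group_model(weekly_hours):
    """Pretend this is a network trained on many athletes:
    a generic performance score from weekly training hours."""
    return 150 + 10 * weekly_hours

def fine_tune(athlete_history):
    """'Freeze' the group layers and fit only a final bias term on the
    individual's (hours, measured score) history."""
    residuals = [score - group_model(h) for h, score in athlete_history]
    bias = sum(residuals) / len(residuals)
    return lambda hours: group_model(hours) + bias

# This athlete consistently outperforms the group prediction by ~10 points.
history = [(8, 238), (10, 262), (12, 280)]
mine = fine_tune(history)
print(round(mine(11)))
```

With little personal data, the bias stays near zero and the group model dominates; with years of files, the correction (in a real network, whole fine-tuned layers) tailors the predictions to the individual.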

Trevor Connor  30:20

I'm wondering, as you're describing all these different layers and these different inputs, is there a danger of over-analysis? Can you get too complex?

Alan Couzens  30:30

There's definitely a danger of overfitting. Overfitting is kind of the biggest problem with these neural networks. Neural networks, as I said, are wonderful at approximating any function: they can approximate diminishing returns, they can approximate your novice who improves almost linearly no matter what, they can approximate the overtrained athlete who actually gets worse as the load goes up. But because they have so much flexibility, they can start to see patterns that aren't really there, because they're just fitting to the training data that we give them. So the trick with these complicated models is not to get too excited when we see that it's fitting the training data perfectly, but to actually test it on data that it hasn't seen, and base how excited we are on how it does on that test data. I think that's the big watch-out. And if you don't have a lot of data, then oftentimes, because this model is so complex, it won't perform as well as a simple model. So it's definitely something to keep an eye on, to make sure that you're not overfitting the model to the one athlete.
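A small sketch of that overfit-versus-test-data point, using polynomial fits as stand-ins for models of different flexibility (the data is synthetic, not athlete data): the flexible model looks better on the data it has seen, while held-out data exposes the overfit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Twelve noisy "observations" of a diminishing-returns curve
x_train = np.linspace(0, 1, 12)
y_train = np.sqrt(x_train) + 0.1 * rng.normal(size=12)

# Held-out data the models have never seen
x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sqrt(x_test)

def fit_and_score(degree):
    """Fit a polynomial of the given flexibility; return (train, test) MSE."""
    coefs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    return train_mse, test_mse

simple_train, simple_test = fit_and_score(2)    # a simple, stiff model
complex_train, complex_test = fit_and_score(9)  # a very flexible model

# The flexible model hugs the training data more closely, but the test
# set is what tells us whether it found signal or just memorized noise.
print(simple_train, simple_test)
print(complex_train, complex_test)
```

The same logic applies to the neural networks discussed here: excitement should be based on the test-set score, not the training fit.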

Rob Pickels  31:51

Another question that sort of follows up with that, that I've been wondering, is how much data is needed, both in terms of, say, historical length of data, but also in terms of number of inputs? Let's just say that you only have power. Is that enough? Or do we need power and heart rate? Or power, heart rate, and how the athlete felt? Or power, heart rate, and heart rate variability? At what point is it too much or not enough data for the neural network?

Alan Couzens  32:23

Yeah, those two things kind of weigh against each other. If we have a whole lot of data from an athlete, then we can use a whole lot of parameters as well. And that's the cool situation, when we have an athlete that comes in with a bunch of data from all different channels: they've got heart rate data, they've got heart rate variability data, they've got their power data, maybe they have temperature data, now that sensors like CORE are coming out with some of that sort of stuff as well. So when we have the luxury of a lot of data, we can throw a lot of parameters into the model, and it won't overfit; it will come out with a good, useful model. If, on the other hand, we don't have a lot of data and we try to incorporate a whole lot of parameters, then the model will overfit. When we test it on the test set, it won't give us very good model performance. So it's something you always have to weigh up, making sure that you're not getting too greedy with how many things you want to include, depending on how much data you have.

Rob Pickels  33:34

And Alan, real quick, you've used the term parameters a couple of times. Can we just clearly define exactly what that is? Because I think that I know, but I want to make sure.

Alan Couzens  33:43

Yeah, so these are the input features. If we have a model that we want to predict performance, and we give it the athlete's heart rate variability, that would be one parameter, or one feature. And if we give it how much training load the athlete did yesterday, that would be another parameter or feature. So those are the inputs to the model.
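In code terms, the features Couzens mentions (heart rate variability, yesterday's training load) might be represented like this; the names and numbers below are hypothetical, not from his system.

```python
# Hypothetical feature set for one day of an athlete's history; each entry
# is one "parameter", or input feature, in the sense just described.
features = {
    "hrv_rmssd_ms": 68.0,        # morning heart rate variability
    "yesterday_load_tss": 95.0,  # yesterday's training load
    "sleep_hours": 7.5,
}

x = list(features.values())  # the network just sees a numeric vector
print(x)
```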

Rob Pickels  34:05

Okay, great. So that is what I thought. And then there is a weighting that gets applied to those as well, correct? Is that something that you're doing manually, or is that something that the neural network creates through the process?

Alan Couzens  34:20

Yeah, that's something that the network does all on its own. And that's really the magic of these things. We as coaches might have certain ideas of how things should be weighted, and then you give the network the data and it comes back with a completely different analysis of what should be weighted highly and what shouldn't. That's really what made neural networks take off. In the beginning, there wasn't a way to tweak these weights: the AI researchers worked out a way to build a predictive network, but if an output was wrong, they had to tweak all of these things manually to try to make it more right. Obviously, that's not a practical thing to do in any sort of real-world context. But this guy named Geoff Hinton came up with applying what's called backpropagation to the network. At the beginning, a neural network just makes random predictions. It comes up with these weights with no rhyme or reason to why it's applying them to different nodes, and it comes up with a prediction. Then, depending on how wrong that prediction is from the known output, from the truth, it goes back through each layer and each node of the network and tweaks each node to make the prediction a little bit less wrong. That's the backward pass. So now it's got a network that's a little bit better, and it just repeats that over and over again. It makes the forward pass, figures out how far off the network is from the truth, goes back through with this backpropagation, tweaks all the little parameters a little bit more. And by the end of things, it's like tuning a guitar: you've got something that actually sounds pretty good and works pretty well.
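The loop described above (random weights, forward pass, measure the error, nudge every weight to be a little less wrong) can be sketched for a single linear layer; full backpropagation chains this same rule through every layer and node. The data and learning rate here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented data: three input features and a known "truth" to recover
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w

w = rng.normal(size=3)                 # random starting weights
for _ in range(300):
    pred = X @ w                       # forward pass: make a prediction
    err = pred - y                     # how wrong was it?
    w -= 0.1 * (X.T @ err) / len(y)    # nudge each weight a bit less wrong

print(np.round(w, 2))                  # close to [2.0, -1.0, 0.5]
```

Repeated forward passes and small corrections pull the random starting weights toward the truth, which is the guitar-tuning picture in miniature.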

Trevor Connor  36:31

So I love the example used in one of your articles, which is you feed one of these neural networks a bunch of pictures and have it try to figure out which ones are cats. And you make no effort whatsoever to teach the neural network what a cat is. It just randomly picks, and then you go, okay, that's a tree, you're way off on that one; that's a dog, you might be close. And then it goes back and adjusts its weighting, and it keeps doing this and doing this until it can very accurately say that picture is a cat, that picture isn't. But it learns it on its own. You don't teach it.

Alan Couzens  37:04

It figures it all out. And with some of these networks you can visualize what it considers important in the different nodes. You might get, okay, here's a pointy-ears node: if this thing has pointy ears, that node lights up and it's likely to be a cat. Here's a snub-nose node, here's a long-tail node, and you get all of these things that the network has figured out are important to determining if something is a cat or not. It's the same thing in our case. Maybe one node is zone one training and one node is zone two training, and it figures out, okay, when the zone one node lights up, this athlete does well; when the zone three node lights up, that athlete does better. It's a similar sort of thing.

Trevor Connor  37:57

What's neat is you're not telling it what's important; it can figure out what's important. So it might discover things you didn't know. Like, for example, hey, this athlete doesn't train well when they go out in the morning; they train well when they go out in the afternoon, if that happens to be one of the inputs. It can be quite creative and unique in figuring out what actually has an influence on the athlete.

Alan Couzens  38:18

Exactly. I mean, that's the fun thing. Once you have the network, if you have enough data from that athlete, you can really throw in whatever parameter you want and see if it improves the output or not. And sometimes, as you said, it's things that you wouldn't really consider as a coach, things that don't necessarily line up with what should be important based on what you learned in school, and the network is actually teaching you some things that we might not have known.

Rob Pickels  38:55

I think this is really interesting. We oftentimes talk about how coaching can actually lead researchers in terms of knowledge, and this is an area where I think we really might be able to unlock some knowledge and some information. One of the criticisms of research (and I fully understand why) is that it's essentially performed in a vacuum, right? You're trying to limit as many variables as possible so that you can really see the signal within the noise and answer the question: does training in the morning or the afternoon lead to improved gains? It might answer that question, but it takes out all of the other variables that would go into making it practical, the things that are changing or that are just going to happen in the athlete's real, practical life. Well, the athlete's stress goes up when they train in the morning, because they're trying to get to work and their kids are crying and whatever else, so their adaptation is lower, even though when you removed everything else, the research paper said it was better.

Alan Couzens  40:02

Yeah, I think that machine learning on the whole has a lot to offer, just by virtue of the fact that you don't need to control as much. We're not dealing with parametric methods where we need to know everything that's going on in order to come up with a valid conclusion. We can use the power of the data that we have, and we can use some of these nonparametric, bendy approaches to better answer some of these questions of how these things combine together. And there's only so much you can learn when you control every variable, because that doesn't happen in the real world. It's not a lab situation where we have complete control over whether the athlete is sleeping from 10 at night till 6 in the morning every day. That's not something we can control; it's something we have to deal with. We have to see, when this changes, how do the other parameters that we're interested in change as well. And I think machine learning offers a lot of advantage over traditional statistical methods when it comes to that.

Trevor Connor  41:19

So I'm actually going to reference an episode that we did a while ago on the Xert software with its founder. We talked about machine learning in that episode, and Armando said that there are stages in how we address the data, and I think you've talked about this as well. Right now we live in a descriptive era, meaning your training software gives you a picture of where you are currently at. It describes where you are, but doesn't go beyond that. The next step is predictive: if you train this way, you should end up here; if you train that way, you should end up there. And the ultimate level is prescriptive, which it sounds like these neural networks get into, which is it takes in all this information and says, here's what you should be doing right now. So the question I have for you: if these neural networks prove very effective at this, is this eliminating the coach? Is there still a role for the coach, or only if they're a computer programmer?

Rob Pickels  42:23

I think

Alan Couzens  42:24

There's a leap between knowing what the right action is and taking that right action. When I think back to kind of phase one of my coaching career at the Institute of Sport, being on deck with the athletes, the prescription side of things was something that the coaches were obviously doing as part of their time; they spent a lot of time in their office coming up with the training plan. But then there was the cultural aspect of actually getting the athletes to do the plan, and of setting the tone for the environment as well. I'm hopeful that the more we can delegate to the machines in terms of the actual if-this-then-that, the more we can start to get some of that back in coaching, and we can start to get more of a humanistic approach to coaching, which I think is really lacking and is something that remote coaching has kind of taken away. And I think there will always be a role for that, because ultimately there's something special about putting together an environment of excellence and being present in that environment. The more that we can devote our time to that, and the less we devote to crunching numbers, the better we will be as coaches.

Rob Pickels  44:02

I'd love to use that to segue into my next question. Alan, is there a situation or a time where the neural network is just wrong? Let's say garbage in, garbage out. Does that happen with neural networks? Is there a case where the neural network's not to be trusted, or can make a mistake? Or do these things sort of always lead to perfection, like people kind of assume computers and technology do?

Alan Couzens  44:27

I mean, there are definitely problems. Going back to the example of Go and reinforcement learning, one of the researchers said that every so often in the game, the computer will hallucinate a completely different scenario to what's actually playing out. And the reality is that because we're coming up with these approximations of different states, like, we see one particular position on the board and it's similar to what we might have seen before, we make these leaps, and that leaves some wiggle room for the machine to get it very wrong. So there are definitely situations where, because the machine hasn't seen this particular thing often enough, or it hasn't dealt with this particular scenario before, it will come up with some very odd predictions, or very odd recommendations if we're moving into sort of a prescriptive approach. So at the moment, there's definitely a need for there to be a driver in the car, sort of thing. I wouldn't trust an athlete to just go off and do every single thing that the machine said, because it could kill someone if it gave the wrong advice on one of those leaps.

Rob Pickels  45:50

Yeah, Alan, as you were explaining that, it jogged another question for me. If we look at aerodynamic bikes, if we look at, say, Formula One cars, oftentimes the solutions end up being very similar to each other, because it's the same program that's creating this aerodynamic optimization. If we take that concept, has that happened with neural networks? If 10 people set out and tried to make the neural network that we're talking about, does that spit out 10 very, very similar, if not identical, answers? Or is that 10 very unique answers, maybe because of the inputs, or the genesis of the person that was programming it?

Alan Couzens  46:32

Yeah, I think there's quite a lot of difference, depending on the physiology of the athlete more than anything else. The things that are important, the inputs, I think would probably be similar. It's important to know the various components of the training, it's important to know how much volume the athlete is doing in each intensity zone, it's important to know how much they're sleeping, all these sorts of features that we feed the network. But there are definitely differences in what the network spits out as the optimal course of action for different athletes. I've talked before about volume responders and intensity responders, and that's something that these networks are really good at distinguishing between. So you might get, for one athlete, that the zone one node lights up really strongly as something that's very important to performance, whereas for another athlete, you might get the zone three or four node lighting up as something that's very important to their performance. So I think the general architecture and the general features of the network are similar, but the way that the different neurons light up, and the output that it provides, is quite different for different athletes.

Trevor Connor  48:00

It sounds like it's going to be very effective at identifying outliers. This goes back to: in all scientific studies, it's always the bell-shaped curve, and you're looking for the mean. But there are people who are on the far edges of those bell-shaped curves, and if they read that study and follow its advice, it's actually not going to help them very much. It seems like this would be very good at saying, hey, you're not responding the way most people are, there's something different about you, and it could potentially start identifying what actually works for them.

Rob Pickels  48:33

Yeah, it's that you're Canadian. That's always it.

Alan Couzens  48:36

Yeah, I think that's the advantage. You think about what happens in a study and the type of results that you see: you see a scatterplot, and you see a straight line that goes through the scatterplot. If you're on that straight line, that's great. But with these more complex models, it doesn't have to be a straight line. It can be a wavy line, depending on where you are on different features, and that wavy line can come very, very close to your particular specifics as an athlete. That's where these things really shine. And it's super important, obviously. We are quite different as individuals: we have different muscle fiber types, we have different anatomies, different heights, different weights. All of these things come into play in determining what the optimal action for an athlete is. And yeah, I think there's huge advantage in the individualizing of things that comes from this approach.

Rob Pickels  49:42

Moving on. You know, I know nothing about programming. Trevor, do you? You're kind of a computer guy. Used to be. You have a video game system at your house; that's closer to computer programming than I have. How does somebody get started? I don't even know what language this is. Is this C++? I'll pull that one out of thin air.

Alan Couzens  50:04

No, the easiest way to get involved in this is definitely through Python. Python is kind of the machine learning language du jour, sort of thing. If you are interested in machine learning, then you're definitely going to find it easier with Python, just because there's a whole lot of libraries that have already been written around machine learning. I mean, it really is dead easy: in 20 lines of code, you can build a model if you're using Python, just because a lot of smart people have already written all the hard code, and you just have to sort of import it and plug and play. So yeah, I definitely recommend getting started with Python. And really, once you pass the basic levels of learning how to structure code within Python, it's not a big leap to start playing with machine learning. As I said, it's really just: you put in a line of code saying what you want each layer of the network to be and how many nodes you want it to have, you put in a layer saying what the output looks like, and it really does the rest. Then it's just a matter of trying different things and fiddling and seeing what scores well and what doesn't score well. It's kind of fun.
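As a hedged illustration of that plug-and-play point, here is roughly what a short Python model can look like using scikit-learn, one of several libraries where the hard code is already written (Couzens doesn't name a specific library here, and the training data below is synthetic).

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Synthetic data: inputs are [weekly hours, average intensity] (made up),
# target is an invented "fitness score" with a little noise
X = rng.uniform(0, 1, size=(300, 2))
y = np.sqrt(X[:, 0]) * (1 - 0.5 * X[:, 1]) + 0.02 * rng.normal(size=300)

# Two hidden layers; the library handles initialization, backpropagation,
# and optimization for us
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X[:250], y[:250])            # train on most of the data

score = model.score(X[250:], y[250:])  # R^2 on the held-out portion
print(round(score, 2))
```

The `hidden_layer_sizes` argument is the "line of code saying how many nodes you want" from the quote above; everything else is imported machinery.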

Rob Pickels  51:25

So what you’re saying is, I should just watch a YouTube video.

Alan Couzens  51:28

I wouldn't know; I didn't take that route. I bought some books and things. But I'm sure in this day and age there are probably some pretty good ones out there.

Trevor Connor  51:38

Are you aware of anybody who’s building training software right now that’s incorporating neural networks?

Alan Couzens  51:44

I was involved with a startup that was doing that, but since I left, I really don't know what they're doing with the software. I don't think that they were going that same route, just because it's hard to implement on a commercial scale. But no, not that I'm aware of. I think that's the challenge: it's very easy to come up with a basic if-then software, where you don't have to train on the individual, you don't have to do any of these things that cost a little bit of money and make things a little bit more complex. And most people at the current stage don't really know the difference. The company could say, we're doing some fancy things, we have all these algorithms, when really it could just be five lines of code and a bunch of if-then statements. So I think there's an opportunity there to do things a little bit more seriously. I'm just not sure it's really going to get going until people are like, this machine learning stuff is really cool.

Trevor Connor  52:51

Well, we're getting to the end of the episode here, and you're a first-timer on the show, so I'll explain this to you. We finish with one-minute take-homes, which is basically: you have one minute to summarize the most pertinent points for our listeners, or give that one single message that you really want everybody to take home from this. So we'll start with you. Let me know if you need a second to think about it. You're good? Then take it away. You have one minute. Nobody's timing you. Except we

Rob Pickels  53:21

are so quick.

Alan Couzens  53:23

I just want to bring a neural network in to help me summarize.

Rob Pickels  53:27

What should I say now?

Alan Couzens  53:31

Yeah, no, I think that we really hit on the importance of simplicity, and the challenges that we're facing right now with a whole lot of data and coaches being expected to process it. We need something to help us with that, and I think computers can do a really good job at weighing all of these variables and helping us to determine what's important. So I think that's step one: something has to give from the current paradigm that we're in, where coaches are just expected to have all of this stuff in their heads. And the other really key point, I think, is the importance of the individual. Rules can only describe so much, and they can only apply to so many people. It's just not fair, and it's not good coaching, to take the outliers, the people who are a little bit different from the norm, and try to fit them to the scatter chart with that single line. There's enough individual difference out there that we really need something that helps us better identify these athletes as individuals. I think those are the two areas where computers, and neural networks specifically, can really help.

Rob Pickels  54:56

Nice. For me, I think I've held for a long time that the current descriptors we have aren't doing the complete job, and that there's something left on the table. This really feels like a means to learn more about people, to understand, to take in all of the parameters. I mean, I could list 100 on the whiteboard and not be able to think about all of them; this is a means for us to be able to do that. But I also know my limitations, and I know that I am not the person to do this. So I'm really happy that there are people like Alan, and listeners out there listening to this right now who are hopefully inspired. I know that I'm really excited for the day that this problem is solved, and potentially this is the solution. And, you know, maybe it's difficult to be commercial-ready, and maybe that's why we haven't seen it, but I'm positive that at some point in the future we'll be able to solve those problems too. This could really unlock a big aspect of coaching. Not replace coaching, but help coaches coach better. That's really, really interesting to me.

Trevor Connor  56:03

So I've got two take-homes. First one is, if we ever get that commercial version, I guarantee you it's going to discover that the biggest factor in my performance is having pizza. Which is another reason why I want pizza for dinner.

Rob Pickels  56:16

No, it's the two Ps: it's the pizza and the poutine.

Trevor Connor  56:20

If you survive the heart attack. My real take-home is, I'm going to go back to that analogy just a little bit, which is: we have seen a whole lot of athletes who have gotten caught up in the data trap, thinking the more data they take in, the more data they read, the better they're going to perform. And that to me has always been the trying-to-think-as-many-moves-ahead-as-possible approach. I love this discussion of neural networks, because they're modeled after the human brain, and that's how our brain is designed to function: to recognize the situation and be able to figure out the best path forward. That is why, on this show, we've tried to recommend to people: don't undervalue things like RPE. Don't undervalue recognizing how you feel, knowing when you should skip a workout, and not just trusting the data, looking at your TSS and going, oh, TSS is this, my CTL and ATL are this, so I should work out today, even though I'm dragging my feet. I think when we can actually have a computer built to do this work, think the way the human brain thinks, take in all that data that we're trying to read ourselves, and recognize those situations and the best path forward, then we're going to get into a really exciting age of how to effectively train and make the right decisions. Well, it was a pleasure having you on the show. Thanks so much for joining us.

Rob Pickels  57:56

Thank you, guys. That was another episode of Fast Talk. Subscribe to Fast Talk wherever you prefer to find your favorite podcasts. Be sure to leave us a rating and a review. The thoughts and opinions expressed on Fast Talk are those of the individual. As always, we love your feedback. Join the conversation at forums.fasttalklabs.com to discuss each and every episode. Become a member of Fast Talk Laboratories at fasttalklabs.com/join and become a part of our education and coaching community. For Alan Couzens, Ryan Bolton, Stephen Hyde, Lauren Vallee, and Trevor Connor, I'm Rob Pickels. Thanks for listening!

Related Episodes