Purpose: To investigate whether completing a higher percentage of preseason sessions affects the in-season injury profile in Division I-A American collegiate football, and whether the Bradford Factor (BF) is viable for practitioner use.
Methods: A retrospective research design was used. Training load and injury data were collected and analysed for 70 players across two collegiate American football seasons.
Results: A total of 184 injuries were sustained across the two seasons, 106 of which resulted in time loss (15.6 ± 5.4 time-loss injuries per 1000 h). On average, athletes completed 93 ± 17% of preseason sessions. An increase in minutes accumulated over the preceding 7 days increased the likelihood of injury in the following week by 35%. For non-contact time-loss injuries, every additional three preseason sessions completed was associated with a 2% reduction in injury likelihood. A high preseason BF (>7) increased injury risk throughout the in-season period compared with a low BF.
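For reference, the BF is conventionally calculated from the number of separate absence episodes (S) and the total days absent (D); it is assumed here, though not stated in the abstract, that the study applied this standard formulation to injury absences:

\[
\mathrm{BF} = S^{2} \times D
\]

For example, two separate absences totalling four days give BF = 2^2 × 4 = 16, which would exceed the >7 preseason threshold reported above.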
Conclusion: Preseason completion was not associated with a substantial reduction in in-season injury risk. However, a clear difference in BF between groups was evident and may provide a practical "flagging" variable; the BF may therefore offer a simple yet practically meaningful measure for monitoring adaptation.
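As a minimal illustration of the "flagging" use suggested above, the sketch below computes each athlete's BF under the conventional formulation and flags values above the >7 preseason threshold; the record structure, athlete identifiers, and function names are hypothetical, not from the study:

```python
# Hypothetical sketch: compute the Bradford Factor per athlete and flag
# values above the preseason threshold reported in this abstract (BF > 7).
# Assumes the conventional formulation BF = S^2 * D, where S is the number
# of separate injury-absence episodes and D is total days absent.

from dataclasses import dataclass

@dataclass
class AbsenceRecord:
    athlete: str
    episodes: int      # S: separate injury-absence episodes in the window
    days_absent: int   # D: total days absent across those episodes

def bradford_factor(episodes: int, days_absent: int) -> int:
    """Conventional Bradford Factor: S squared times D."""
    return episodes ** 2 * days_absent

def flag_high_bf(records: list[AbsenceRecord], threshold: int = 7) -> list[str]:
    """Return the athletes whose BF exceeds the threshold."""
    return [r.athlete for r in records
            if bradford_factor(r.episodes, r.days_absent) > threshold]

if __name__ == "__main__":
    squad = [
        AbsenceRecord("A01", episodes=1, days_absent=3),  # BF = 3, not flagged
        AbsenceRecord("A02", episodes=2, days_absent=4),  # BF = 16, flagged
    ]
    print(flag_high_bf(squad))  # ['A02']
```

Note how the squared episode count weights frequent short absences more heavily than a single long one, which is the property that makes the BF attractive as a simple monitoring flag.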