The purpose of this series is to discuss the use of analytics in sports organizations (see part 1). Rather than jump into a discussion of models, I want to start with something more fundamental: how organizations work and how people make decisions. Sophisticated statistics and detailed data are potentially of great value. However, if the organization or the decision maker is not interested in, or comfortable with, advanced statistics, then it really doesn’t matter how high the quality of the analyses is.
Analytics efforts can fail to deliver optimal value for a variety of reasons in almost any industry. The idea that we can use data to guide decisions is intuitively appealing. It seems like more data can only create more understanding and therefore better decisions. But going from this logic to improved decision making can be a difficult journey.
Difficulties can arise from a variety of sources. The organization may lack commitment in terms of time and resources. Individual decision makers may lack sufficient interest in, or understanding of, analytics. Sometimes the issue is a lack of vision as to what analytics is supposed to accomplish. There can also be a disconnect between the problems to be solved and the skills of the analytics group.
These challenges can be particularly significant in the sports industry because there is often no institutional history of using analytics. Organizations usually have existing approaches and structures for decision making, and incorporating new data structures or analytical techniques requires some sort of change. In the earliest stages, the shift toward analytics involves moving into uncharted territory. Decision makers are (implicitly) asked to alter how they operate, and this change may be driven by information derived from unfamiliar techniques.
Several key concerns can best be illustrated by considering two categories of analyses. The first category involves long-term projects that address repeated decisions. For instance, a common repeated decision is drafting players. Since a team drafts every year, it makes sense to assemble extensive data and to build high-quality predictive models to support annual player evaluation. This kind of organizational decision demands a consistent and committed approach. But the important point is that this type of decision may require years of investment before a team can harvest significant value.
It is also important to realize that with repeated tasks there will be an existing decision making structure in place. The key is to think about how the “analytics” add to or complement this structure rather than thinking of “analytics” as a new or replacement system (we will discuss why this is true in detail soon). The existing approach to scouting and drafting likely involves many people and multiple systems. The analytics elements need to be integrated rather than imposed.
The second category of analyses is short-term, one-off projects. These can be almost anything, ranging from questions about in-game strategy to very specific evaluations of player performance. These projects primarily demand flexibility. Someone in the organization may see or hear something that generates a question. That question then gets tossed to the analytics group (or person), and a quick turnaround is required.
Since these questions can come from anywhere, the analytics function may struggle even to have the right data, or to have the data in an accessible format. Given the time-sensitive nature of these requests, there will likely be a need to use flawed data or imperfect methods. The organization needs to be realistic about what is possible in the short term and, more critically, the analysis needs to be understood at a level where the human decision maker can adjust for any shortcomings (and there are always shortcomings). In other words, the decision maker needs to understand the limitations of a given analysis so that the analytics inform rather than mislead.
The preceding two classes of problems highlight issues that arise when an organization starts down the path toward being more analytically driven. In addition, there can be problems caused by inexperienced analysts. For example, what many analysts (particularly those coming from academia) fail to grasp is that problems are seldom solved through the creation of an ideal statistic or equation. Decision making in organizations is often driven by short-term challenges (putting out fires). Decision support capabilities need to be designed to support fast-moving, dynamic organizations rather than to perfectly and permanently solve well-defined problems.
In the next entry, we will take a more in-depth look at how analytics and human decision making can work together. We will discuss the relative merits of human decision making versus statistical models. After that, we will get into a more psychological topic – decision making biases.
Part 2 Key Takeaways…
- The key decision makers need to be committed to and interested in analytics.
- Sufficient investment in people and data is a necessary condition.
- Many projects require a long-term commitment. It may be necessary to invest in multiyear database building efforts before value can be obtained.