"Analytics". We know that it is more than just a buzzword but too often we cannot realize it's promise. Why is it that all the budget and time we dedicate towards predicative analytics, BI, big data, etc does not always yield tangible benefits to the organization?
There could be many reasons, but I would argue it is not necessarily because there is anything wrong with your analytics per se. You could be using the right tools and have staff with the right skills, yet still end up with not much to show at the end of the day. So why the gap between promise and reality?
Effective = Good Enough

First of all, how do we define successful analytics? The same way we define success for any business process: by focusing on its contribution to the overall objectives of the organization.
ef·fec·tive·ness /iˈfektivnəs/, noun: the degree to which something is successful in producing the intended or desired result
Thus the goal for any kind of internal investment should be that it is effective. In the case of the analytics function, its contribution to overall organizational success is to use data to inform effective decisions so that the business/program side can drive the whole organization to success.
Since budgets, time, and capacity are all finite resources, this inherently implies a target of "good enough".
All too often individual team members get wrapped up in doing one thing perfectly and sometimes lose sight of the overall problem they are trying to solve.
But any improvement which doesn't help achieve a specific desired business outcome is likely taking resources from one that does. Hence "effective" and not "correct".
It is rare that the solution which is completely correct from a technical perspective is also the most effective, because you inevitably run into diminishing returns as you asymptotically approach perfection.
Planned compromise is the key to effectiveness: some things need to be done right but most things just need to not be done wrong. The key is identifying which is which.
Garbage in is Garbage out
So effective business outcomes are driven by effective analytics. Pretty straightforward. But how do we make analytics effective?
There are many resources out there for how to build an effective analytics engine in your organization, so that is not what I'm going to focus on.
Even with a finely tuned analytics machine, you could still end up with useless results for a very obvious reason: crappy data.
If you build a high-performance engine and then put skunky data in the gas tank, you will get poor results: sub-optimal business outcomes resulting from ineffective analytics due to ineffective data.
The normal approach is to just dump the data and have your analytics resources spend a great deal of time and manual effort cleaning it before ingesting it into the analytics suite (if Excel's VLOOKUP function were a stock, it would outperform any investment).
I constantly encounter new analysts who are shocked at the amount of time it takes to stage/process data before they get to the fun part.
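To make that staging work repeatable rather than ad hoc, the VLOOKUP-style matching and cleanup can be captured as a scripted step. Here is a minimal sketch in Python with pandas; the column names and cleanup rules are hypothetical examples, not a prescription:

```python
import pandas as pd

# Hypothetical raw export and lookup table -- the kind of matching that
# often gets done by hand with VLOOKUP in a spreadsheet.
raw = pd.DataFrame({
    "account_id": ["A1", "A2", "a1 ", None, "A3"],
    "amount": ["100", "250", "100", "75", "bad"],
})
accounts = pd.DataFrame({
    "account_id": ["A1", "A2", "A3"],
    "region": ["East", "West", "East"],
})

def stage(raw: pd.DataFrame, accounts: pd.DataFrame) -> pd.DataFrame:
    """Repeatable staging step: normalize keys, coerce types, then join."""
    df = raw.copy()
    # Normalize the join key (trim whitespace, unify case).
    df["account_id"] = df["account_id"].str.strip().str.upper()
    # Coerce amounts to numbers; unparseable values become NaN.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Drop rows that cannot be used downstream.
    df = df.dropna(subset=["account_id", "amount"])
    # The VLOOKUP equivalent: a left join against the lookup table.
    return df.merge(accounts, on="account_id", how="left")

clean = stage(raw, accounts)
print(clean)
```

Because the rules live in one function instead of a spreadsheet, the same cleaning runs identically on next month's export, which is exactly the scalability and repeatability the manual approach lacks.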
Data Effectiveness

The key realization behind data effectiveness is that data analytics folks are not necessarily database technologists.
Meaning that someone who knows analytics well doesn't necessarily understand the nuts and bolts of how the data they use is stored or captured, and as a result the pre-processing and data cleaning is not very scalable, repeatable, or efficient.
In many organizations the data reporting function, i.e. the folks pulling data from the actual systems, is separate from the analytics function (which is often distributed throughout the organization rather than centralized).
Data effectiveness means formalizing the responsibility for data processing and staging back into the reporting/data-systems world so that the analytics folks can focus on statistical analysis, machine learning, and so on.
The outcome of reporting is to support the goals of analytics, and the outcome of analytics is to support the goals of the business. So organizational success comes from effective analytics, which comes from effective data. Task prioritization, though, should run in reverse: start from the business outcome and work backwards to the analytic and technical work that directly supports it. Hence, it really looks like: business outcome → effective analytics → effective data.
Data Quality

One good example concerns data quality. The reality is that there is not just one type of data "fuel" because there is not one analytics "engine".
Meaning that not all reporting needs to be perfectly accurate. Perfectly consistent, yes. Perfectly accurate, no (I'll go into that in more detail in another post).
The output of your data effectiveness "refinery" is the data equivalent of everything from crude asphalt to heavy lubricating oils to the highest octane racing fuels: not all data needs to go through the same level of processing/cleaning.
Being effective means taking a pragmatic, systemic approach which focuses on what is important to fuel the particular analytics engine for each business workstream.
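One way to make that pragmatism concrete is to define explicit quality tiers, where each workstream only pays for the checks its "engine" actually needs. The tier names and rules below are illustrative assumptions, not a standard:

```python
# Hypothetical sketch: not every dataset needs the same "octane".
# An operational dashboard may only need consistent keys, while a
# financial report needs the full refinery. Tiers and checks here
# are made-up examples of the idea, not a prescribed taxonomy.

LOW, MEDIUM, HIGH = "low", "medium", "high"

CHECKS = {
    LOW:    ["keys_present"],                           # consistent, not exact
    MEDIUM: ["keys_present", "types_valid"],            # plus basic hygiene
    HIGH:   ["keys_present", "types_valid",
             "totals_reconcile"],                       # fully accurate
}

def checks_for(tier: str) -> list[str]:
    """Return the processing steps a dataset must pass for its tier."""
    return CHECKS[tier]

# A quick ops dashboard gets the cheap checks...
print(checks_for(LOW))
# ...while finance-grade reporting pays for reconciliation.
print(checks_for(HIGH))
```

The point of the structure is that the cost of cleaning is decided per business workstream, up front, rather than every dataset being refined to racing-fuel grade by default.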
Final Thoughts

I will revisit this topic in future posts to give more immediately useful examples, but fundamentally the point is this: keep your ideal end state in mind while making decisions based on your current reality. Meet your data where it is today to get to where you need to be tomorrow.
Effectiveness is measured strictly by how well something supports a desired outcome, and there is no need to over-refine your data or over-engineer your analytics if the particular business goal doesn't require it.