
7 Common Data Analysis Errors to Avoid

By Jessica
September 10, 2020

To gain insights from the abundance of data your team has, you have to analyze it, and analyzing data is hard. Done right, the insights your team mines could be worth millions; done wrong, bad insights could cost you millions. In this video, we go through the top seven reasons people derive bad insights from data. Understanding these common errors will help you identify where they are most likely to occur in your organization.

Transcript

Hello! Today I want to talk about the seven most common data analysis errors that we see people make all the time. These errors are pretty easy to make, but they're also pretty easy to catch if you're aware that they could be happening within your organization. So today we really just want to draw awareness to these common errors and give you ways to help prevent them in your organization. Before we start: having a lot of data by itself isn't going to drive higher ROI or higher revenue for your organization. I think this is a common misconception that a lot of people have. For that data to actually drive higher revenue or higher ROI, you have to analyze it and put those insights to use. You have to apply those insights. A lot of companies think the more data they have, the higher their ROI and revenue will be, and that's simply not the case. You have to do the analysis, and you have to actually apply the insights you've gathered from that data.

Error number one, which we see people make all the time: they have all of this data, but they get extremely overwhelmed by it and don't know where to start. Like I said, we've seen lots of companies with lots and lots of data make poor decisions because they analyzed it incorrectly, and that can scare a lot of people off. But the fact of the matter is, you will get better at analyzing data the more you do it, so the key is just to start. Before you start, we actually recommend implementing a measurement plan, so you know exactly which KPIs you're looking for and which ones are really going to move the needle for your revenue, your ROI, or whatever your goal is. That's where we want to start analyzing. Then we can filter in some additional metrics and get context for why certain things are happening, but really, the key is just to start. You can even start small, crawling before you run by analyzing little sections of data here and there, and then scale up once you've hit the ground running. A rough sketch of what a measurement plan might look like is shown below.
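
To make that concrete, here is a minimal, hypothetical sketch of how a measurement plan could be written down in code so every analysis pulls from the same agreed-upon KPIs. The objectives, metric names, and cadence below are illustrative placeholders, not anything prescribed in the video.

```python
# Hypothetical measurement plan: each business objective maps to the KPIs
# that move the needle, plus supporting metrics used only for context.
MEASUREMENT_PLAN = {
    "grow_online_revenue": {
        "kpis": ["transactions", "revenue", "conversion_rate"],
        "context_metrics": ["sessions", "avg_order_value"],
        "reporting_cadence": "monthly",
    },
    "increase_qualified_leads": {
        "kpis": ["form_submissions", "cost_per_lead"],
        "context_metrics": ["clicks", "impressions"],
        "reporting_cadence": "monthly",
    },
}

def metrics_to_report(objective: str) -> list[str]:
    """Return the fixed set of metrics to pull for a given objective."""
    plan = MEASUREMENT_PLAN[objective]
    return plan["kpis"] + plan["context_metrics"]

print(metrics_to_report("grow_online_revenue"))
```

Writing the plan down this way, even informally, makes it harder to wander off to whichever metric happens to look interesting that week.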

Error number two: we come in with inaccurate assumptions and we never root them out. It's human nature to assume things; we all do it. The key is to recognize that we're doing it and go back through the who, what, where, why, and how. Where did we get this information, and is it a reliable source? We want to root out every single one of those assumptions before we really start to analyze anything. One way we help prevent that here at Anvil is by having question meetings. Before we start any kind of big analysis project, we all come together in one room and simply ask questions, and this helps us root out those assumptions. In these meetings nobody is allowed to spit out statistics or anything like that; it's just asking questions. From there we go back to our desks, really find the answers, and start from scratch on some of those questions.

Error number three: a lot of people will find a data point here or there, and they may even report it up to their CMO or whoever, but they don't add much additional context to it. You can really get lost that way; like in the picture on screen, you only see the trees and not the bigger picture of the forest. We really want to get as much context as we can, because in Google Analytics, for instance, I can pull five metrics and my colleague can pull five metrics, and they may tell completely different stories depending on which five metrics we each pulled. So we want to make sure we have all of that data and that we're also looking at it in a big-picture way.

Error number four, which we like to call cherry picking: picking the metrics that look best in your report. Nobody wants to report bad results, that's not fun for anybody, and honestly we sometimes do this subconsciously without even realizing it. One thing to keep in mind, going back to the measurement plan we recommended creating before you start analyzing anything: it should set the metrics you're going to analyze month over month. If we report on clicks one month, impressions the next, and conversions the month after that, we never really get a sense of how we're improving. We see different metrics here and there, and maybe one looks good this month and another looked good last month, but how are they trending over time? We lose that when we aren't looking at the same metrics month over month, or whatever time frame you're using. A small sketch of that kind of consistent report is below.
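
As a quick, hypothetical illustration of what reporting the same metric set every month looks like in practice, here is a short pandas sketch; the monthly numbers are made up purely for demonstration.

```python
import pandas as pd

# Hypothetical monthly export; the numbers are made up for illustration.
monthly = pd.DataFrame(
    {
        "month": ["2020-06", "2020-07", "2020-08"],
        "clicks": [12000, 13500, 12900],
        "impressions": [250000, 240000, 265000],
        "conversions": [310, 355, 340],
    }
).set_index("month")

# The measurement plan fixes which metrics appear in every report,
# so month-over-month trends stay comparable.
REPORT_METRICS = ["clicks", "impressions", "conversions"]

trend = monthly[REPORT_METRICS].pct_change().mul(100).round(1)
print(trend)  # % change vs. the prior month, same metric set every time
```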

And then error number five: if you're reporting consistently, we want to make sure we're reporting consistently on the right metrics. Once again, that means identifying the metrics that are really moving the needle, and a measurement plan will help with that. If we're reporting on the wrong metrics, we could very easily be led astray by them. So make sure you take a step back before you analyze anything and understand which metrics you really need to be tracking in order to pair your KPIs with your core business objectives.

Error number six: drawing conclusions from a small sample size. Whenever you're drawing conclusions, you have to take your sample size into consideration. If we get three out of five as the result of a test that we ran, okay, that's 60 percent, but take it for what it's worth. If we run that same experiment and get 60 out of 100, that's a little more reliable; even though the percentage is the same, the larger sample carries greater statistical significance, and statistical significance is really what we're aiming for here. I won't go into a lot of detail about what statistical significance is; we have another video on that, so I'd encourage you to watch it if you have any questions. But making sure we have a large enough sample to draw conclusions and make accurate inferences about the population is extremely important, as the rough sketch below shows.
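
As a back-of-the-envelope illustration of why 3 out of 5 and 60 out of 100 are not equally trustworthy, here is a minimal sketch that puts a rough 95% confidence interval around each proportion using the normal approximation. The helper name wald_ci is just for this example and is not from the video.

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Rough 95% confidence interval for a proportion (normal approximation).
    With very small n this approximation is itself shaky, which is part of
    the point: small samples give you very little to stand on."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return (p - half_width, p + half_width)

# Same observed rate (60%), very different uncertainty.
print(wald_ci(3, 5))     # roughly (0.17, 1.03): almost no information
print(wald_ci(60, 100))  # roughly (0.50, 0.70): much tighter
```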

And then error number seven, which we see all the time: confusing causation and correlation, and trusting business intelligence tools that surface correlations between two things that line up by mere coincidence. To start with, correlation and causation are not the same thing. Two variables can be related and move together, in the same direction or in opposite directions, and be correlated without one causing the other. Just because two things are correlated does not mean that one causes the other; causation is a completely different thing, so we want to make sure we're not confusing those two terms. In the example on the screen right now, the divorce rate in Maine correlates with per capita consumption of margarine. We can safely assume that this is coincidental and that consumption of margarine doesn't actually impact the divorce rate in Maine. But a lot of times your business intelligence tools will give you similar correlations in your data, so you really need to go through them, apply your common sense and logic, and understand whether they're actionable or something you can disregard and move past, simply because there's no real way to apply them. Taking a step back and really understanding correlation versus causation is the seventh and final error we see a lot of people make all the time.
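
To show how easily a strong but meaningless correlation can appear, here is a tiny sketch using made-up yearly figures (not the actual Maine divorce or margarine data) for two quantities that simply both drift downward over the same period.

```python
import numpy as np

# Made-up yearly figures for two unrelated quantities (NOT the real data):
# both just happen to decline over the same ten years.
divorce_rate  = np.array([5.0, 4.7, 4.6, 4.4, 4.3, 4.1, 4.2, 4.2, 4.1, 4.1])
margarine_lbs = np.array([8.2, 7.0, 6.5, 5.3, 5.2, 4.0, 4.6, 4.5, 4.2, 3.7])

r = np.corrcoef(divorce_rate, margarine_lbs)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to 1.0, yet neither causes the other
```

A correlation coefficient near 1.0 says nothing by itself about whether one series drives the other; that judgment still comes from your own logic about how the two could plausibly be connected.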

If you've noticed other errors in your organization, definitely let us know in the comments. And if you want to keep receiving tips and tricks about Google Analytics, Data Studio, and data analysis, subscribe to our channel and to the newsletter on our website, where you'll receive updates about the content we push out weekly.

About the Author
Jessica joined the Anvil team after earning her degree in Marketing. She really enjoys getting to work on something different every day in her role, whether that’s helping with a Facebook project or solving an issue in Google Analytics. Her problem-solving prowess extends beyond work, too—she loves games and is her family’s pinochle champ.