Staring at the analytics section of an application wondering “What do I do with this?” can be anxiety-inducing. Leaders and teammates often want to know “What does the data tell us so we can make informed decisions?” But it’s not always that simple. So let’s take a step back and explore how to gain actionable insights from your data.
Using the scientific method for marketing
Since it seems every marketing solution includes an analytics tool (ours is called Insights), it’s tempting to just jump right into data analysis. But without a formal strategy for reviewing and interpreting data, the information you bring back to your team will lack insight and accuracy.
The scientific method is one proven approach to acquiring and understanding new knowledge. The main steps include: asking a question, making a hypothesis, and then testing that hypothesis. In this article, we’ll walk through these steps and how to use them in data analysis.
Form a question to focus your efforts
Although developing a plan for data analysis can feel like overkill, it will save you time in the long run by providing focus and direction. Before you even open the analytics app, write down the questions you’d like the data to answer.
Ensure that each question aligns with team or organizational goals. Ask yourself, “Will this answer help inform change?” Or, “Will my team thank me for this information?” Don’t be afraid to get specific with your questions. This will help eliminate time spent simply clicking around in the analytics tool, wondering what to look at first.
As a content team, we’re interested in understanding how our marketing resources perform. So the questions we seek to answer are ones that will help us achieve our content marketing goals.
Some questions to ask could include:
- Which marketing resources are buyers finding useful?
- What blog article generates the most new site visitors?
- What ad copy produces the most new leads for a specific resource?
Creating content is time-intensive. It requires coordination between writers and designers, along with alignment across promotions and campaigns. Because our marketing team wants to create resources that are valuable, questions like these will help provide insight into which content types and topics produce the engagement and results we set out to achieve.
Make a hypothesis to determine success
A hypothesis is a testable theory or idea that is not yet proven. The key here is that a hypothesis can either be proven or disproven. Using an “If _____, then ______” statement could be a helpful place to start.
Using desired project goals or key results to inform your hypothesis can help you identify factors that impacted your final numbers, and make your results more repeatable in the future. Including a hypothesis in your approach to data analysis will help get you better answers.
Any successful customer experience is the composite of numerous cohesive and memorable brand interactions. Let’s try to prove or disprove the impact of some of the details in these experiences by articulating some hypotheses:
- Personalizing an email subject line will increase open rates by at least 4%
- Changing a button color will increase click-through rates (CTRs) by at least 8%
- Asking a question about a customer experience pain point will increase resource downloads by at least 5%
Knowing which actions make the biggest impact with our specific audience will help inform the direction of our copy while also aligning with company objectives.
Test your hypothesis to validate results
Now it’s time to set up the experiment and put your hypothesis to the test. To achieve the most accurate results, you’ll want to change only one element, the test variable, at a time. Everything else, your control variables, should be held constant throughout the experiment so that any difference in results can be attributed to the change you made.
You’ll also want to allow plenty of time to conduct the experiment, to ensure you have enough data to achieve statistical significance.
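How much data is “enough” can be roughly estimated up front. The sketch below uses a standard two-proportion sample-size calculation to estimate how many visitors each variant needs; the baseline CTR and lift figures are hypothetical, and real tools may use slightly different formulas:

```python
import math

def sample_size_per_variant(baseline_rate, relative_lift,
                            z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect a relative
    lift in a conversion rate with a two-sided two-proportion z-test
    (defaults: 95% confidence, 80% power)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Hypothetical: a 5% baseline CTR, hoping to detect an 8% relative lift
n = sample_size_per_variant(0.05, 0.08)
print(n)  # visitors needed in EACH variant
```

Note how small lifts on small baseline rates demand tens of thousands of visitors per variant, which is why low-traffic pages need tests measured in weeks rather than days.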
Any piece of content can be influenced by numerous variables. Not every piece takes the same form, is created for the same audience, or is promoted equally. By identifying some key criteria for our data, we’re able to get more specific answers.
Taking one of the sample hypotheses from above, let’s look at how this could be tested.
Changing the “request a demo” button color will increase click-through rates (CTRs) by at least 8%
Version one will keep the button color the same as it has been — blue with white text.
Version two will change the button color to green while keeping the white text the same.
The duration of this test will depend on how much traffic the site receives, but in general, a test should run for at least two weeks to ensure the data is statistically significant.
Once the test has run for enough time, it’s time for analysis.
Analyze the data
Take a look at what happened in your experiment. Does the data support your hypothesis? Spend some time looking at the data with a critical eye. Look for patterns or unexpected results. It’s okay if you disprove your hypothesis. This new information can be used to refine, alter, or expand your hypothesis, or even form a completely new hypothesis.
With the button colors different but everything else on the landing page the same, the CTR for each page was about the same. Since this test ran for enough time to give our data statistical significance, we can conclude that the green and blue buttons perform similarly.
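One common way to check whether an observed CTR difference is larger than chance alone would explain is a two-proportion z-test. This is a minimal sketch with made-up click and visitor counts, not our actual experiment data:

```python
import math

def two_proportion_z_test(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-sided z-test comparing the CTRs of two variants.
    Returns the z statistic and the p-value."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled rate under the null hypothesis that the variants are equal
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: blue button vs. green button
z, p = two_proportion_z_test(250, 5000, 258, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With a p-value well above the usual 0.05 threshold, numbers like these would support the conclusion that the two buttons perform similarly.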
Communicate your findings
Not everyone will be as close to the data as you are. Determining a way to effectively share your findings with others is a crucial part of the experiment. The format of your report will also depend on the amount of time and effort that went into the experiment.
Visual representations of data are typically the easiest way to quickly convey results, but these representations can range from great works of art to simple charts and graphs. Understanding your audience and project expectations will help guide your communication. And remember, even if your hypothesis was disproved, there’s still valuable information to share.
Our experiment was relatively small and only for the marketing team. Since our audience understands these tests, it didn’t require a large presentation or formal charts. Instead, we presented the engagement numbers and CTRs for each version of the landing page during the experiment time frame and used that as a jumping-off point for our next hypothesis.
Tips for success
Be transparent with your data
Data can be an incredibly powerful tool to inform change. Make your data accessible to your teams. Create public dashboards to monitor stats, or consistently share metrics in meetings to raise awareness and get people asking questions. Let your team embrace data to generate more ideas and interpretations.
Save dashboards for future reference
Refer back to this data to see how it’s affected by the changes you make. Did your changes have the impact you intended? The best way to know is by looking at your new data with the same criteria.
You can’t measure and evaluate everything
Before you jump into what to measure and how, remember that data has its limitations. Ed Catmull, the author of Creativity, Inc., says “Measure what you can, evaluate what you measure, and appreciate that you cannot measure the vast majority of what you do.”
Starting your own experiment
Admittedly, a lot of what we do cannot be measured. And in marketing, we’re also up against time constraints. So we need to find the right balance. Just because software offers tools to measure something doesn’t mean those metrics are important to you and your objectives.
Monitoring your digital asset management (DAM) system data can help inform marketing decisions or save thousands of dollars. We want to help you refine your approach to data analysis, so you successfully evaluate what you can and keep moving forward. If you want to check out our DAM solution and analytics app, Insights, explore our sample DAM site today.
Note: This article was originally published in January 2017 and has been updated to remain current.