Sam Decker

7 Steps to a Data-Driven Culture


The majority of brands have software to test, personalize and use data. But as we learned in our recent study, readiness to achieve data-driven outcomes still lags. The gap to real impact is the difference between your technical and data capabilities and the people, processes and skills needed to fulfill their promise. These cultural headwinds may be working against you.


Before founding Clearhead we were ecommerce executives, and we experienced these headwinds first-hand. Now, after working with more than 60 clients on their digital journeys over the last four years, we’ve learned some tricks to accelerate a data-driven culture.

Here are seven we’d like to share with you:


1: Product and UX Should “Own” Experimentation

2: Invest Equally in Experimentation Fluency

3: Practice is the Thing

4: Roadmap from Problems

5: Story Before Stats

6: Plan on Personalization Phases

7: Combine the “What” and the “Why”


Read on to learn more about how each of these can help you guide your organization toward a culture of experimentation.


1. Product & UX Should “Own” Experimentation


Organizations often treat analytics, testing and (sometimes) personalization as highly distinct activities run by small, separate teams. Those teams may struggle to integrate with the processes for site operations, product management and UX development. In these cases, testing happens after the fact (“CYA testing”), or test ideas are peripheral and not focused on the biggest problems.


Instead, think of experimentation as a function that sits directly at the intersection of product development and UX decisions. In other words, testing and the use of data should be a core function of product and UX teams rather than a distinct team and set of processes that live in a separate world.


There can be a team that supports the execution of tests, but the responsibility for getting data into product and UX decisions should be directly owned by the leads in product and experience. Quantitative and qualitative insights on problem discovery and hypotheses should be an integral part of the process they own, regardless of whether the help to get there is internal or external. Said another way, hold your product and experience leaders responsible for testing and experimentation.


2. Invest Equally in Experimentation Fluency


When you buy a testing or personalization technology, you typically have one or two people trained on the tool. We often see that these testing experts are either not fully available or not focused exclusively on testing.


More importantly, the broader team is not trained on the process of testing or personalization. Frequently, operational teams don’t understand the process or how to arrive at the right ideas. In these situations, your experimentation culture will be reactive, and adoption of testing will be a fraction of what you expected when you bought the technology (read “How to Get Return on Technology”).


A best practice is to invest in the skills to run tests while, at the same time, investing in training a broader team on the skills needed for the experimentation process. Everyone involved in creating and managing digital experiences should know why you’re pursuing experimentation, how it works and how it relates to their jobs. Knowledge of the process is as important as the technical skills to execute.


Having outside partners who integrate with your team can accelerate learning and adoption. Investing in them as early as you invest in technology helps ‘grease the skids’ toward executing experimentation and building a data-driven culture.


3. Practice is the Thing


It’s one thing to talk about testing, experimentation and personalization. It’s another thing to actually do them… repeatedly. When you do, these reps become a Trojan horse for building a data-driven culture.


A best practice is to build a goal and process for continuous, repeated experimentation and then apply people to that process, not the other way around. What slows down the flywheel of testing is when one or two people ARE the process, creating a bottleneck.


4. Roadmap from Problems


Digital roadmaps are built from ideas. But where do those ideas come from?

When you tie digital product ideas to measurable problems and goals first, you inherently create a data-driven layer in decision making.


Here’s what we mean: when something on the roadmap is just an idea, you can create an assumed business case and perhaps measure the impact of that idea after the fact.

Imagine, instead, if you first aligned on the goals of the business. Then, as a team, you explored the business or customer problems that stand in the way of achieving those goals. Then, you put data against the size of your problems. Once you do that, you’ve created a dialogue about selecting which problems are MOST worth solving.


From there, acknowledge that a product roadmap is a list of hypotheses. You generate hypotheses (a new experience, new technology, etc.) that you believe will solve a measured problem. And since you put data against the size of the problem, you are predisposed to test any hypothesis to gauge whether it will ACTUALLY solve the problem.
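To make the sizing step concrete, here is a minimal, hypothetical sketch in Python of putting rough numbers against problems and ranking them by estimated impact. The problem descriptions, traffic figures and conversion assumptions are invented for illustration only; this is not our methodology or any client’s data.

```python
# Hypothetical sketch: putting data against the size of problems before
# writing a roadmap. All names and numbers below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Problem:
    description: str
    monthly_sessions_affected: int  # traffic that runs into this problem each month
    est_conversion_loss: float      # assumed drop in conversion rate (0.01 = 1 point)
    avg_order_value: float          # assumed average order value, in dollars

    def monthly_revenue_at_stake(self) -> float:
        """Rough estimate of revenue lost to this problem each month."""
        return (self.monthly_sessions_affected
                * self.est_conversion_loss
                * self.avg_order_value)


problems = [
    Problem("Shoppers can't find size and fit guidance", 120_000, 0.004, 85.0),
    Problem("Checkout form errors on mobile", 60_000, 0.012, 85.0),
    Problem("Site search returns irrelevant results", 200_000, 0.002, 85.0),
]

# Rank problems by estimated impact; hypotheses (and tests) come after this step.
for p in sorted(problems, key=lambda p: p.monthly_revenue_at_stake(), reverse=True):
    print(f"${p.monthly_revenue_at_stake():>9,.0f}/mo at stake: {p.description}")
```

However you score problems, the point is the ranking conversation: the team debates the inputs to the estimate rather than whose idea sounds best.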


And now you are operating a product roadmap with a data-driven process and are on your way to a data-driven culture.


After our first two years of running tests for clients, we saw this gap and developed the Problem Solution Mapping (PSM) methodology. Read more here.


Without problems as the fulcrum for forming a UX or product roadmap, it’s difficult to have a forcing function of data over opinion. Any idea can win, and the loudest voice gets its idea on the list. In other words, the distance between ideas and high-level goals is too great, and that’s when the data dialogue gets lost.


5. Story Before Stats


When presenting the results of a test, do you set up the goal, problem and customer journey before you put up a grid of numbers and your 99% confidence interval?


Here’s the thing about data: Not everyone can distill the story the data can tell from the data itself.


When you have data that suggests the size of a problem or the confidence in a solution, be careful not to get too deep into the spreadsheets with your broader audience. More people will appreciate and accept what the data says when you tell a story first and use statistics in a supporting role.


6. Plan on Personalization Phases


Most digital executives we talk to see a lot of ambiguity around personalization. The steps and outcomes are not clear to them. And if it’s not clear to them, personalization is probably not clear to others in the organization.


To get excitement and alignment on this ambiguous personalization journey, focus less on the tactics, technology and segmentation ideas around personalization and more on “capability phases” that more people can get behind. For example, you may share a plan for the following phases:


The first phase is exploration. Look at what’s being done, what you’re capable of now, and which technology and service providers can help.


The next phase is prioritization. Based on assessing goals and problems, you decide where you want to experiment.


The next phase is experimentation and learning. Gain insights into whether and how personalization is solving real customer problems.


The last phase is optimization. Look to accelerate what you’ve learned works and make bigger bets. By focusing on these phases and talking about them openly, you get more people comfortable with taking the steps within the journey.


7. Combine the “What” and the “Why”


The analytics and testing team lives in data, spreadsheets and statistics. The user experience team may work more in qualitative research (user testing, session recordings, card sorting, surveys, focus groups, etc.). How do you get the user experience team to appreciate quantitative testing data, and the data team to understand qualitative approaches, so that you reach data-driven alignment?


When we work with clients we use both approaches, often combining them, to build appreciation for data in the process of making product and UX decisions. And we bring both groups together to share the data story using these multiple approaches. This reinforces the value of data (regardless of type) across distinct teams and processes.


Analytics, testing, personalization and optimization: these are practices that, when executed properly, drive a data-driven culture. And it is this culture that makes a digital organization more competitive and helps it grow faster. In a world of technology and data abundance, the time is now to align and execute toward experimentation to get much higher returns.

If you’re a digital executive reading this, your challenge is to package and present this larger story.
