Imagine a world where we’d know the impact that an experience would have before we shipped it.
This is not as impossible as it sounds; most product companies have been building experiences this way for a few years now. Popularised by ecommerce sites and lean startups, validating design decisions with A/B testing and hard data is commonplace nowadays. But applying some of these techniques to enterprise software can be much harder. There is a great article on UX Mag debunking the six myths of data-driven design. It’s an awesome read, but I would go a step further and say that we need to move away from data-driven design and into a world of data-informed design.
Data-driven vs. data-informed
Data-driven design looks to ship fast, optimise at every step and let the data drive many of the design decisions. Often (but not always) it is possible to get large percentage improvements with small tweaks as pages have similar and standardised layouts, or a startup only has a few features to optimise for and one specific type of customer to speak to.
However, this way of working can be much harder to execute when you work in a complex product environment, with many features, experiences and types of customers to optimise for. Continuously shipping experiments may not be the right way to work when millions of customers rely on your products to get their work done, as is the case at Atlassian. But how do you keep trying to optimise and learn, whilst retaining a well thought-through design approach? I believe there’s a slightly different approach which I’m terming “data-informed design”. Here’s an internal Atlassian example of being data-informed, instead of data-driven.
We shipped an experiment to improve our onboarding experience for new customers. From a data perspective the result was a huge failure. It resulted in -12% engagement (time spent in product) and neutral conversion. So did we throw it away and move to something else? No, the team believed in the experiment we had designed and our qualitative research was telling us that we were on the right track. Our conviction turned out to be correct. We took some lessons from the data, applied some insights from our qualitative research, iterated on the experiment, and with tweaks to the design it became a success at +22% conversion, with engagement staying neutral.
This is being data-informed, not data-driven. We used the data we had and combined it with qualitative feedback and our design intuition to produce an iteration that was ultimately successful. We didn’t treat the result as binary, and we didn’t throw the design away completely at the first sign of trouble.
Here’s another example of being data-informed from Airbnb. I love this article discussing their experimentation and in particular this quote about a major redesign of the Airbnb search results page:
A lot of work went into the project, and we all thought it was clearly better; our users agreed in qualitative user studies. Despite this, we wanted to evaluate the new design quantitatively with an experiment.
They had done all of the thinking behind the redesign and received positive qualitative feedback, but they wanted to ensure it performed well quantitatively in an experiment before a major release (imagine the impact on Airbnb’s revenue if it failed in production).
The difference between data-driven and data-informed is subtle but important in my opinion, and data-informed is perfectly depicted by this Venn diagram, which I borrowed and tweaked from Josh Porter:
Quantitative data tells you WHAT is or isn’t happening. Qualitative data helps you better understand WHY it is or isn’t happening. As the diagram shows, it’s not about any particular extreme but rather a balance along with your gut feeling. The best product designers find that balance between all of the inputs they have, and create clear, compelling experiences through the fog of data and insights.
Let’s look at what makes up each circle.
Data

Data on its own is pretty much useless; I would even argue that results from A/B tests on their own only give directional information. You need to turn data into insights to create meaningful product experiences, so an amazing product analyst is a must. Data can be anything from conversion, engagement and time spent in product to feature usage and net promoter score (NPS). Whenever you are designing an experience, you need to know which data point you are optimising for and whether it is a leading or trailing metric. Optimise for only one metric at a time, but monitor the secondary metrics in your product as well. Again, the key is getting customer insights from your data.
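To make one of those metrics concrete: NPS is a simple formula, the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). A minimal sketch, using the standard NPS definition rather than any particular company's tooling:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    `scores` is a list of 0-10 survey responses. Passives (7-8) count
    toward the denominator but neither add nor subtract.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# nps([10, 9, 8, 7, 6, 3]) -> 0  (two promoters cancel two detractors)
```

Note that the same score can hide very different distributions, which is one reason the raw number is only a starting point.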
Internally at Atlassian we collect a bunch of data, and one of these sources is NPS. On its own the score is interesting, especially when tracked over time, but not particularly actionable. We go a step further and break it down into a framework called RUF:
R = reliability
U = usability/design
F = functionality
By breaking it down we can get insights from different customer segments, on different pages within our products, on how they score reliability and usability, and whether we have the right functionality/features. This shows us, from a quantitative point of view, the major areas where customers have problems within our products.
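The mechanics of a RUF-style breakdown can be sketched as grouping tagged survey responses and scoring each group separately. The record shape and category tags below are illustrative assumptions, not Atlassian's actual schema:

```python
from collections import defaultdict

# Hypothetical responses: a 0-10 score plus a tag for which RUF
# category (reliability / usability / functionality) the feedback
# relates to. Real data would also carry segment and page fields.
responses = [
    {"score": 9,  "category": "reliability"},
    {"score": 3,  "category": "usability"},
    {"score": 10, "category": "functionality"},
    {"score": 5,  "category": "usability"},
]

def ruf_breakdown(responses):
    """Group scores by category and compute an NPS-style score per group."""
    buckets = defaultdict(list)
    for r in responses:
        buckets[r["category"]].append(r["score"])
    breakdown = {}
    for category, scores in buckets.items():
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        breakdown[category] = round(100 * (promoters - detractors) / len(scores))
    return breakdown

# ruf_breakdown(responses)
# -> {"reliability": 100, "usability": -100, "functionality": 100}
```

In this toy sample the aggregate score would mask the fact that usability is the problem area; the per-category view surfaces it immediately.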
Data provides the “what”.
Empathy

You have to build up and spread empathy within your organisation for the problems your customers are having. Empathy can be built by reading NPS feedback quotes, interviewing customers, running surveys, and using more traditional ethnographic research methods such as contextual inquiry and usability testing. Again though, that empathy is useless without meaningful insights.
When we redesigned our onboarding experiences, we had a comprehensive experimentation program running. But in parallel we also ran diary studies and usability testing sessions. The diary studies really helped us understand our customers’ pain points and motivations. From these insights we built a conceptual model for onboarding. That model gave our experimentation program a clear framework to work towards.
Empathy provides the “why”.
Gut feeling / intuition
There is a great quote from Julie Zhuo, the Facebook product design director:
Data and A/B tests are valuable allies, and they help us understand and grow and optimize, but they’re not a replacement for clear-headed, strong decision-making. Don’t become dependent on their allure. Sometimes, a little instinct goes a long way.
And it is very, very true. No amount of data or empathy removes the fact that you ultimately have to decide how to interpret those customer insights. I mentioned earlier how we used our design intuition to stay the course with a failed experiment. Intuition and gut feeling are not voodoo; they are built up over time through experience: by making decisions, by making mistakes and by learning along your own career journey. As a designer it is your responsibility to incorporate all of the available data points and create compelling customer experiences. Data and research will not replace the fact that you have to make decisions.
Intuition provides the “how”.