Increasing customer empathy
My role: design manager, lead and help socialize and structure effort
Problem statement and pain points
As an organization, we tended to think about what we wanted to tell our customers and then use quantitative data to try to optimize their behavior on our experience without having the empathic and qualitative information to understand their side of the experience. We knew really well *what* our users were doing, but not *why* or even whether we were building experiences that would be valuable to them.
We didn’t really know what our customer journey looked like. Because most of us had never gone through the experience of purchasing products like ours — enterprise products for teams — our understanding of our users’ journey and ability to empathize was filled with holes.
Our process wasn’t infused with user investigation and empathy. We often started projects with solutions instead of user-centered goals and problem statements. We sometimes ran usability testing on projects, but not very consistently.
We had a lot of new team and leadership ready to dig into what we felt was a very broken experience, but there were so many things to fix, it was hard to know where to make the most impactful start.
Solution 1: Shift the culture from X to X by building buyer journeys to increase our teams’ empathy with our customers’ experience
Luckily for someone passionate about user-centeredness like me, our newly hired Head of Buyer Experience had two kickoff initiatives: one was to fix our backlog process (see also design quality), and one was to get the org’s strategy centered around our buyers’ journeys. He tasked me with helping him build the communication to describe the plan for creating buyer journeys and develop a template for product marketing stakeholder teams to fill in what they knew about their products’ customer journeys. He then secured the leadership buy-in to convince every product team to fill in their templates.
I created and gave a presentation that described the benefit of the mental shift from what we wanted to tell our customers, to meeting them along their own path.
And I created an example overall journey and a template for each product marketing team to fill in with their customers’ details.
We pressed the team leads to omit anything that couldn’t be validated by user investigation, data, or research, to keep the journeys as clear of assumptions as possible so we could also identify our biggest gaps in understanding as opportunities for research.
New Issues and some pivots
Within just a quarter, every product team had filled in their Buyer Journey template with the knowledge of users they already had. Several teams held cross-functional workshops with internal stakeholders who could contribute different slices of data and research, and some teams, lacking that kind of existing context, interviewed dozens of their customers to get the details they needed. We now had Confluence wiki pages upon pages (upon pages…) of information about our journeys. This was great, but it led to issue #1.
Issue #1: Too much information about our journeys to socialize easily
Lucky for me, my team was gifted a summer research intern from a Berkeley MA program with an interest in design. I tasked her with helping us distill all our learnings into a readout artifact (or whatever format worked best — interactive activity, template, tbd) so we could help all our teams quickly understand what it’s like for our customers to research and purchase software like ours. Her first assignment was a listening tour of all our stakeholders and potential audience to gather background and an understanding of their needs.
This quickly uncovered issue #2.
Issue #2: How clear of assumptions and bias was the information?
While, as I mentioned above, we *asked* our teams to avoid assumptions and bias, it was hard to avoid this, even with the best of intentions. Our research intern was very concerned after her interviews that the journeys contained a good deal of bias. She worried we’d end up with misleading stories about our users that would lead us in the wrong direction if she based a readout entirely on what was collected.
She and I consulted our central research team for advice. We learned that the central research team was embarking on a methodology called Top Tasks for all of our products, and that the first step of this methodology was capturing exactly the kind of internal information we had just collected (so that work wouldn’t be wasted), but it also included more steps with end users to help us weed out assumptions we couldn’t validate. Our head of research offered to loop our intern into their Top Tasks work and add the buyer journeys to their Top Tasks research project. This was exciting because it helped us solve a couple of problems, including a new issue, #3.
Issue #3: Which segments? How do we cover all of them, and what happens to the ones we miss?
We knew going into the task of building buyer journeys that different segments of users had very different experiences, but which segments should we narrow down to? And what were we missing in the segments we chose not to cover? The more we dug into understanding our customers’ experience, the more we worried about the segments we knew little or nothing about. This led us to the beauty of solution #2...
Solution #2: Top Tasks research
Read more about Top Tasks in this A List Apart article. Basically, a list of tasks users may want to accomplish in our experience is collected from internal stakeholders as well as interviews with users. Then hundreds of visitors to our experience are surveyed and asked to rank the tasks most important to them, giving the team a prioritized list of the tasks that matter most to our users. Since these tasks are agnostic of segment, this helped us avoid the concerns we had in issue #3. It also helped with our problem of having so many things to fix that we didn’t know where to start for the most impact: we could begin by evaluating the ease of our top 5-10 tasks, improve the experiences that were broken, and then work our way through the list in order of importance.
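The aggregation step behind that prioritized list is simple to sketch. Here is a minimal illustration, assuming each survey respondent picks their five most important tasks from the full task list (the task names and responses below are invented for illustration, not our actual survey data):

```python
from collections import Counter

# Hypothetical responses: each respondent's five most important tasks.
responses = [
    ["compare plans", "check pricing", "start trial", "read docs", "contact sales"],
    ["check pricing", "start trial", "read docs", "migrate data", "compare plans"],
    ["start trial", "check pricing", "read security info", "compare plans", "read docs"],
]

# Count how many respondents selected each task.
votes = Counter(task for picks in responses for task in picks)

# The prioritized list: tasks in descending order of votes.
for task, count in votes.most_common():
    print(f"{count:2d}  {task}")
```

In practice, a handful of tasks collect the bulk of the votes (the "long neck"), which is what made this a useful tiebreaker for deciding where to start.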
Our research intern uncovered a number of pain points and themes common across products and segments in her analysis, and decided to turn 7 of them into comics to give our teams a quick-read interpretation of some of the findings so far. She created comics in the style of oh-no! comics and published them on our Confluence wiki.
[oh no comics]
As expected, we realized there was a lot we didn’t know about our users’ experience with enterprise software. We were allocated some research funds, which we decided to put toward a single research project, expanding with more research topics from there. I facilitated aligning leads and stakeholders around the topic we most wanted to research, then hired an excellent contract researcher to help us investigate it. Here is a screenshot from her summary presentation on the experience of teams migrating from the basic server software to an enterprise option.
Solution #3: Moar user testing
New issue: are we doing it right?
Collaborate with research team
Need more socialization
I feel like most of my time is dedicated to one of two things: saying “we need to include some qualitative information and connection to our users in this process” or “hey, check out this page or effort (and this one, and this one),” or getting new efforts rolling. Even so, I still hear people say generalized things like “we don’t know anything about our users.”
Nearly every project we kick off starts with analysis
Request to hire our own researcher
Problem statement/pain points
As a culture, we think about what we want to tell our customers and optimize snapshots of our experience from that perspective, instead of understanding what their journey looks like and how we can meet them better along their way
manifest a major head shift and culture change
Needed Tom’s help
Involvement in helping teams set theirs up
use usertesting.com in as many efforts as we can to guide us
Opinions don’t matter anymore in direction discussions; let’s use usertesting.com
Use UMUX-Lite as our metric for experiences where we can’t measure success with quant data
Encourage use, engage with research team to help us do it right
Attrakdiff for the even fuzzier stuff
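For reference, the raw UMUX-Lite score mentioned above comes from two 7-point agreement items and rescales to 0-100. A minimal sketch of that calculation (not our production instrumentation):

```python
def umux_lite(capabilities: int, ease_of_use: int) -> float:
    """Raw UMUX-Lite score on a 0-100 scale.

    Each argument is a 1-7 agreement rating for one of the two items:
      1. "This system's capabilities meet my requirements."
      2. "This system is easy to use."
    """
    for item in (capabilities, ease_of_use):
        if not 1 <= item <= 7:
            raise ValueError("items are rated on a 1-7 scale")
    # Shift each item to 0-6, sum (0-12), and rescale to 0-100.
    return (capabilities - 1 + ease_of_use - 1) / 12 * 100

print(umux_lite(7, 7))  # 100.0
print(umux_lite(5, 6))  # 75.0
```

Averaging these per-respondent scores gives a benchmarkable number for an experience; regression-adjusted variants that map onto the SUS scale also exist, but the raw score is what’s shown here.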
Encourage kicking off new efforts with problem statement and investigation
There’s a lot of stuff we just don’t know - enterprise purchasing feels like a black box
Get some concentrated efforts going to help us answer these questions
Hire a researcher
Summer intern - help us see the way
Oh no comics
Alignment with Top Tasks efforts of the central research org
We need help with direction on where to concentrate first
Baseline usability testing
Strengthen relationship with the research team