Christine Lee's profile

Yesware Manager Experience

My Role
As the Lead Designer for the Manager Experience project, I was responsible for conducting user research with the Product Manager to understand users’ pain points (20%), creating low-fidelity wireframes (15%), creating high-fidelity wireframes (15%), and conducting usability tests (50%).
The Challenge
Yesware’s business model is that of B2C2B (business to consumer to business). We’ve been targeting baseline sales reps in hopes that they will love the product enough to refer us to their colleagues and managers, who will then introduce our tool to their company.
As more sales reps installed our product, managers began to see the value of Yesware for themselves. Yet despite that value, many managers did not sign up for a license because the experience for them was disjointed. Realizing that we had never optimized for sales managers, our Product team decided to invest in the manager experience, since managers are the decision-makers in the buying process.

Meeting our Target User
Meet Marissa—a Sales Development Manager at Skalaway*. Ask her anything about the sales process, and she’ll give you a concise two-minute overview in layman’s terms. Marissa oversees a team of 7-10 Sales Development Reps (SDRs) that is responsible for qualifying leads that are ultimately handed off to the “closers”, also known as Account Executives.

Most of Marissa’s daily efforts are spent around coaching her team. Most SDRs are fresh out of college, so they not only need coaching for their roles but also training on the general sales process. In addition to weekly Lunch and Learns, Marissa also spends 1:1 time with each rep to individually coach them through specific prospecting techniques and strategies.

Since her performance is measured by the success of her team, Marissa needs to know that her team is emailing and calling enough leads, as well as booking enough meetings with leads that turn into closed-won opportunities.
* Company name changed due to customer confidentiality

No Pain, No Gain
After talking with Marissa, we discovered that there was a lot of room for improvement on the reporting aspect of her role. When asked if she had any pain points, Marissa would politely say she had none but would then go on to describe how she spends around 6-8 hours every week compiling a report that tracks her team’s activity and success.

Marissa wasn’t the only sales manager with this problem. After interviewing 7 other managers, we noticed several recurring pain points:
- Sales managers don’t have actionable data when coaching their reps to become more efficient and hit their goals.
- Sales managers don’t have enough insight into their reps’ activities when coaching them.
- Sales managers have no reports to look over during their 1:1s with their direct reports.
- Sales managers want to know what the formula for success looks like so that they can turn it into a repeatable process.
    - How many emails does it take to get a reply?
    - How many calls does it take to book a meeting?
- Sales managers have to look in many different places to see the high-level metrics they care about, so gathering these metrics is very time consuming.

For example, there wasn’t an easy way to tie an SDR’s effort to revenue generated. This process was very manual. Sales managers were tracking this in Google spreadsheets, going to each Account page in Salesforce, then combing through the Activity History to look for Opportunities Opened.
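To give a sense of how mechanical this work was: the tally managers were doing by hand in spreadsheets boils down to counting one event type per rep. A small illustrative sketch (the record shapes below are hypothetical, not the actual Salesforce API response):

```python
# Tally "Opportunity Opened" events per rep, the count managers were
# compiling by hand from Salesforce Activity History.
from collections import Counter

def opportunities_per_rep(activity_records):
    """Count 'Opportunity Opened' events per rep from a list of dicts."""
    counts = Counter()
    for record in activity_records:
        if record.get("type") == "Opportunity Opened":
            counts[record["rep"]] += 1
    return dict(counts)

records = [
    {"rep": "Alice", "type": "Opportunity Opened"},
    {"rep": "Alice", "type": "Email Sent"},
    {"rep": "Bob", "type": "Opportunity Opened"},
    {"rep": "Alice", "type": "Opportunity Opened"},
]
print(opportunities_per_rep(records))  # {'Alice': 2, 'Bob': 1}
```

The point is that the logic is trivial; the pain was in gathering the raw records from many places by hand.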

The Two-Sided Challenge
We now had a clearer problem that we could dive deeper into. The motivation to tackle sales managers’ pain points was further fueled by Yesware’s business objectives: as a company, we had always focused on individual sales reps, leaving managers without a usable experience. By tackling the issue at hand, we could deliver incremental value to managers by simplifying the metric-gathering process, solidify our position amongst our customers as a trustworthy expert in sales, capture managers organically (thus expediting our growth initiative), and charge more for a reporting add-on.

Approach
One of the coolest perks of working at Yesware as a Product Designer is that we have customers who are very eager and happy to give us feedback and participate in usability sessions. That’s right—our customers want to hop on the phone and talk with us. Not only that, but our internal sales team uses our product every day and always provides valuable feedback, both as Yesware users and advocates for the prospects they talk to on a daily basis.

With our problem statement in hand, we figured it would be a good idea to start chatting with our sales team. Our desire to help all the managers with all their problems was quickly squashed, as we discovered how varied their responsibilities and goals were. Now we had to choose a focus. What would it look like if we focused our efforts on helping SDR Managers? Inside Sales Managers? Enterprise Managers?

Ultimately, we decided to home in on helping SDR managers because they are focused on tracking their team’s activity. Metrics like # emails sent, # calls made, # replies received, etc. were all numbers that we had access to, so it seemed like the simplest and most logical place to start adding value.

Process
Now that we had a plan of attack, it was time to hoard some snacks because my fellow Product Manager and I were camping out in the call rooms, talking to customers non-stop. Broken down by tasks, this is how the next 3 months played out:

User Research Interviews
The Product Manager and I conducted 6 user research interviews with SDR Managers to validate our 2 hypotheses:
1. If we tell managers how much activity their team needs to complete to hit their goals, then they will know how to coach their team members towards hitting those goals.
2. If we tell managers how much activity their team has completed this month, then they will be able to tell whether or not their teams are on track to hit their goals throughout the month.
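Hypothesis 2 is, at its core, simple linear pacing math: compare activity completed so far against the pace needed to hit the monthly goal. An illustrative sketch (the function name and numbers are hypothetical, not product code):

```python
# "On track" check behind hypothesis 2: linear pacing against a monthly goal.

def on_track(activity_so_far, monthly_goal, day_of_month, days_in_month):
    """Return (expected_by_now, is_on_track) assuming linear pacing."""
    expected = monthly_goal * day_of_month / days_in_month
    return expected, activity_so_far >= expected

# A rep who should average 20 activities/day but has logged only 120 by day 10:
expected, ok = on_track(activity_so_far=120, monthly_goal=400,
                        day_of_month=10, days_in_month=20)
print(expected, ok)  # 200.0 False
```

A real version would likely pace against business days rather than calendar days, but the idea is the same.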

Sketch Session (Round 1)
Armed with this new knowledge, we led the first round of sketch sessions with the Design + Product teams to flesh out high level ideas gathered from the research interviews.
Low-Fidelity Mockup
Based on the sketches from the sketch session, I created 4 low-fidelity mockups to test the different directions.
From top left (clockwise):
- 1:1 Email - a weekly email digest that preps managers for their 1:1s with their direct reports
- BDR Activity Dashboard - a real-time dashboard that shows managers what activity their reps are generating
- BDR Goals + Activity Dashboard - a progress report that shows managers how their team is performing in relation to their monthly goals
- Sales Effort Dashboard - a calculator that lets managers see how many deals their reps need to close to hit their monthly quota
Usability Testing (Round 1)
The Product Manager and I conducted 6 usability test sessions to see which ideas resonated most with our target users. Here, we validated that managers needed help tracking their team’s activity in a more intuitive and consolidated manner and would likely use these reports when coaching their reps during 1:1s.

Experiment
Based on the validation from the usability tests and a couple of brainstorm sessions, we created an experiment to help managers coach their reps in 1:1s. Managers didn’t have a report to bring with them to their 1:1s, so we figured this would be a good place to start and would benefit both reps and managers.

Originally, we had wanted to scrape a manager’s calendar for 1:1s and send them an email with an activity summary so they would be prepared for their meeting. However, scraping calendars was expensive, and reliably identifying which meetings were 1:1s was uncertain. So, in an effort to keep things simple, we decided to create an email that featured 4 activity metrics for a given SDR and whether they were on track to hit their goals. Rather than scraping the calendar, we decided to send these emails every Monday morning so that managers could prepare for the week ahead.

There were 3 main things we wanted to test in this experiment:
1. Do users find this report useful?
    - We would track success by measuring open + click rates.
2. What do users want to see in the drill down view?
    - This led to a Google form so that we could collect responses.
3. Is the cadence correct, or do they want to receive this email at another time?
    - This also led to a Google form, allowing users to suggest alternative times.
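The success measure for the first question is just simple rates over the send list. A hypothetical sketch of the arithmetic (the counts shown are illustrative, not our actual results):

```python
# Open and click rates for the weekly digest experiment.

def engagement_rates(sent, opened, clicked):
    """Return (open_rate, click_rate) as fractions of emails sent."""
    if sent == 0:
        return 0.0, 0.0
    return opened / sent, clicked / sent

print(engagement_rates(sent=20, opened=15, clicked=6))  # (0.75, 0.3)
```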

Not only was this an experiment to test whether or not sales managers would use this in the wild, but it was also a way to verify that we could pull these metrics from Salesforce. We had dabbled with pulling data from Salesforce in the past but deemed it too costly given the extreme amount of engineering effort required to mine custom fields and KPIs. By confirming that # Calls Made, # Meetings Booked, and # Opportunities Created were used universally, we validated that our simplified query would work across multiple organizations. To help speed things up, I coded up the HTML template for this email and ended up developing major respect for email designers and a love/hate relationship with Litmus and PutsMail.
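Part of what earns email designers that respect is that email clients still demand table-based HTML layouts. A hypothetical sketch of how the digest body could be stamped out with the Python standard library (the layout and values are illustrative; only some metric names come from the experiment):

```python
# Render the 4-metric weekly digest as a table-based HTML fragment.
from string import Template

DIGEST = Template("""\
<table>
  <tr><td>Emails Sent</td><td>$emails</td></tr>
  <tr><td>Calls Made</td><td>$calls</td></tr>
  <tr><td>Meetings Booked</td><td>$meetings</td></tr>
  <tr><td>Opportunities Created</td><td>$opps</td></tr>
</table>
""")

html = DIGEST.substitute(emails=150, calls=60, meetings=8, opps=3)
print(html)
```

The real template was of course far more involved (inline styles, client quirks), which is exactly what tools like Litmus and PutsMail help test.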
Sketch Session (Round 2)
As we continued to monitor the Manager 1:1 experiment in the background, we also began working through the feedback we received in the Google forms. Now that we had a clearer picture of managers’ pain points and needs, I conducted a second round of sketch sessions with the development team that would be working on this project (1 Product Manager, 4 engineers, myself). Since the engineers participated in this sketch session, we used this time to discuss feasibility and any technical constraints that might affect the design.
This sketch session spawned 2 different ideas to test:
From top left (clockwise):
- Scatterplot - a visualization that would make it clear how reps were doing and offer actionable insight
- Activity Snapshot - a single view where various data points come together
Scatterplot on the left and Activity Snapshot on the right
Usability Tests (Round 2)
We conducted 6 sessions for Round 2. The Activity Snapshot initially caught most managers’ attention due to its familiar layout. But when the managers drilled down into the data, they found it redundant because this was information they already had access to in Salesforce. The Scatterplot took a little longer to understand. But after poking around a little more, managers were more excited by this design due to its unique visualization of information. The concept of plotting activity against engagement resonated with many managers because it captured their reps’ input and showed what the result of that effort was.

Solution
In the end, the feedback was loud and clear: Scatterplot trumps Snapshot. I went ahead and created a high-fidelity mockup to flesh out not only the visual design but also the interactions and context to help managers understand what they were looking at.
In parallel, the Manager 1:1 experiment was running along nicely. Of the 14 managers that we had sent the emails to, all 14 had opened the email, 2 offered more details as to what they would have liked to see, and only 1 requested to be removed from the list. Recognizing that this sample size would lead to inconclusive results, we decided to roll this experiment out to 50% of our target users. In the second iteration, we were also able to add more details around opportunities that were closed-won so that managers could better understand which deals had closed and what type of activity helped the deal close.
Results
Armed with these two reports and many other product improvements, our Sales & Marketing teams previewed both the Manager 1:1 Report and the Activity vs. Engagement Report at Dreamforce 2015 (the biggest conference of the year in the sales industry), and we ended up recruiting around 10 companies to beta test these 2 reports. In fact, the AvE report was hard at work pre-Dreamforce, when our CEO took screenshots of his invitees’ reports and included those teasers in the invitations.

In addition, these 2 reports were able to fulfill the business’s objective of charging more for a Reporting Add-On. We created a brand new tier called “Prescriptive Analytics” to host these new reports and organized a huge Marketing launch around it.
It’s still a bit too early to tell how successful this project has been since the Prescriptive Analytics Trial is still ongoing, but I’d say that the initial momentum has been pretty good so far. We’ve received a lot of feedback—both positive and constructive—regarding this report and have a good understanding of what our next steps look like. I’m excited to continue reworking this report to ensure its success and will be back with an update to let you know how things pan out!