

A set of usability trials and resulting UX recommendations I conducted for Magoosh, a GRE/GMAT prep startup.
Magoosh is a Bay Area startup that prepares students for the GRE and other tests via online videos and articles.  In the spring of 2012, I conducted a usability study for Magoosh, with the goal of crafting a set of recommendations to improve user retention through the signup process.
Magoosh's landing page.  This is the first page users saw when navigating to the site, and it featured largely in the study.
To improve the total number of paid conversions - the number of people who upgraded to a premium account after using the free trial - I wanted to examine the signup process and free trial in detail.  To maximize results while minimizing scope, I settled on two primary questions:
1. Does the front page engage users and encourage them to explore?
2. Does the free trial help users understand everything Magoosh has to offer?
I tested a range of ages, including both undergraduate and graduate students, who were at various points in their preparation for the GRE or GMAT.  The usability test focused on having the user sign up and complete a few tasks within the site, and then examining their opinions and knowledge of the site's functionality immediately afterward.
Magoosh's dashboard, the primary page users saw after logging in.  Much of the study focused on how users used the dashboard and other internal parts of the site.
While there were focused questions and tasks in my tests and analysis, I deliberately left certain parts of the tests somewhat vague and open-ended, with minimal instruction.  I was trying to capture the experience of “poking around” a site – a real user would be driven only by their curiosity, not a predetermined set of tasks, and could leave at any moment.  I tried to gauge users’ engagement during the test: whether they would have given up and left the site, or continued investigating, had they not been part of a usability test.

Throughout the data analysis, testing and interviews, I took special note of the following:

• What features do users gravitate toward? What stands out to users? What do most people visit or see first? What do they spend the most time on?

• What elements do users get stuck on? What are barriers to users having a seamless and productive experience with the site?
Data Analysis
Data analysis is an excellent, low-cost area to begin looking for patterns in user activity before plunging into high-cost usability testing.  I had two primary sources of data to draw on for this project: Google Analytics data documenting user paths into, through, and out of the site; and A/B testing performed by Magoosh prior to my arrival.
Due to a confidentiality agreement with Magoosh, I cannot reveal the specific results of the data analysis.  Nonetheless, they cast light on the effects of several aspects of the landing page, including surprising results regarding the effectiveness of taglines and page length.
Usability Testing
Conducting a usability test remotely.  The subject is exploring Magoosh's video lectures.
While the data analysis yielded interesting results and showed the routes users took around the site, it could not tell us why users took those routes or what happened on each page.  With the usability tests, I hoped to find some causation to go with the correlations provided by the data analysis.
To prepare for the tests, I wrote a script detailing an introduction to Magoosh and what the test entailed, including procedural prompts for myself (such as reminders to obtain written or visual consent from the subjects) and of course the questions and challenges of the test itself. 
In total there were three rounds of testing, with three users per round.  I faced a rather paradoxical challenge in recruiting subjects: I wanted people who were unfamiliar with Magoosh, yet I only had a free subscription to offer as a reward to testers.  Thus, the first round consisted primarily of friends, but in each subsequent round I endeavored to find more representative subjects.  The second round was Berkeley students preparing to take the GRE or GMAT, and the third, recent graduates from other colleges.  Further rounds would ideally have featured international users, as several foreign countries show high traffic but low paid conversion rates.
I used four technologies to conduct the tests:
• A digital camera for the first round of testing and as a backup for the other rounds
• Morae, a usability testing software, to record screen and webcam input
• Skype, to communicate with users during remote testing
• A screensharing service, used during remote testing
The test consisted of two tasks: users had to navigate from Magoosh's blog to its primary website and examine the landing page, then sign up for and try out the free trial.  I punctuated the test with questions at the beginning, middle, and end, focusing on their perception of what Magoosh offered in terms of quantity and quality of features.  This included one set of questions after the users felt they were finished exploring the trial, and another after I explained to them the functions they had missed.  Users were allowed to determine for themselves how long to spend on each part, and encouraged to think out loud as they navigated the site.   
Findings & Recommendations
In the end, the study produced some very important results.  As I conducted the study, it became increasingly apparent that somewhere in the experience was a communication breakdown: not one of the users finished the free trial with a full understanding of what the website had to offer.  Users invariably expanded their description of what Magoosh offers between each of the three interview sections:

• Initial responses (after exploration of the landing page) typically mentioned only practice problems and video lessons.
• After the free trial, responses expanded to include explanations for the questions, and occasional mentions of customized practice (one of Magoosh's biggest draws).
• Finally, after users had been fully introduced to Magoosh’s services, answers spread out to include customized help, section-by-section topic breakdowns, result breakdowns, etc.
My key finding, and the basis for many of my recommendations, was this: users don't like to sit through videos.  In particular, many users skipped the introductory video explaining all of Magoosh's features.  This proved especially damaging, as it meant they were navigating the site essentially blind: stumbling around and bumping into whatever features they happened to find.
As opposed to the passive video, I recommended Magoosh adapt to the habits of its users by implementing a more active method of explaining the site's features. To quote my report:
"Frontloading all of the information has been shown to be ineffective; it’s like being given a verbal description of each painting in a museum while stuck in the elevator, then being allowed to explore the gallery on one’s own.  Users should be given relevant information while they browse, and should be alerted to new features as they come across them.  To that effect, a guided tour would be very useful."
By replacing the video with tooltips and other informational popups and guides, users could explore the site at their own pace and according to their own interests, and would get a much more thorough experience.
A mock up of what the guided tour might look like.  Frontloading information is ineffective compared to providing it when it is needed.
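The step-through logic behind such a tour is simple.  Here is a minimal sketch in JavaScript; the step targets and messages are hypothetical, not Magoosh's actual feature list:

```javascript
// Minimal guided-tour stepper: walk through an ordered list of steps,
// each pairing a DOM selector to highlight with a short message.
function createTour(steps) {
  let index = 0;
  return {
    current() { return steps[index]; },
    // Advance one step, stopping at the final step.
    next() {
      if (index < steps.length - 1) index += 1;
      return steps[index];
    },
    isDone() { return index === steps.length - 1; },
  };
}

// Hypothetical tour covering the features users most often missed.
const tour = createTour([
  { target: "#dashboard", text: "This is your dashboard." },
  { target: "#custom-practice", text: "Build a custom practice session here." },
  { target: "#video-lessons", text: "Browse video lessons by topic." },
]);
```

A real implementation would render each step as a tooltip anchored to its target element, advancing only when the user dismisses it - keeping the user in control of pacing, unlike a video.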
Magoosh also faced several other problems, including:
Static Landing Page
At the time of the study, Magoosh's landing page was reasonably effective at communicating its features, but completely static.  Many users tried to click on screenshots or otherwise attempted to interact with the page, but could not.  I recommended adding a dynamic element to the landing page - something users could try out to get a feel for the site.  Letting users answer a sample question, and receive the same answer explanation and video they would get in the real site, could convince many potential users to try the free trial in order to see the rest.  A practice question would also appeal to the competitive nature of many test takers.
Adding a dynamic section to the landing page - namely, a practice problem with solution and video - could entice users to try the trial.
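The embedded question only needs a small amount of client-side logic.  A sketch of the grading step, with an invented sample question for illustration:

```javascript
// Hypothetical embedded practice question; not real Magoosh content.
const sampleQuestion = {
  prompt: "If 3x + 5 = 20, what is x?",
  choices: ["3", "5", "15", "25"],
  answerIndex: 1,
  explanation: "Subtract 5 from both sides to get 3x = 15, so x = 5.",
};

// Return the feedback a visitor would see: correctness plus the same
// explanation a paid user would get inside the product.
function gradeAnswer(question, choiceIndex) {
  return {
    correct: choiceIndex === question.answerIndex,
    explanation: question.explanation,
  };
}
```

The point of the design is that the explanation is shown whether or not the visitor answers correctly, giving a genuine taste of the product's main value.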
User Misunderstandings
There were several spots on the site where user expectations did not match reality.  For example, Magoosh allows users to add notes and tags to problems; however, users had to then hit save, or their notes would be erased.  One user created notes on several questions in a row, never realizing that each was being erased as she moved to the next question.  There were several similar issues that could be easily solved by slight adjustments to the user interface.  (In this case, saving notes automatically or showing a warning before the user leaves a page with unsaved text in the note section.)
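The leave-page warning comes down to tracking whether the draft differs from the last saved text.  A minimal sketch, with the state handling simplified (a real fix could also autosave on an interval):

```javascript
// Track a note's draft vs. last-saved text to decide whether
// leaving the page would lose work.
function createNoteTracker() {
  let savedText = "";
  let draftText = "";
  return {
    edit(text) { draftText = text; },
    save() { savedText = draftText; },
    hasUnsavedChanges() { return draftText !== savedText; },
  };
}

// In the browser, this check would back a beforeunload handler:
// window.addEventListener("beforeunload", (event) => {
//   if (tracker.hasUnsavedChanges()) event.preventDefault();
// });
```

Either remedy - the warning or autosave - closes the gap between what the user believes happened (notes kept) and what actually happened (notes discarded).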
Failure to Express Value of Key Features
One key feature that sets Magoosh apart from other test-prep companies is its ability to generate custom practice sessions; data analysis showed that paid users spent upwards of 70% of their time on them.  However, almost all of the free-trial users ignored this feature, intimidated by the setup page and unable to see its value (which was explained, of course, only in the introductory video).  Adding context in key areas, and guiding users through first-time setup, would help illuminate this highly useful feature.
The Magoosh project was one of my first full-fledged usability reviews.  Were I to do it again, I would change several things - for example, creating and testing some potential A/B variants to test some of my assumptions before finding volunteers for the usability tests themselves.
Nonetheless, it showcases one of my favorite things about usability testing in general - how major UX breakdowns can often be identified and fixed simply by taking a few steps in the user's shoes.
Special thanks to Flora Kuo, my partner on this project.  She was a huge help in finding test subjects, putting together the visual recommendations, and reviewing and editing the final report.