Welcome to Volume 2 of The UX Newsletter. After a lengthy hiatus to move into a new office (and build new stuff, which we cover in this issue), we're happy to resume sharing tales of UX design, development, and research.
We're getting back in the groove with a couple of stories about the development of MailChimp Pro, our new feature set for quickly growing businesses with large lists. In this issue, developer Mardav Wala goes behind the scenes of Multivariate Testing—sharing details about the UI, usability testing, and code considerations his team worked through.
We follow that with data analyst Emily Austin's look at sifting through and prioritizing customer research to determine feature sets and demands for a Pro product. We've written about UX research in previous issues, but here, Emily shares the analysis that comes after the research is collected. We conclude with links of interest from around the web.
We started 2015 with a MailChimp navigation redesign and the promise of "even bigger and better things." While we’ve continued releasing product refinements and feature updates, we kept our promise with the announcement of MailChimp Pro.
In the past few months, I’ve worked with Product Designer Michaela Moore and Staff Engineer Guan Liao to build MailChimp Pro’s Multivariate Testing feature. Apart from building a feature no other email service provider offers, one of the most challenging and fun exercises in this project was implementing an interactive setup screen that lets users select variables resulting in up to 8 combinations.
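The math behind that setup screen is a cross product: every selected value of every variable gets paired with every value of the others, capped at 8 total combinations. Here's a minimal sketch of that enumeration in Python; the function, variable names, and cap constant are our own illustration, not MailChimp's actual code.

```python
from itertools import product

# Hypothetical sketch of enumerating multivariate combinations.
MAX_COMBINATIONS = 8

def build_combinations(**variables):
    """Cross every selected value of every variable into test combinations."""
    combos = list(product(*variables.values()))
    if len(combos) > MAX_COMBINATIONS:
        raise ValueError(f"a test is capped at {MAX_COMBINATIONS} combinations")
    return combos

# 2 subject lines x 2 from names x 2 send times = 8 combinations
combos = build_combinations(
    subject=["10% off", "Free shipping"],
    from_name=["Jo", "The Team"],
    send_time=["9am", "5pm"],
)
print(len(combos))  # 8
```

Because the count multiplies rather than adds, the UI has to react as soon as one more selection would push the total past the cap, which is why the contextual messaging described below mattered so much.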
The interface of the Pro Multivariate Testing feature
Our design challenge was creating a Multivariate setup and selection process while maintaining MailChimp’s simple, easy-to-use style. To tackle this, we implemented design ideas in the browser so we could interact with a realistic prototype, learn from it, and continue refining.
After a few rounds of fully functional prototypes, built from dozens of high-fidelity mock-ups (for just one screen) and scores of copy changes, we were ready to reveal the feature internally and gather feedback for improvements.
Design iterations of the variable selection interface
Internal Testing and Feature Improvements
Having actual users test your products is important, but when you’re still working through things like basic workflows and usability, colleagues in different departments can provide immediate, objective feedback. MailChimp Email Marketer Brad Gula, Integrations Lead Kale Davis, and Employee Events Coordinator Ashley Wilson agreed to be our internal proxies for “Pro customer” testing.
Brad represents a true Pro user—he sends to a list of 9 million MailChimp users and is constantly experimenting and using A/B testing. As a side project, Kale sends the popular Hacker Newsletter to around 30,000 subscribers. He routinely uses A/B testing to experiment with subject lines and send times on 2 equal halves of his list, then studies his reporting and engagement to inform future campaign decisions. Ashley sends internally to 400+ MailChimp employees. She doesn’t segment, A/B test, or use any of our more advanced MailChimp features. While Ashley's use isn't "typical” Pro behavior, her feedback was equally valuable—we want our tools to be powerful, but they should also be approachable and easy to use.
Including feedback messaging
We were thrilled to see Brad, Kale, and Ashley breeze through Multivariate’s initial variable-selection step, especially since it’s very different from the rest of the app. Our testers helped make it even better by suggesting we add specific, contextual messaging to the page. For example, we added a message to let users know when they’ve hit the maximum number of testing variables. We also created a message informing users that their test will automatically be distributed across the entire list when they choose to test multiple send times.
Appropriate feedback messaging helps guide users through an otherwise complex process
Our internal testers also helped us reconsider steps in the Multivariate workflow that were confusing. Originally, when users wanted to test multiple variations of email content, they cycled through this sequence of steps:
Start at Content Setup screen
Select email variation to edit
Describe/name content variation (something to help identify that particular variation on the Reporting page)
Design and create content (using MailChimp’s standard drag and drop editor)
Return to Content Setup screen
Repeat steps 2-5 until all content variations for the test are complete
By observing our testers, we saw that step 3 came too soon and was confusing for users—we were asking them to describe an email version they hadn’t even created yet. So we moved that step to later in the process—after creating the version, but before moving on to the next variation.
Adding a description for a content variation
Working through feature ideas
We also talked with our internal testers about a part of Multivariate Testing that wasn’t complete at the time—reporting and results. Based on these conversations, we incorporated a visual link performance feature into the final product. This helps users compare how the same links perform in different content variations.
Visual link performance in MailChimp Pro
Bonus: A/B Testing With Content
From the very beginning, we planned to create an integrated flow for both A/B and Multivariate Testing—meaning that the Multivariate interface (for Pro users) and A/B testing interface (for all other users) would look more or less the same.
All free and paid (non-Pro) MailChimp accounts have access to MailChimp’s standard A/B testing. Once an account upgrades to Pro, Multivariate Testing capabilities are made visible in the existing A/B testing interface. All this is possible with modular and extensible components.
By sharing visual and functional elements between A/B and Multivariate Testing, we were also able to allow A/B users to test up to 3 variations, instead of the usual 2. And all MailChimp users can now test content variations in addition to subject line, from name, and send time.
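One way to picture that shared, extensible design is a single setup object whose limits depend on the account's plan: A/B accounts vary one variable with up to 3 variations, while Pro accounts combine variables up to 8 combinations. This sketch is our own illustration of the idea, using limits from the article; the class and names are invented, not MailChimp's code.

```python
class TestSetup:
    """One setup flow serving both A/B and Pro Multivariate testing.

    Hypothetical sketch: A/B tests vary a single variable (up to 3
    variations); Pro tests combine variables (up to 8 combinations).
    """

    def __init__(self, is_pro):
        self.is_pro = is_pro

    @property
    def max_combinations(self):
        return 8 if self.is_pro else 3

    def validate(self, variables):
        """variables maps a variable name to the list of values to test."""
        if not self.is_pro and len(variables) > 1:
            raise ValueError("A/B testing varies one variable at a time")
        total = 1
        for values in variables.values():
            total *= len(values)
        if total > self.max_combinations:
            raise ValueError(f"limit is {self.max_combinations} combinations")
        return total

ab = TestSetup(is_pro=False)
print(ab.validate({"subject_line": ["A", "B", "C"]}))  # 3 variations, allowed
```

Upgrading an account then only flips one flag; the interface, validation, and reporting components stay the same.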
Pro-inspired changes made their way into MailChimp's A/B testing feature
Ready, Set, Pro!
Multivariate Testing for MailChimp Pro is the result of months of research, designing and building, QA (Quality Assurance), and testing. For the last 5 weeks we’ve been making changes to the UI and refactoring code—all to make Multivariate Testing the highlight of MailChimp Pro. Now that it’s released, we’re listening closely to our customers so we can make it even better!
Does this type of experimentation and collaboration interest you? We're hiring.
At MailChimp, we’re all about empowering our users. We believe that products should be both powerful and easy to use—and we’re constantly working to deliver experiences that meet both of these criteria.
So how did we uncover which features were most important to our users? We started with one simple question:
Who The Heck Is Going To Use Pro, Anyway?
When the Data Science team began this research, we had a nebulous idea of what a MailChimp Pro customer would look like. Rather than moving forward on conjecture, we wanted to use our vast store of customer data to shape our recommendations of not only what to build, but who we were building for. After all, once you understand who your users are, it’s easier to envision the types of tools they need. Our first order of business was determining how a MailChimp Pro customer differs from all other MailChimp users.
Fortunately, in addition to other research methods, we regularly survey users to learn more about how they use our app and how we can improve it. This was our first stop on the road to understanding what MailChimp Pro would look like.
Customer Segmentation, MailChimp Style
We combed through recent surveys to categorize user feedback by feature. For example, some users wanted more advanced reporting tools. Others asked for the ability to stop a send after scheduling it. And a great many wanted additional A/B testing options. Each time we got feedback about a feature, we added it to a spreadsheet and tagged it for future analysis.
After categorizing responses from nearly 20,000 users, we had several customer segments we could compare to each other—and to the MailChimp population as a whole. We analyzed account data (list size, company age, user industry, etc.) for each segment to develop a clearer picture of users interested in each feature.
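The tag-and-compare workflow described above can be sketched in a few lines: group responses into one segment per feature tag, then compare each segment's account characteristics to the population as a whole. The survey rows, tag names, and fields below are invented stand-ins for the real spreadsheet data.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sample of tagged survey responses (stand-in data).
responses = [
    {"user": "u1", "tags": ["advanced_reporting"], "list_size": 55000},
    {"user": "u2", "tags": ["stop_send", "ab_testing"], "list_size": 120000},
    {"user": "u3", "tags": ["ab_testing"], "list_size": 80000},
    {"user": "u4", "tags": ["advanced_reporting", "ab_testing"], "list_size": 95000},
]

# Group responses into one segment per feature tag
segments = defaultdict(list)
for row in responses:
    for tag in row["tags"]:
        segments[tag].append(row)

# Compare each segment's average list size to the whole population
population_avg = mean(r["list_size"] for r in responses)
for tag, rows in sorted(segments.items()):
    seg_avg = mean(r["list_size"] for r in rows)
    print(f"{tag}: avg list {seg_avg:.0f} ({seg_avg / population_avg:.2f}x population)")
```

A segment whose average list size, company age, or industry mix diverges sharply from the population is a strong signal about who a feature is really for.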
Segmenting our data, based on customer characteristics