
The UX Newsletter: Going Pro

Welcome to Volume 2 of The UX Newsletter. After a lengthy hiatus to move into a new office (and build new stuff, which we cover in this issue), we're happy to resume sharing tales of UX design, development, and research.

We're getting back in the groove with a couple of stories about the development of MailChimp Pro, our new feature set for quickly growing businesses with large lists. In this issue, developer Mardav Wala goes behind the scenes of Multivariate Testing—sharing details about the UI, usability testing, and code considerations his team worked through.

We follow that with data analyst Emily Austin's look at sifting through and prioritizing customer research to determine feature sets and demands for a Pro product. We've written about UX research in previous issues, but here, Emily shares the analysis that comes after the research is collected. We conclude with links of interest from around the web.

Editors: Fabio Carneiro, Gregg Bernstein, Laurissa Wolfram-Hvass
Code: Alex Kelly

The Multiverse

Mardav Wala, Developer

We started 2015 with a MailChimp navigation redesign and the promise of "even bigger and better things." While we’ve continued releasing product refinements and feature updates, we kept our promise with the announcement of MailChimp Pro.

In the past few months, I’ve worked with Product Designer Michaela Moore and Staff Engineer Guan Liao to build MailChimp Pro’s Multivariate Testing feature. Apart from building a unique feature (which no other email service provider offers), one of the most challenging and fun exercises in this project was implementing an interactive setup screen that allows users to select variables resulting in up to 8 combinations.
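The variable-selection logic can be sketched roughly like this: each variable a user tests contributes some number of variations, and the product of those numbers gives the total combination count, which Multivariate Testing caps at 8. This is a minimal JavaScript sketch under that assumption; the function and variable names are hypothetical, not MailChimp's actual code.

```javascript
// Each selected test variable contributes some number of variations;
// the total number of combinations is their product, capped at 8.
const MAX_COMBINATIONS = 8;

// `selections` maps a variable name to how many variations were chosen,
// e.g. { subjectLine: 2, content: 2, sendTime: 2 }
function combinationCount(selections) {
  return Object.values(selections).reduce((total, n) => total * n, 1);
}

function isValidTest(selections) {
  return combinationCount(selections) <= MAX_COMBINATIONS;
}
```

For example, two subject lines, two content versions, and two send times multiply out to exactly 8 combinations, while adding a third variation anywhere would push the test over the cap.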

The interface of the Pro Multivariate Testing feature

Our design challenge was creating a Multivariate setup and selection process while maintaining MailChimp’s simple, easy-to-use style. To tackle this, we implemented design ideas in the browser so we could interact with a realistic prototype, learn from it, and continue refining.

The new design was a chance to experiment with flexbox layouts in the application. This gave us much more control over things like overall page structure and the alignment and order of UI elements than we’ve traditionally had when using CSS floats and conventional layouts. We also had the opportunity to implement the D3 JavaScript library, which lets us use SVG for displaying data.

After a few iterations of fully functional prototypes, informed by dozens of high-fidelity mock-ups (for just one screen) and scores of copy changes, we were ready to reveal the feature internally and gather feedback for improvements.

Design iterations of the variable selection interface

Internal Testing and Feature Improvements

Having actual users test your products is important, but when you’re still working through things like basic workflows and usability, colleagues in different departments can provide immediate, objective feedback. MailChimp Email Marketer Brad Gula, Integrations Lead Kale Davis, and Employee Events Coordinator Ashley Wilson agreed to be our internal proxies for “Pro customer” testing.

Brad represents a true Pro user—he sends to a list of 9 million MailChimp users and is constantly experimenting and using A/B testing. As a side project, Kale sends the popular Hacker Newsletter to around 30,000 subscribers. He routinely uses A/B testing to experiment with subject lines and send times on 2 equal halves of his list, then studies his reporting and engagement to inform future campaign decisions. Ashley sends internally to 400+ MailChimp employees. She doesn’t segment, A/B test, or use any of our more advanced MailChimp features. While Ashley’s use isn’t “typical” Pro behavior, her feedback was equally valuable—we want our tools to be powerful, but they should also be approachable and easy to use.

Including feedback messaging

We were thrilled to see Brad, Kale, and Ashley breeze through Multivariate’s initial variable-selection step, especially since it’s very different from the rest of the app. Our testers helped make it even better by suggesting we add specific, contextual messaging to the page. For example, we added a message to let users know when they’ve hit the maximum number of testing variables. We also created a message informing users that their test will automatically be distributed across the entire list when they choose to test multiple send times.
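The contextual messaging our testers suggested could be modeled as a small pure function that maps the current selection state to a guidance string. This is a hypothetical sketch, not the product's actual code; the cap of 3 variables and the message wording are illustrative assumptions.

```javascript
// Hypothetical sketch of contextual feedback messaging: given the current
// selection state, return the most relevant guidance string, or null.
const MAX_VARIABLES = 3; // illustrative cap, not the product's actual limit

function feedbackMessage(selectedVariables, testingSendTime) {
  if (selectedVariables.length >= MAX_VARIABLES) {
    return "You've reached the maximum number of testing variables.";
  }
  if (testingSendTime) {
    return "Send-time tests are distributed across your entire list.";
  }
  return null; // no message needed for this state
}
```

Keeping the messaging logic in one function like this makes it easy to add new contextual hints as usability testing surfaces them.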

Appropriate feedback messaging helps guide users through an otherwise complex process

Re-ordering steps

Our internal testers also helped us reconsider steps in the Multivariate workflow that were confusing. Originally, when users wanted to test multiple variations of email content, they cycled through this sequence of steps:

  1. Start at Content Setup screen
  2. Select email variation to edit
  3. Describe/name content variation (something to help identify that particular variation on the Reporting page)
  4. Select template
  5. Design and create content (using MailChimp’s standard drag and drop editor)
  6. Return to Content Setup screen
  7. Repeat steps 2-6 until all content variations for the test are complete

By observing our testers, we saw that step 3 came too soon and was confusing for users—we were asking them to describe an email version they hadn’t even created yet. So we moved that step to later in the process—after creating the version, but before moving on to the next variation.
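The revised per-variation loop can be sketched as follows, with the describe step moved after content creation. This is a minimal illustration with hypothetical names; `createContent` and `describe` stand in for the real drag-and-drop editor and naming steps.

```javascript
// Sketch of the reordered workflow: for each variation, users design the
// content first, then describe the version they can now actually see.
function buildVariations(count, createContent, describe) {
  const variations = [];
  for (let i = 0; i < count; i += 1) {
    const content = createContent(i);  // design the email first...
    const name = describe(content, i); // ...then ask for a description
    variations.push({ name, content });
  }
  return variations;
}
```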

Adding a description for a content variation

Working through feature ideas

We also talked with our internal testers about a part of Multivariate Testing that wasn’t complete at the time—reporting and results. Based on these conversations, we incorporated a visual link performance feature into the final product. This helps users compare how the same links perform in different content variations.

Visual link performance in MailChimp Pro
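The underlying comparison boils down to pivoting click data: grouping counts for the same URL across variations so they can be charted side by side. Here's a hedged sketch of that aggregation step; the data shape and function name are assumptions for illustration, not MailChimp's actual reporting code.

```javascript
// Hypothetical sketch: pivot per-variation click counts by URL so the
// same link can be compared across content variations.
function compareLinkClicks(variations) {
  const byUrl = {};
  for (const { name, clicks } of variations) {
    for (const [url, count] of Object.entries(clicks)) {
      byUrl[url] = byUrl[url] || {};
      byUrl[url][name] = count; // one column per variation name
    }
  }
  return byUrl;
}
```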

Bonus: A/B Testing With Content

From the very beginning, we planned to create an integrated flow for both A/B and Multivariate Testing—meaning that the Multivariate interface (for Pro users) and A/B testing interface (for all other users) would look more or less the same.

All free and paid (non-Pro) MailChimp accounts have access to MailChimp’s standard A/B testing. Once an account upgrades to Pro, Multivariate Testing capabilities are made visible in the existing A/B testing interface. All this is possible with modular and extensible components.

By sharing visual and functional elements between A/B and Multivariate Testing, we were also able to allow A/B users to test up to 3 variations, instead of the usual 2. And all MailChimp users can now test content variations in addition to subject line, from name, and send time.
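One way to picture the shared, extensible component is a single configuration function whose limits depend on the account's plan. This is a hypothetical sketch under the limits described above (3 variations for standard A/B, 8 combinations for Pro Multivariate); the names are illustrative.

```javascript
// Hypothetical sketch: one shared testing component, configured per plan.
const TEST_VARIABLES = ['subjectLine', 'fromName', 'sendTime', 'content'];

function testLimits(isPro) {
  return isPro
    ? { maxCombinations: 8, variables: TEST_VARIABLES } // Multivariate (Pro)
    : { maxVariations: 3, variables: TEST_VARIABLES };  // standard A/B
}
```

Because both modes draw from the same variable list and UI components, extending one tends to benefit the other, which is how content testing reached all users.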

Pro-inspired changes made their way into MailChimp's A/B testing feature

Ready, Set, Pro!

Multivariate Testing for MailChimp Pro is the result of months of research, designing and building, QA (Quality Assurance), and testing. For the last 5 weeks we’ve been making changes to the UI and refactoring code—all to make Multivariate Testing the highlight of MailChimp Pro. Now that it’s released, we’re listening closely to our customers so we can make it even better!

Does this type of experimentation and collaboration interest you? We're hiring.

Decisions with Data

Emily Austin, Data Analyst

At MailChimp, we’re all about empowering our users. We believe that products should be both powerful and easy to use—and we’re constantly working to deliver experiences that meet both of these criteria.

Enter MailChimp Pro, a set of enterprise-level features that enable MailChimp users to gain a more detailed understanding of their audience. Think of it as a set of data science tools for non-data scientists.

So how did we uncover which features were most important to our users? We started with one simple question:

Who The Heck Is Going To Use Pro, Anyway?

When the Data Science team began this research, we had a nebulous idea of what a MailChimp Pro customer would look like. Rather than moving forward on conjecture, we wanted to use our wealth of customer insights to shape our recommendations of not only what to build, but who we were building for. After all, once you understand who your users are, it’s easier to envision the types of tools they need. Our first order of business was determining how a MailChimp Pro customer differs from all other MailChimp users.

Fortunately, in addition to other research methods, we regularly survey users to learn more about how they use our app and how we can improve it. This was our first stop on the road to understanding what MailChimp Pro would look like.

Customer Segmentation, MailChimp Style

We combed through recent surveys to categorize user feedback by feature. For example, some users desired more advanced reporting tools. Others asked for the ability to stop a send after scheduling it. And a great many wanted additional A/B testing options. Each time we got feedback about a feature, we added it to a spreadsheet and tagged it for future analysis.

After categorizing responses from nearly 20,000 users, we had several customer segments we could compare to each other—and to the MailChimp population as a whole. We analyzed account data (list size, company age, user industry, etc.) for each segment to develop a clearer picture of users interested in each feature.
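The tagging pass can be sketched as keyword matching followed by a per-feature count. This is a deliberately simplified illustration; the keyword lists and function names are hypothetical, and the real analysis involved manual review, not just string matching.

```javascript
// Hypothetical sketch: tag each survey response with matching feature
// segments, then count responses per segment for comparison.
const FEATURE_KEYWORDS = {
  reporting: ['report', 'analytics'],
  stopSend: ['stop a send', 'cancel'],
  abTesting: ['a/b', 'split test'],
};

function tagResponse(text) {
  const lower = text.toLowerCase();
  return Object.keys(FEATURE_KEYWORDS).filter(feature =>
    FEATURE_KEYWORDS[feature].some(keyword => lower.includes(keyword))
  );
}

function segmentCounts(responses) {
  const counts = {};
  for (const response of responses) {
    for (const feature of tagResponse(response)) {
      counts[feature] = (counts[feature] || 0) + 1;
    }
  }
  return counts;
}
```

With segments built this way, each one's account data (list size, industry, and so on) can then be compared against the population as a whole.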

Segmenting our data, based on customer characteristics
