Everyone seems interested in seeking out great ideas and success stories to apply to their own email campaigns.  But the most effective way to develop those “best practices,” repeat those successes and learn from those failures is to test them yourself!  That’s why the MobLab came together with peers in the non-profit world to discuss these very ideas as part of a conversation hosted by Change.org.

If you’re interested in testing out these ideas yourself and are willing to share the results with others, let the MobLab know and we can publish your post here.

Testing Ideas

1. Testing Segmentation: Segment email lists based on recent action or activity.

The goal of this longitudinal test is to segment supporters by how recently they took action and learn whether customized content increases their engagement. Some colleagues suggest starting by testing open rates, then response rates, against customized subject lines and content.  These metrics help determine which content works best for each segment.  From there, colleagues suggest implementing strategies to move supporters into a more active segment bucket. Tactics like these can show how supporters move through the engagement process toward the final action or goal (signing a petition, donating, etc.).  A code sketch of the bucketing follows the lists below.

Examples of segments based on recency of activity:

  • One month or less – exclude this segment from your testing, because these are “active” supporters.
  • 1-3 months
  • 3-6 months
  • 6-12 months

Testing tactics:

  • Unopened emails: test subject lines for supporters who are not opening
  • Opened emails but no clicks: test content (creative, layout, style) to increase response
  • Opened emails with clicks but no follow-through to the final action: test different landing page layouts to increase response
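To make the bucketing concrete, here is a minimal Python sketch of recency segmentation. The record fields (“email”, “last_action”) and the sample dates are hypothetical placeholders; adapt them to whatever your CRM or email tool exports.

```python
# A minimal sketch, assuming each supporter record carries the date
# of their most recent action. Field names are placeholders.
from datetime import date

supporters = [
    {"email": "a@example.org", "last_action": date(2014, 1, 10)},
    {"email": "b@example.org", "last_action": date(2013, 11, 2)},
    {"email": "c@example.org", "last_action": date(2013, 5, 15)},
]

def segment(last_action, today):
    """Bucket a supporter by how recently they last took action."""
    days = (today - last_action).days
    if days <= 30:
        return "active (exclude from test)"
    elif days <= 90:
        return "1-3 months"
    elif days <= 180:
        return "3-6 months"
    elif days <= 365:
        return "6-12 months"
    return "lapsed (12+ months)"

# Pin "today" so the example output is deterministic.
for s in supporters:
    print(s["email"], "->", segment(s["last_action"], today=date(2014, 2, 1)))
```

Each bucket then gets its own test: subject lines for the unopened group, content for the opened-but-not-clicked group, and landing pages for the clicked-but-not-converted group.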

2. Testing ways to re-engage: Using email surveys to reactivate ‘lapsed’ supporters.  

Colleagues note that surveys seem to have a higher response rate than traditional campaign emails.  As a result, some nonprofits are using surveys to re-engage lapsed supporters (defined as people who haven’t taken an action within the past 12 months).

Additionally, a donation ask on the “Thank You” page after a supporter completes the survey also gets a good response rate.  Other colleagues note, though, that the survey data itself tends not to matter to some organizations: the survey is viewed mainly as a tactic, a way to re-engage lapsed supporters.  It may be that respondents feel heard after completing the survey, and so feel more invested and ready to help by donating.

3. Testing senders: Is a real name, an organization name, or both more effective?

Almost everyone starts from the same place, listing the organization name as the sender.  However, personalizing emails so that a real name appears in people’s inboxes is known to increase open rates.  Colleagues suggest the next step for this kind of test is to see whether a person’s name combined with the organization increases open rates more than the name or the organization alone.
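The three sender variants come down to how the From header is formatted. Here is a minimal sketch using Python’s standard email library; the names and addresses are placeholders, not anyone’s real sending address.

```python
# A minimal sketch of the three sender variants to test.
from email.message import EmailMessage
from email.utils import formataddr

variants = {
    "organization": formataddr(("MobLab", "updates@example.org")),
    "person":       formataddr(("Wendee Parker", "updates@example.org")),
    "combined":     formataddr(("Wendee Parker, MobLab", "updates@example.org")),
}

for label, sender in variants.items():
    msg = EmailMessage()
    msg["From"] = sender          # this is the name supporters see in the inbox
    msg["To"] = "supporter@example.org"
    msg["Subject"] = "Sender test variant: " + label
    msg.set_content("...")
    print(label, "->", msg["From"])
```

Randomly splitting the list across the three variants, with everything else held constant, isolates the effect of the sender name on open rates.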

4. Testing email copy: Is your email too short or too long?

Most of the email best practice guides we’ve seen suggest shorter copy to accommodate busy readers, who are increasingly reading on mobile or on the go. But does message length really matter? Several of our colleagues were surprised to learn that their audiences respond better to longer, more detailed emails than to shorter, pithier ones.

5. Testing multiple calls to action: Providing a summary of campaigns underway to supporters.

Do audiences appreciate a longer, newsletter-style email that summarizes multiple causes and efforts?  Do multiple calls to action in this summary format generate a higher response because they let supporters choose how to engage with the cause?  As we talked through these questions, most colleagues mentioned they had not expected this layout and style of content to work, yet their results indicate these elements drive higher response rates.

6. Testing the unexpected: Does something unique or different in your subject line increase open rates?

One colleague mentioned adding a colon to the end of a subject line and seeing a higher open rate.  Many asked whether this was a fluke.  Does something out of the ordinary draw people’s attention and prompt them to open the email?  This generated interest from many in the room in testing the idea of placing something unique in the subject line.  A simple significance test, sketched below, is one way to tell a real lift from a fluke.
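One standard way to answer the “fluke or not?” question is a two-proportion z-test on the open rates of the two subject lines. This is a minimal sketch; the send and open counts below are made up for illustration, not results from the conversation.

```python
# A minimal sketch of a two-proportion z-test on open rates.
import math

def two_proportion_z(opens_a, sent_a, opens_b, sent_b):
    """Return the z statistic for the difference in open rates."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Variant A: plain subject line; variant B: subject line ending in a colon.
# These counts are illustrative only.
z = two_proportion_z(opens_a=412, sent_a=5000, opens_b=489, sent_b=5000)
print("z =", round(z, 2))  # |z| > 1.96 suggests the lift is unlikely to be chance
```

If the test comes back inconclusive, the honest next step is simply a bigger send: small lifts need large samples to separate from noise.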

Summary

From our conversation, these points sparked everyone’s interest in going back and testing one another’s ideas for themselves.  But the takeaway everyone agreed on is to never get complacent.  The “best practices” that worked for you in the past will not necessarily carry you forward.

The best piece of advice from the room is that you need to keep testing your assumptions and stop trusting your gut when it comes to knowing what works.

Wendee Parker is an Analyst with the Digital Mobilisation Lab. She works with the global organization to take Greenpeace’s testing and analytics practices to the next level, enabling Greenpeace and its allies to innovate more quickly, win bigger, and raise more money. You can reach Wendee at wendee.parker@greenpeace.org