DoSomething.org is one of the largest global organizations for young people and social change. They mobilize their members “to make the world suck less” by participating in campaigns addressing causes from poverty to violence to the environment.
DoSomething.org has access to data sourced from more than 5 million members, which is handled by their in-house agency TMI Strategy. Meredith Ferguson, Managing Director at TMI, says “Fight for the User” is the organization’s mantra, and “our constant reminder that activating young people is not about getting them to like our Instagram post, sign up for our cause campaign, or buy your product. It’s about asking with every interaction at every touchpoint: how do we provide value for them? How can we figure out what they want and meet those needs? How can we give them the calls to action they feel are valuable, even when those actions may not be our top priority?”
Early on, DoSomething.org learned a valuable lesson about equating YouTube video views with success. In the Harvard Business Review article “Know the Difference Between Your Data and Your Metrics,” authors Jeff Bladt and Bob Filbin present the following case:
“How many views make a YouTube video a success? How about 1.5 million? That’s how many views a video our organization, DoSomething.org, posted in 2011 got. It featured some well-known YouTube celebrities, who asked young people to donate their used sports equipment to youth in need. It was twice as popular as any video DoSomething.org had posted to date. Success! Then came the data report: only eight viewers had signed up to donate equipment, and zero actually donated.
What happened? We were concerned with the wrong metric. A metric contains a single type of data, e.g., video views or equipment donations. A successful organization can only measure so many things well and what it measures ties to its definition of success. For DoSomething.org, that’s social change. In the case above, success meant donations, not video views. As we learned, there is a difference between numbers and numbers that matter. This is what separates data from metrics.”
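To make that gap concrete, here is a minimal sketch (in Python; not from the article) of the conversion calculation that separates a vanity metric from a meaningful one. The counts come straight from the quoted case.

```python
# A minimal sketch of the vanity-vs-meaningful gap, using the figures
# quoted above (1.5 million views, 8 sign-ups, 0 donations).
views = 1_500_000   # YouTube views of the 2011 video
signups = 8         # viewers who signed up to donate equipment
donations = 0       # equipment donations actually made

signup_rate = signups / views
donation_rate = donations / views

print(f"Sign-up rate:  {signup_rate:.6%}")    # ~0.000533%
print(f"Donation rate: {donation_rate:.6%}")  # 0.000000%
```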
Bladt and Filbin also provide their take on vanity metrics versus meaningful metrics:
“In the business world, we talk about the difference between vanity metrics and meaningful metrics. Vanity metrics are like dandelions – they might look pretty, but to most of us, they’re weeds, using up resources, and doing nothing for your property value. Vanity metrics for your organization might include website visitors per month, Twitter followers, Facebook fans, and media impressions. Here’s the thing: if these numbers go up, it might drive up sales of your product. But can you prove it? If yes, great. Measure away. But if you can’t, they aren’t valuable.”

[Image: Pregnancy Text campaign. © DoSomething.org. Retrieved from the DoSomething.org website, March 12, 2017]
For DoSomething.org, meaningful metrics prove that a campaign has engaged its members in social change. In her article “How DoSomething uses data to change the world,” Beth Kanter discusses DoSomething.org’s popular “Pregnancy Text” campaign.
As described by Kanter, “[Pregnancy Text] was a text campaign where teens opted-in to receive texts on their mobile phones from the “baby.” Once they joined (and they could share it with their friends), they received regular annoying text messages at all hours from the “baby” that poops, cries, and needs their immediate attention.”
Here are some topline metrics:
- 101,444 people took part, with 100,000 text babies delivered.
- There were 171,000 unsolicited incoming messages, or one every 20 seconds for the duration of the campaign.
- For each direct sign-up, DoSomething.org gained 2.3 additional sign-ups from forward-to-a-friend functionality.
- 24% of teens could not finish a day with their text baby (texted a stop word to the baby).
In and of themselves, these numbers did not demonstrate that DoSomething.org achieved its objective of effecting social change. To do that – and to source insights to guide future campaigns – other metrics were needed.
- By A/B pre-testing message content and frequency, DoSomething.org was able to optimize both the number of messages the “baby” would send and what those messages said. (This also led to building in a response system so the baby would reply when a teen texted something unsolicited.) A minimal sketch of this kind of comparison appears after this list.
- By testing various group sizes, they found that a group of 6 (1 person asking 5 friends to take the challenge) led to the highest overall engagement.
- By monitoring engagement by communication channel, they found text messaging was 30 times more powerful than email in getting their members to take action.
- By conducting a follow-up survey, they discovered that 1 in 2 teens who participated said they were more likely to talk about the issue of teen pregnancy with family and friends. This result aligned with survey data from the National Campaign, indicating that 87% of young people felt it would be easier to delay sexual activity and avoid teen pregnancy if they were able to have more open, honest conversations about these topics. Ultimately, the campaign was a success because it helped start those conversations.
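DoSomething.org does not publish the raw numbers behind these tests, but a sketch of the kind of two-variant comparison described in the first bullet might look like the following. The counts here are hypothetical, used purely to illustrate the mechanics of comparing sign-up rates between two message variants.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts for illustration only -- the campaign's real test
# data is not published in the article.
rate_a, rate_b, z, p = two_proportion_z(successes_a=480, n_a=5000,
                                        successes_b=610, n_b=5000)
print(f"Variant A: {rate_a:.1%}  Variant B: {rate_b:.1%}  z={z:.2f}  p={p:.4f}")
```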
Lessons for Others
In the Harvard Business Review article cited above, the authors offer the following words of wisdom: “Organizations can’t control their data, but they do control what they care about… Good data scientists know that analyzing the data is the easy part. The hard part is deciding what data matters.”
To determine what social media metrics are meaningful to your organization, start with a clear understanding of the strategic goals you want to achieve through your campaigns. The metrics most likely to be of value are those that demonstrate you have moved closer to – or attained – those goals.
Before launching large-scale social media campaigns, consider pre-testing with a small sample and using those metrics to optimize things like channel(s), content, and frequency of messaging. When appropriate, consider conducting post-campaign surveys to gain deeper insights into impact.
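As a hypothetical illustration of that pre-testing advice, the sketch below ranks small-sample variants (channel, copy, or frequency) by sign-up rate. The variant names and counts are invented for the example and are not drawn from the campaign.

```python
# Illustrative pre-test summary: given sends and sign-ups per variant,
# rank variants by conversion rate to pick what to scale up.
# Variant names and counts are hypothetical.
pretest = {
    "sms_daily":    {"sent": 2000, "signups": 260},
    "sms_weekly":   {"sent": 2000, "signups": 190},
    "email_weekly": {"sent": 2000, "signups": 40},
}

ranked = sorted(pretest.items(),
                key=lambda kv: kv[1]["signups"] / kv[1]["sent"],
                reverse=True)

for name, counts in ranked:
    rate = counts["signups"] / counts["sent"]
    print(f"{name:12s} {rate:.1%}")
```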
Organization:
DoSomething.org
Industry:
Non-profit (social change)
Name of Organization Contact:
Jeff Bladt, Chief Data Officer
Authored by: Anna Borenstein
References
DoSomething.org (2017). Who We Are. Retrieved from https://www.dosomething.org/about/who-we-are-0
TMIStrategy.org (2017). We Get Data. Retrieved from https://www.tmistrategy.org/#we-get-data
Bladt, J. and Filbin, B. (2013). Know the Difference Between Your Data and Your Metrics. Retrieved from https://hbr.org/2013/03/know-the-difference-between-yo
Ferguson, M. (2016). Fight for the User. Retrieved from https://blog.tmistrategy.org/fight-for-the-user-84345ef0c2ca#.wgy0hghy7
Kanter, B. (2012). How DoSomething uses data to change the world. Retrieved from http://www.socialbrite.org/2012/10/15/how-dosomething-uses-data-to-change-the-world/