Measurement and support for community, not you

Want to stir the pot amongst social media campaigners and their managers? Start a conversation with them (preferably when you’re all in the same room) about how they measure social media efforts. “Tell me,” for instance, “how you show the ROI of this work.”

Perhaps you’ll enter a coherent dialog on social media ROI across the organization (though we doubt it). If so, chances are good that the metrics discussed will be things like number of fans/followers/likes, number of comments, “people listening,” retweets and shares. You may get more programmatic correlations, such as the amount of money raised through Facebook or the number of people who clicked a Pinterest link, came to the website and subscribed to the email list.

These numbers, however, say little about the value our work is adding to the life of the person at the other end of that like.

Most metrics are about us

The thing is, our social media metrics (heck, even our email and web metrics) are almost entirely about us, the organization. We assess our value and power by the number of fans, followers and subscribers, as well as letters signed and cash in the door. And this informs our resource planning, staffing and program evaluation.

This is not terrible (at least not the cash-in-the-door part). These are informative data points if used in context.

But these are one-way relationship measurements. It’s as though we’re just in the business of selling shirts and all that matters is getting more customers in the door so we can sell more shirts.

But if I’m interested in sticking around as a company, I really want to know what people think of my shirts. How did it fit? Did it shrink in the wash? Did it fall apart? Do you love it? Will you buy another one? Has it done the job?

How do nonprofits measure the value they are bringing to people’s lives? How do we get beyond discussions of tactics for getting more likes, retweets and impressions and move to learning about what is impacting people and creating the power to change communities? 

We must be able to clearly state how what we do relates to people’s lives. We need to understand precisely how our work matters to people before we can measure how we have helped people change their lives.

If you provide a direct service such as meals to the elderly, job training, or a bed for the homeless then you can measure the amount of such service provided. When it comes to social media, look for measures that tie your use of social media as closely as possible to that service. How many people knew about or took advantage of help based on social media? Look at social media metrics but also at client data. How did your organization’s use of social media affect use of services by your clients or audience?

Organizations that provide primarily advocacy services have a trickier time measuring benefit to audience and as a result use primarily indirect metrics. They infer from likes and shares that the audience is valuing (or not) their social media. These organizations, too, should directly and regularly query followers/supporters about the impact of social media on their actions and views.

Culture of community support

But advocacy organizations could also more directly seek guidance from social media about what advocates need, want, or could use to help them be more effective. Social media (and email and the web) is an opportunity to have a direct conversation with the people who matter most to your work (and, no, we’re not talking about legislators or even your staff): your members, donors and activists.

Here are some guiding principles for helping your community:

  • Be deliberate about asking them what they need to be better advocates;
  • Provide what they ask for and test it, measure results and share feedback;
  • Be transparent with your community: share your intent and learning openly;
  • Identify and track people as they become more engaged (as well as the actions that they take to get there); and
  • Create a culture that encourages sharing advocacy stories, not just rants or odes of support. Instead of “great job” or “this guy stinks” we should strive to hear “this is what I did and here’s what happened.” (And if/when we get those stories, thank the people that share them.)

Focusing on how the community can become better advocates and supporters of one another will build and spread power, create longer lasting change, and take advantage of the interactive nature of current communications channels.

The Pitfalls of A/B Testing and Benchmarking

Improvement begins with measurement, but the ruler can also limit your audacity to try wildly new approaches (photo by Flickr user Thomas Favre-Bulle).
Google is famous for, among other things, crafting a deep, rich culture of A/B testing, the process of comparing the performance of two versions of a web site (or some other output) that differ in a single respect.

The benefit: changes to a web site or some other user interface are governed by real-world user behavior. If you can determine that your email newsletter signup button performs better with the label “Don’t Miss Out” instead of “Subscribe,” well, that’s an easy design change to make.
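To make that comparison concrete, here is a minimal sketch of the statistics behind an A/B test: a two-proportion z-test that asks whether variant B’s signup rate is genuinely better than variant A’s or just noise. The button labels and all the counts below are invented for illustration.

```python
from math import sqrt, erf

def ab_test(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical numbers: "Subscribe" (A) vs "Don't Miss Out" (B)
rate_a, rate_b, p = ab_test(conv_a=120, total_a=5000, conv_b=165, total_b=5000)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p-value: {p:.3f}")
```

A p-value below your chosen threshold (0.05 is conventional) suggests the difference is real; until then, keep the test running rather than declaring a winner on a hunch.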

The practice of benchmarking – using industry standards or averages as a point of comparison for your own performance – has some strong similarities to A/B testing. It’s an analytic tool that helps frame and drive performance-based testing and iteration. The comparison of your organization’s performance to industry benchmarks (e.g., email open rates, average donation value on a fundraising drive) provides the basis for a feedback loop.

The two practices – A/B testing and benchmarking – share a hazard, however. Because a culture of A/B testing is driven by real-time empirical results, and because it generally depends on comparisons between two options that are identical in every respect but one (the discrete element that you are testing), it privileges modest, incremental changes at the expense of audacious leaps.

To use a now-classic business comparison: while Google lives and breathes A/B testing, and constantly refines its way to small performance improvements, the Steve Jobs-era Apple eschewed consumer testing, assuming (with considerable success) that the consumer doesn’t know what it wants and actually requires an audacious company like Apple to redefine product categories altogether.

Similarly, if your point of reference is a collection of industry standards, you are more likely to aim for and be satisfied with performance that meets those standards. The industry benchmarks, like the incremental change model that undergirds A/B testing, may actually constrain your creativity and ambitiousness, impeding your ability to think audaciously about accomplishing something fundamentally different than the other players in your ecosystem, or accomplishing your goals in a profoundly different way.

The implication isn’t that you should steer clear of A/B testing or benchmarking. Both are powerful tools that can help nonprofits focus, refine, and learn more quickly. But you should be aware of the hazards, and make sure even as you improve your iterative cycles you are also protecting your ability to think big and think different about the work your organization does.

And if you want to dive in, there are a ton of great resources on the web, including a series of posts on A/B testing by the 37Signals guys (Part 1, Part 2, and Part 3), the “Ultimate Guide to A/B Testing” on SmashingMagazine, an A/B testing primer on A List Apart, Beth Kanter’s explanation of benchmarking, and the 2012 Nonprofit Social Network Report.

Google Analytics Tips and Resources

Google Analytics is free but getting actionable data from it takes a bit of planning and time. We wanted to highlight for you here some great resources and tips for using Google Analytics in your organization or business.

The sky is the limit (of course): you can spend almost endless amounts of time sifting through data and creating Google Analytics reports. At the end of the day, though, you need your website to help you achieve some pretty discrete tasks. These could be things like:

  • give your readers more of what they’re looking for;
  • make information easier to find;
  • get more visitors to subscribe to your email list;
  • get more visitors to buy your products or make donations;
  • get more visitors sharing links to your blog posts and videos on social media networks.

Google Analytics can help you see if you’re achieving these tasks and meeting the goals you set for your online program. But you won’t find this data by looking at just pageviews and visitor numbers.

Here are a few tips and resources to help you measure what matters and make more data-driven decisions using Google Analytics.


Measure what Matters

It goes without saying that data helps people make decisions. When deciding how much to pay for a house you want to know the sales price of comparable homes in the neighborhood. When looking for a place to go out for dinner you might look at the average star rating on Yelp.com. This data gathering is tied to a personal goal: paying the right price for your new home and finding a yummy dinner.

But many nonprofits gather and analyze vast amounts of web analytics data that doesn’t help make decisions and isn’t tied to organizational goals. Reports are prepared that talk primarily about pageviews from one month to the next. Detailed reports may display pageviews for specific pages instead of the full site. You may see the number of visits that come via Google and perhaps the search terms used.

Possibly, year to year goals for websites may discuss increasing overall pageviews by 10% or some other amount.

What’s missing? Data that ties to outcomes so that actionable decisions can be made about content, design, search optimization, advertising and so on.

Often, what I don’t see is leaders who know how to ask the right questions about web analytics and related online strategies. It is taken on faith that rising pageviews mean success. What’s missing are analytics tied to measurable outcomes driven by organizational goals. Decisions made this way are, as Avinash Kaushik calls them, “faith-based initiatives,” and faith is an unfortunate basis for resource decisions.

A smarter approach is to focus on Key Performance Indicators (KPIs), preferably those tied to program and organizational goals. This doesn’t need to mean an extra layer of data, analysis and review (aka more work for everyone). But it does mean agreement on asking the right questions and being willing to base tactical decisions on what the data is telling you. In other words, if your goal is to increase readership in California and your blog posts on California issues aren’t getting more pageviews then consider adjusting content.

Put another way, increasing web traffic has nothing to do (in and of itself) with your website meeting organizational goals. If 500,000 people a month view your site but nobody shares content, makes donations, comments on your posts, signs up for your email lists, buys your t-shirts, writes blog posts relating to your content or otherwise acts then are you any better off than you were when 50,000 people a month viewed the site and nobody did anything?

If you are better off then how do you know?

It’s that “how do you know” that counts, right? What are the actions that people are taking as a result of viewing and engaging with your content that matter and how do you measure those?

Perhaps a goal is to increase your reach in California or the west coast in general because you are working with key members of Congress on an issue that affects trade (or the environment, or whatever) in these states. You can home in on metrics for pageviews (preferably of a certain type of content) from that region. Are they rising during a key timespan? Better yet, can you identify why these pageviews are rising and whether the rise is tied to visits from AdWords campaigns, content in regional blogs you have contacted, or another action intended to drive more traffic?

In this case a KPI is still as simple as pageviews but it is tied to reach, specifically your organizational reach in an area key to a program goal.
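As a sketch of what tracking that regional KPI might look like, here is a short Python example. The records below are invented, standing in for a region-by-month pageview export from your analytics tool:

```python
from collections import Counter

# Hypothetical pageview export (e.g. a Google Analytics report broken
# down by visitor region and month) -- all numbers are made up.
pageviews = [
    {"region": "California", "month": "2012-04", "views": 1800},
    {"region": "California", "month": "2012-05", "views": 2300},
    {"region": "Oregon",     "month": "2012-04", "views": 900},
    {"region": "Oregon",     "month": "2012-05", "views": 950},
]

def regional_trend(records, region):
    """Sum pageviews per month for one region: the KPI is reach
    in the region that matters to the program goal, not total traffic."""
    totals = Counter()
    for record in records:
        if record["region"] == region:
            totals[record["month"]] += record["views"]
    return dict(sorted(totals.items()))

print(regional_trend(pageviews, "California"))
```

The point of the sketch: the raw metric is still pageviews, but filtering and trending it by the region named in your goal turns it into a KPI you can act on.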

Another key performance indicator helpful to nonprofit organizations is anything in the category of “conversion.” Are your site visitors doing something after seeing your content that (a) you can measure and (b) helps meet at least one of your goals? It is one thing to say that a blog post is getting 500 pageviews when similar posts used to get 300 pageviews. But it is rare that this tells you much about how (or even whether) you are moving the needle towards changing policy (or raising money or building an engaged constituency).

To measure a conversion metric think about what you want people to do after visiting the page. Should they go to a next page with more detailed or related content (a lightweight conversion but an indicator of interest)? Should they subscribe to an email list? Should they share photos or a link on Facebook? Should they comment on the post? Should they make a donation or purchase an item?

If visiting a page is point A then what is point B? Most analytics tools will help you attach values to that step B and/or let you see user navigation paths.
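Here is one way attaching values to point B might look, as a minimal Python sketch. The actions, counts and dollar-equivalent values are all hypothetical placeholders for whatever your organization decides each conversion is worth:

```python
# Assumed dollar-equivalent value of each "point B" action a visitor
# can take after reading a post -- these numbers are invented.
step_values = {
    "read_related_post": 0.10,   # lightweight conversion, interest signal
    "email_signup": 2.00,
    "shared_on_facebook": 1.00,
    "donated": 25.00,
}

def funnel_report(visits, actions):
    """For each point-B action, compute the conversion rate and the
    total attached value relative to visits at point A."""
    report = {}
    for action, count in actions.items():
        rate = count / visits
        report[action] = (rate, count * step_values[action])
    return report

report = funnel_report(visits=500, actions={
    "read_related_post": 60, "email_signup": 12,
    "shared_on_facebook": 25, "donated": 4,
})
for action, (rate, value) in report.items():
    print(f"{action}: {rate:.1%} conversion, ${value:.2f} attached value")
```

Even a toy report like this shifts the conversation from “500 pageviews” to “what those 500 visits actually produced.”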

Don’t be disappointed if conversion rates are low. Really low. Two percent can be considered a solid conversion rate. The reality is that people are looking for information, not wanting to be “converted” to what you need. It’s like door-to-door canvassing: tough, but results count, and you can learn from results and try to improve your technique. Test tweaks to design, page layout, headlines, related offers or content.

Getting to Stories with Metrics

Jeff Brooks is the man behind a great blog called Future Fundraising Now. In a recent post he discussed what performance metrics donors are looking for from the nonprofits they support.

Brooks’ thesis is that donors give primarily for emotional reasons and, while metrics aren’t irrelevant, donors aren’t seeking to connect with a cause or organization on a data-driven, analytical level. Stories about the people involved in and benefiting from the organization’s work (work funded by the donor) fuel the emotion that engages people to give, volunteer, fan organizations on Facebook and spread the word.

Yet Brooks doesn’t dismiss metrics one bit. With respect to stories he writes of good data gathering:

“You’ll get better stories. A system of gathering metrics will put you in contact with what donors really want: stories. And that leads to better fundraising.”

Metrics can tell a story about stories. Donors want stories.

But how do we learn about our storytelling from metrics? There is a ton of data out there. Too much. What can help guide us, inspire us to write better stories for our donors and others in our audience?

We can look at feedback metrics to gauge interest in our work. Some of these measurements also come with commentary that can give insight into the quality of the work. These metrics might be, for example:

  • pageviews of a blog post or other web page,
  • comments on a blog post, and
  • retweets, Facebook shares and other social media discussion.

If you don’t think that your content is generating the sort of reader numbers or discussion that you expect, it could be a sign that you aren’t telling good stories and engaging people in your work through the content. Growing pageviews and comments in a certain type of content or subject area could indicate that those stories interest your constituents and may be good topics for further stories and fundraising efforts.

What about programmatic metrics? We can look at data measuring the type, quality or quantity of programs the organization provides to flesh out stories about that work. Here it is going to depend on your work but tying programmatic metrics to the people (and places) you have helped will strengthen stories.

  • How many people have been trained at job training sessions? How many participants are now in the workforce? What is the story of one or more participants about their experience and the way their lives have improved?
  • How many meals were put on a table by your food bank? How many people have access to more local food with better nutritional value through your inner city slow food project? What are some of the stories from participants about how this has made them more independent or better able to feed their family?
  • How many people are receiving calls and emails encouraging them to attend a county hearing on natural gas drilling in the area? How many showed up? How many spoke? How many are now engaged in ongoing efforts to improve energy production in the area?

We work in a time when it is possible (almost literally) to drown in data that measures our performance. Your supporters don’t want to see it all. They will love you because you do great work that changes lives. Focus (for your donors) on telling good stories informed by solid metrics. If you want data for your accountant, that may be something altogether different.