Measuring Performance with the End Outcome in Mind

[Image: Checklist by mykpwedding on Flickr]

In relation to a previous post, “Measuring Performance, ROI and The Production Line,” I would like to suggest that you take a moment to read this excellent post from Beth Kanter, “Get Your Social Media Strategy in Shape With Spreadsheet Aerobics.”

What is most impressive about this article is the approach she takes to data collection. Retrieving data is a cumbersome task, let alone making sense of it. She offers many useful tips, such as beginning with the end outcome in mind by “think[ing] through your content and engagement strategy… what [you want] to measure [and] set up an efficient method for collecting that data.” She also doesn’t try to measure everything that is potentially measurable; she selects only what is meaningful for measuring progress toward her end outcome.
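That idea of collecting only what maps to an end outcome can be sketched as a simple filter over a measurement plan. This is a minimal illustration, not Kanter's actual spreadsheet; the metric and outcome names below are hypothetical:

```python
import csv
import io

# A toy measurement plan: every metric must name the outcome it informs.
# Anything that maps to no outcome gets dropped before data collection starts.
plan = [
    {"metric": "newsletter_signups", "outcome": "grow an engaged audience"},
    {"metric": "event_rsvps",        "outcome": "convert readers to participants"},
    {"metric": "page_views",         "outcome": ""},  # no outcome named, so cut it
]

# Keep only metrics that are tied to an end outcome.
keep = [row for row in plan if row["outcome"]]

# Write the trimmed plan out as a small CSV, spreadsheet-aerobics style.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["metric", "outcome"])
writer.writeheader()
writer.writerows(keep)
print(buf.getvalue())
```

The design choice here mirrors the post: the filter runs before any data is gathered, so the collection effort stays trim by construction rather than by after-the-fact cleanup.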

Performance measurement isn’t only about gathering data; it is about making sense of that data and finding outcome indicators, rather than outputs, for what you are collecting. Rather than simply noting that our audience has grown recently, we seek to find out why and work to isolate what we did to make that happen. Is a larger audience helping us achieve our goal? For most, it is. But this is only one example of the point I am working to illustrate.

Performance measurement is more than filling the box with numbers and calling the job done. As Kanter notes, we also need to “mak[e] the time to actually look and think about what the data means.” I find that many programs overlook this part of the performance measurement process. They look at the data, as Kanter says, see an upward-pointing arrow, and assume that all is well with the program. Few venture deeper into the numbers. Perhaps several people unsubscribed and an equal number of newcomers offset those losses; perhaps those newcomers will soon unsubscribe for the same reason the earlier unsubscribers did. The numbers won’t tell you this at first glance. Gathering qualitative data can reveal the problem: unique responses from unsubscribers point to what needs to be ameliorated, so you can eventually alter the program to address it. Kanter writes, in reference to measuring her own Facebook Fan Page, “This is where I’ve gleaned most of my insights – a combination of quantitative metrics culled from [Facebook] Insights and what people are saying on the page.”
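The point about the upward-pointing arrow can be made concrete with a toy calculation. The subscriber counts below are hypothetical; the sketch only shows how the same net growth can describe either a healthy list or a leaky one:

```python
def summarize(new_subscribers: int, unsubscribers: int) -> dict:
    """Return the net change alongside the gross movement behind it."""
    return {
        "net_change": new_subscribers - unsubscribers,
        "gross_gained": new_subscribers,
        "gross_lost": unsubscribers,
        # Share of this period's gains that churn wiped out.
        "churn_ratio": unsubscribers / new_subscribers if new_subscribers else 0.0,
    }

# Two hypothetical months with identical net growth (+50).
healthy = summarize(new_subscribers=55, unsubscribers=5)
leaky = summarize(new_subscribers=250, unsubscribers=200)

# The net number alone (the "upward arrow") cannot distinguish them,
# but the second list is losing 80% of everything it gains.
print(healthy)
print(leaky)
```

Only the gross figures, paired with qualitative responses from the people who left, tell you whether the arrow is actually good news.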

In closing her post, Kanter asks a set of open-ended questions: “What are you learning from your… measurement strategy? How have you kept your data collection trim, fit, and actionable? What is the most compelling thing you learned about your… strategy through measurement that led to better results?” These questions should accompany any strategy for acquiring and analyzing data for any program. Further, listening, learning, and adapting is the name of the game in performance measurement. Don’t check the box because there is a box to check. Check the box because it helps you reach the end outcome you seek.

- Will
