
Avoid Three Common Pitfalls When Using Data

I’m sure you’re familiar with the phrase “too much of a good thing.” Take data, for example. You knew I would. The business sector loves statistics, and lots of them. And the social sector is quickly following suit. As a collective, we’ve even given it the name Big Data, because volumes of data must mean something larger than life.

Elena Harman, PhD, is the author of The Great Nonprofit Evaluation Reboot: A New Approach Every Staff Member Can Understand (2019, CharityChannel Press).

Don’t get me wrong. As an evaluation consultant, I’m a fan of measurement and information, but only when it follows a thoughtful, intentional approach, involves those most directly affected, and draws on a balanced mix of data sources. I recently read an article by Simon Rodberg, the founding principal of District of Columbia International School (DCI), a public charter middle and high school. Rodberg found himself on the wrong end of too much information—so much so that he’s become an advocate for reforming data-driven approaches.

Long story short, Rodberg inherited the education field’s thirty-year belief that more student testing will produce the results we need to improve performance. Under that belief system, teachers set aside “Data Days” for analyzing end-of-year and midyear exams, interim assessments, teacher-created and computer-adaptive tests, surveys, attendance records, and behavior notes. Today, principals who subscribe to that philosophy have gotten lost in the abundance of numbers.

Rodberg’s call for reform is a sound one. Statistics and numbers are not inherently bad, but the education system has lost sight of the best way to use them. Somewhere along the way, the system arrived at a more-is-better philosophy. Unfortunately, more data usually means less intentionality, which makes accountability the focal point. And when accountability becomes the center of attention, you stray from opportunities to learn from data.

Three Universal Pitfalls

Chart: Three universal pitfalls the education system is experiencing with data.

Here are three pitfalls the education system is experiencing with data. To be clear, these are not unique to education. I see these three issues again and again across systems in the public and social sectors trying to use more data:

Looking Outside the Immediate Context to Determine Questions

Evaluation works best not when it’s mandated by policymakers and administrators but, rather, when teachers on the ground are enlisted to identify the questions that, if answered, would help them do their jobs better.

Using Only One Data Source

When we think of data narrowly as only what can be quantified, we’re missing half the story. Dashboards and other quick analyses that focus exclusively on tracking figures because they’re easy to capture leave out qualitative information and other equally important answers that inform growth.

Overemphasis on Accountability

In this example, a misguided approach to and relationship with data has failed kids. Data is a powerful tool only when embedded in a system of ongoing improvement. The No Child Left Behind Act focused on using data as a tool for accountability without giving schools the skills and capacity to learn from data. With that one-sided approach, we are left admiring the problems in education instead of learning how to fix them.

Focus on Three Priorities

Like Rodberg, I'm not ready to give up on data, so let’s consider downsizing the volume of metrics, choosing the right mix of data sources, and refocusing our efforts on letting those directly involved—in this case, teachers—identify where they genuinely need support. Remember that measurement is only as useful as the extent to which it informs strategy.

Chart: Three priorities to a sound strategy

Let’s recap: A sound strategy is formed by effectively addressing three focus areas: (1) empowering those directly involved to form the questions that genuinely drive change, (2) applying a mix of data sources that fit your key questions, and (3) embedding data in a strategy aimed at nurturing growth rather than blame.

Rodberg’s story has universal applications in the social sector. No matter your cause or your nonprofit structure, you can draw lessons from the list above. If you can keep an eye on these priorities, you’ll avoid drowning in a sea of unhelpful data and, instead, find yourself buoyed by answers to key questions that inform good decision-making.

Getting Out from Under the Demand for More Data

So far, we’ve examined a school system where the principal was drowning in a sea of data, which was distracting everyone from the key questions that teachers really needed answered. We presented three common pitfalls that many nonprofits have to overcome. Now we get the chance to see how a large member-driven association overcame the blunt pursuit of “more is better” and, instead, created a measurement strategy that genuinely served its members. Let’s get acquainted with Businesses Unite and learn from their approach.

Businesses Unite

Meet Brandon, the vice president of communications and marketing for Businesses Unite, a five-thousand-member association. Brandon had previously worked for an advertising agency with a large budget to support sophisticated communications efforts. But at Businesses Unite, he had to do more with less. Instead of communicating a single message to a single target audience with a large budget behind him, Brandon was charged with communicating about the association’s many programs to members, potential members, and policymakers with a budget that was less than 10 percent of what he was used to.

After three years of trying everything he could think of to serve all the core messages to all the target audiences well, he was ready to throw in the towel. Internally, program directors said their attendance wasn’t what it should be because Brandon’s team wasn’t promoting their events enough. Externally, members said they received too many communications from Businesses Unite; unsubscribe requests to the email list were skyrocketing.

Between a rock and a hard place, Brandon looked to the annual membership survey for help. Could members help Businesses Unite prioritize messaging to the areas that they cared most about? Brandon sat down with Anita, Businesses Unite’s chief operating officer, to discuss that year’s membership survey. The nonprofit had always done the membership survey in-house. Looking back, it was clear that the survey did not generate much useful information to inform a communications strategy.

Then Anita remembered meeting me at one of Businesses Unite’s events and reached out to see if I could help. I worked with the full executive team to articulate key evaluation questions for the membership survey. And to Brandon’s delight, one of the key evaluation questions that rose to the top was “What are the most and least effective communication vehicles for Businesses Unite to communicate its value to members?”

With that question and others in mind, I transitioned the membership survey to a “membership feedback process,” including both a survey and a series of follow-up focus groups. The survey confirmed what Brandon already knew: members were not taking advantage of the many programs Businesses Unite made available. And the large number of emails they already received from the nonprofit irritated members—to the point of reducing the perceived value of their membership.

But, like previous years, the survey did not uncover useful suggestions to improve and prioritize messaging. A survey was not and had never been the right instrument to answer that question. The focus groups were another story. My team, in collaboration with Brandon and Anita, developed an interactive focus group approach. The first step asked the group to develop a list of what participants would like to hear about from Businesses Unite.

What did they care about? What did they want information about? This step revealed many of the same things we knew from the survey—that people had “favorite” events and that they generally wanted to network and learn. Not much new information here. But the second step dug deeply into why the group wanted to know about each group of items. What about the content mattered to them?

Here’s Where the Magic Started

Chart: How it comes together

By pushing the group to articulate why they wanted to hear about certain events, we began to gain a more generalizable understanding of what members valued in Businesses Unite’s communications. Instead of sending out an email about each networking event because members liked networking, we started to see the components and goals of networking that resonated with members.

And in the last step, we asked the group to prioritize just three areas they wanted to receive communications about. Brandon expected the usual pattern to emerge: members want such different things from Businesses Unite that the priorities would not converge. But the opposite happened. Once the group had articulated the why behind their content requests, a clear set of priority areas emerged.

By using evaluation intentionally, with a clear focus on the things he needed to know most, and by engaging the right methods to answer those key questions, Brandon finally had the information he was so desperate for. With the findings from the focus groups, Brandon crafted a new communications strategy. Armed with clear feedback from members, he was able to push back on program directors’ requests for more communications. He was able to reduce the number of email communications while increasing their alignment with member feedback. And slowly but surely, the complaints about email volume decreased while perceived value increased.

Themes You Can Use in Your Own Nonprofit

Below are some themes from the Businesses Unite story that you might consider replicating in your own nonprofit:

Empowering Those Directly Involved to Form Key Questions

Key evaluation questions help you specify which aspects of a program will be evaluated based on what you most want to know. The Evaluation Center at Western Michigan University hosts a series of evaluation checklists to help you stay on course with question creation. Without key evaluation questions, you’re rolling the dice. If you are not clear up front about your learning expectations, you are letting whoever implements the evaluation make those decisions, and you may not end up with the information you need at the end of the day. This pattern of unclear expectations leads the evaluator to make assumptions and produces an end product that doesn’t address what you care about. It’s what makes evaluation feel useless so much of the time.

Giving those who are more removed from a program the power to assume what questions need to be asked is also incredibly risky and may cause your nonprofit a lot of angst and cost it resources in the long run. The fact that the programming team viewed Brandon’s department as playing a supporting role didn’t help matters. Program directors presumed to know the membership’s questions, and their solution was that more communication is better. Brandon was able to break the cycle by using members’ focus-group answers to his key evaluation questions rather than simply responding to the program staff’s requests for more email communications, which were causing inbox fatigue.

Using More Than One Data Source

Deciding on a data-collection method is as important as starting with the right key evaluation questions. If you miss the mark on how to collect information, you’ll find yourself beating the same drum over and over without ever creating a pleasing rhythm. In Brandon’s case, sending the ubiquitous survey year after year wasn’t generating any clear ideas for the staff. Only when we coupled the survey with a series of follow-up focus groups did we discover key findings that ultimately saved staff effort and resources. What’s more, the initial survey provided quantitative information, which spoke to the “scale and scope” of Brandon’s challenges, while the focus groups opened up qualitative “how and why” findings about what members really wanted from Businesses Unite.

Chart: Quantitative Data and Qualitative Data

A common dichotomy in research methods is the qualitative-quantitative distinction. This split is useful for thinking about which broad categories of methods might fit best. But it’s had a terribly negative side effect of introducing a false hierarchy, where quantitative data is perceived as being superior to qualitative data. And as a result of this perception, well-done qualitative data is dramatically underused in program evaluation. It’s a shame, because nonprofits are fundamentally about people, and qualitative data is what helps the stories come alive in evaluation. So, let me clear this up right away: quantitative data is not inherently “better” or “more rigorous” than qualitative data. Both quantitative data and qualitative data can be high quality and rigorous when used to answer key evaluation questions they are well suited for—and when executed according to best practices for each method.

Embedding Data in a Strategy Aimed at Nurturing Growth Rather Than Blame

Nonprofits have much more to gain by assuming a learning-driven stance rather than a data-driven one, and this is the basis of my approach toward evaluation. A focus on improvement empowers nonprofits to prioritize program adjustments and incremental growth. Initially, Brandon felt like he was at the center of the blame game between the directors’ demand for ceaseless communications and the members’ outcry over full in-boxes about benefits that didn’t interest them. Thankfully, Anita and Brandon recognized that if they continued to gather data for data’s sake, they would ignore member attrition and slowly lose their membership base. Instead, they had an eye on the association’s growth and how to genuinely serve its members. With learning as a guiding principle, Businesses Unite regained credibility and the loyalty of its members.

Businesses Unite is a prime example of a typical nonprofit that believes if it asks enough times, the obvious answer will eventually surface. Fortunately, Anita and Brandon saw that if their measurement methods continued the way Rodberg’s did at DCI, a climate of blame would persist without generating the answers they needed. Take a look at the data you may be collecting in your own organization and ask yourself whether it answers the questions you genuinely care about. If not, what circumstances or decision makers need to change to decrease volume and increase intentionality? As we learned from Rodberg’s DCI and from Businesses Unite, putting those most directly involved in or impacted by your program at the center is a great place to start.
