Impact
Are We Tracking the Right Data? What’s Measured Is What Gets Done in Social Impact
If it’s true that what’s measured is what gets done, then data can be a powerful lever for change. In the social impact realm specifically, equitable data collection and evaluation practices can be transformative in advancing equity.
That was the topic of a recent Tides Corporate Impact Leaders Forum panel, Centering Equity in Data Design. Guided by moderator Alexandra Robinson, data ethicist at threshold.world, panelists shared the ways in which they’re doing just that: rethinking and reshaping data practices in service of equitable outcomes. The conversation elevated examples of data equity in action in social impact work; articulated lessons learned; and emphasized the vital roles that strong and intentional relationship-building, critical investigation, and distribution of power play in working toward equitable data design.
“Historically, in the social sector, in philanthropy, it’s been a one-way black box of how information is being used,” said panelist Drew Payne, founder and CEO of UpMetrics. “We now have a window to rethink that piece of it.”
Panelists spoke to the importance of integrity in data design, collection, and evaluation, and—perhaps most fundamentally—in the quality and intentionality of relationships between evaluators and the people or organizations being evaluated. In sharing what she and her colleagues learned upon embedding an inclusion index into their annual employee engagement survey, Tiffany Apczynski, vice president at data analytics and software company Alteryx, emphasized the role that integrity of approach plays in ensuring integrity of data. When the company’s survey returned a 70 percent satisfaction rate among employees, instead of presuming that things must be going well, Apczynski’s team saw an opportunity to probe further.
When Alteryx dug deeper, the company discovered that satisfaction was far lower among employees who are underrepresented in the workplace. Apczynski described this experience as her team’s “first step in using data to build the case that we need to really invest in DEI and approach it in a way that favors the smallest number of voices versus the largest.” Robinson reflected, “When we look at top-line survey results, they can tell one story, but there are so many blind spots that we have to uncover.”
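For readers who work with survey data, a minimal sketch of the kind of disaggregation Apczynski describes follows. The figures, column names, and group labels are entirely hypothetical and not drawn from the panel; the only point is that a healthy top-line rate can coexist with a much lower rate among underrepresented respondents.

```python
# Hypothetical illustration: a single top-line satisfaction rate can mask
# large gaps that only appear once results are disaggregated by group.
import pandas as pd

# Made-up survey responses: 1 = satisfied, 0 = not satisfied.
responses = pd.DataFrame({
    "group": ["majority"] * 80 + ["underrepresented"] * 20,
    "satisfied": [1] * 60 + [0] * 20 + [1] * 8 + [0] * 12,
})

overall = responses["satisfied"].mean()                    # 0.68 top-line rate
by_group = responses.groupby("group")["satisfied"].mean()  # 0.75 vs. 0.40

print(f"Top-line satisfaction: {overall:.0%}")
print(by_group.map("{:.0%}".format))
```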
Mala Kumar, who directs Tech for Social Good at GitHub, addressed the need to dig deeper to define standardized metrics for international development, public policy, and economic research. “In tech, we tend to talk about edge cases quite a lot,” Kumar said, referring to extreme or exceptional cases. “But I think if you’re looking at data equity—or any kind of equity—the edge case actually tends to be the thing that happens more often than we think.”
Acknowledging such limitations of conventionally sourced data is just the first step. Once Alteryx demonstrated the need to create more equitable conditions, the company brought in a consulting firm specializing in DEI to guide it through next steps. The initial focus was on targeted research with underrepresented employees and identification of gaps in data collection. Those research learnings were presented to company leadership, who supported the creation of an informed and intentional action plan.
While none of this work was without challenges, Apczynski said the support of an outside consultant was crucial, and that the trust developed among staff was a lasting gain. “Everyone was very vulnerable,” she said. “And even if you were on other sides of an argument, the desire to understand one another was there.”
Jason Saul, executive director at the Center for Impact Sciences at the University of Chicago, and founder and CEO of Mission Measurement, highlighted a concerning and consequential disconnect that compromises the mutual understanding Apczynski elevated.
On the topic of food insecurity data, for example, Saul said, “All they ever do is ask people how hungry they are. Why aren’t we actually asking people what they need to not be food insecure? What interventions will be most effective? What will move the needle for you?”
Different answers arise from asking folks to identify their needs for themselves. “What beneficiaries said was, ‘We don’t actually want free food or meals,’” Saul said. “Top beneficiary-stated need is extra money to help pay for food or milk. And then it goes on and on until something like number six was ‘enough free food to last a few days.’”
Building Relationships
Payne’s work with The Schultz Family Foundation in Seattle, addressing food insecurity as part of its COVID-19 response, provided another example of more equitable data design. To guide the work, his team asked: “How do we get capital out to communities as fast and effectively as possible, and loop in voice within the community, leveraging public datasets in terms of demographic data, income data, and then aligning on what success looks like?”
The path to the most equitable results has to start with relationships—a priority that all panelists emphasized. The Schultz Foundation, according to Payne, “had deep relationships within the community and did their research around the issue that they were taking on, which positioned them to orient towards mission- and vision-aligned success metrics.” What it took to get here, Payne said, was “upfront conversations across a variety of stakeholders to start with the why. What are we looking to accomplish within communities?”
Saul emphasized the need for donors and funders to put trust in the organizations they’re funding for the simple reason that those organizations represent the communities they serve.
Funders can “trust [organizations] that they know how best to solve the problem,” he said. “That also carries over on how to give money: ‘We are investing in buying an outcome and giving the organization the latitude to figure out how to best, most effectively, deliver that intervention.’”