
Data Equity Principle 5

Question default methods and assumptions for data collection and analysis, and triangulate quantitative data with other sources.

Description

Data users must critically examine their methods and assumptions when collecting and analyzing data to ensure they do not inadvertently reinforce historical biases, deficit narratives, or power imbalances. Modern data collection and research methods are rooted in legacies of racial power imbalances and exploitative practices. Lasting effects of these legacies include maintaining whiteness as the standard to which other groups are compared (for example, reporting Black-White and Asian-White gaps in outcomes) and over-relying on quantitative data without considering qualitative, contextual factors, which can perpetuate stereotypes. Data teams that lack racial and ethnic diversity and varied life experiences, including experiences close to the community at the center of a data project, may carry inherent biases. This makeup can lead to misleading research questions, uneven power sharing, and assumptions about which data are “meaningful.” By triangulating quantitative data with qualitative information and reexamining personal and institutional biases, data users can mitigate these risks.


Quantitative data alone are insufficient to capture the full picture of a community’s experiences. Though often seen as objective, quantitative data can reflect the biases of the researchers and administrators who design data collection instruments and of the individuals who report the data (such as teachers and police). Relying solely on quantitative data can also strip the analysis of pertinent institutional factors that reveal critical information. Using qualitative methods in addition to quantitative methods can more adequately capture why and how disparities exist, including their root causes. Qualitative data sources include focus groups, interviews, observations, and long-form surveys. In some projects, it can be appropriate to employ community-based participatory research (CBPR), a model that challenges traditional research structures. CBPR prioritizes collaboration between data users and the community through equal partnership. Whatever methods data users choose, they must ensure data collection instruments are clear and unbiased and speak to the experiences of community members, piloting questions and revising them accordingly.


The racial, socioeconomic, and cultural identities of data users implicitly influence the research questions they seek to answer, the way in which they collect data, and the methods through which they analyze and report them. Before a project begins, data teams should consider their team dynamics and characteristics and examine their individual and group implicit biases, for example, by using tools like the Implicit Association Test or through intentional reflection on how the team’s experiences and motivations might differ from those of the priority population. In doing so, team members with less dominant identities should be able to opt out of potentially harmful spaces. Uncovering, acknowledging, and addressing personal and institutional biases at the outset can guide the team’s approach to each phase of the data life cycle. For example, if a project involves employment data, the team can assess whether bias exists in its definition of “valid” employment and adjust data collection or analysis plans to make the inquiry more inclusive. Exhibiting cultural competency and including a diverse team of data users with experiences proximate to the priority community increase the accuracy and ultimate benefit of the data work.

Child Trends initiative with PBS Kids 

A 2019 Child Trends initiative with PBS Kids sought to develop family engagement programs in four communities. To ensure program designs were rooted in community needs, Child Trends launched a community assessment study as a first step. The team held an open discussion to consider how its experiences differed from those of the communities it planned to interview, including how bias might influence proposed interview questions. The team then repositioned interview questions to lead with the existing strengths in family engagement efforts rather than gaps or weaknesses. Next, to challenge the norm of centering White, middle-class experiences and values as the standard for family engagement, the team employed a “360 approach” to understand the priorities in schools across the four communities. This approach involved interviews with educators, parents, and leaders of family groups. The strategy ensured the team did not default to a single approach that might not serve each community.

Applying this Principle

Key phases for this principle, with example applications:
Context-setting

At the outset of a data project, conduct an implicit bias test or group reflection activity among the proposed team to identify individual and institutional biases and discuss ways to mitigate them throughout the project’s life span. To increase cultural competency, learn about the history, power structures, and systemic barriers that exist in priority communities, as well as the community’s prior experiences with data collection efforts. Continue questioning biases and assumptions in each subsequent phase.

Planning

Ensure data teams reflect diverse lived experiences, particularly the experiences at the center of the data project. Consider which type of data collection or research model the project is proposing: traditional, community-engaged, or full community partnership. Examine whether the proposed approach and metrics embed any assumptions about the partner community or place undue burden on its members. Pilot all data collection instruments, both qualitative and quantitative, with community members to ensure the instruments are culturally aligned to capture accurate and reliable data.

Collection

Employ qualitative methods, such as interviews, focus groups, town halls, narratives, or long-form surveys, to triangulate quantitative data. Gathering data from a wide variety of sources strengthens the analysis and can validate, contextualize, or challenge quantitative findings.

Analysis

Carefully consider whether findings perpetuate or reinforce a negative stereotype or deficit narrative. If findings neglect meaningful institutional or systemwide factors, consider how community input might supplement the evidence to give a fuller picture.


Reflection Questions

  • What assumptions are built into the proposed data collection or analysis approach?
  • Is the data team reflective of and close to the community whose data are being collected? If not, has the team conducted an implicit bias exercise or group reflection?
  • Have there been past efforts to examine the disparity in question? Can you draw on those efforts and supplement quantitative data with qualitative exploration?
  • Have you piloted research instruments or data collection prompts with members of the priority community? Do the instruments reflect assumptions about the priority community? Can they be repurposed using asset-based framing?

Be On The Lookout

Publicly available quantitative data sets often report measures of compliance, such as arrest and suspension rates. These “simple” measures may be cheaper and easier to collect, but they can perpetuate stereotypes and deficit narratives if not analyzed with care. Data users should think carefully about the metrics they choose and consider whether they are defaulting to data that happen to be available, even if the resulting metrics are not as meaningful for the project’s goals. When possible, data users should gather input from community partners when selecting data for collection and define metrics using asset-based framing. If the project must use a “simple” measure that relies on available data, data users should supplement it with other data points, including qualitative data, to aid interpretation.

Additional Resources

  • The Equitable Evaluation Framework. The Equitable Evaluation Initiative’s site offers a framework of principles to align evaluation practices with an equity approach, along with a suite of resources, reflection tools, and examples to help data users apply these principles.
  • Why Am I Always Being Researched? This Chicago Beyond guide offers ways to authentically partner with and engage community members in selecting approaches and methods for data collection and analysis. The section “For Researchers” (p. 62) discusses specific probes to challenge internal and institutional biases in default methods.
  • Making Racial Equity Real in Research. This resource from the Greenlining Institute outlines promising and problematic practices throughout the data life cycle. The sections “Methodologies, Data Collection and Analysis Can Perpetuate Inequities” (p. 14) and “Lack of Cultural Competency of Researchers” (p. 15) caution against pitfalls and offer promising practices when launching data collection initiatives.
  • How to Embed a Racial and Ethnic Equity Perspective in Research. This Child Trends resource introduces a model for data collection through the lens of five equity principles, including that “researchers should examine their own backgrounds and biases.” In addition, it offers guidance on qualitative and quantitative data collection and analysis.
  • Community Based Participatory Research. Chapter 36 of this University of Kansas guide on evaluation outlines principles and practice guidance for engaging in CBPR, an alternative to traditional research.

References

The framework's recommendations are based on syntheses of existing research. Please see the framework report for a list of works cited.