
Data Equity Principle 2

Protect the privacy of individuals who provide data while ensuring appropriate ownership and access to information.

Description

Data privacy policies protect the right of individuals to maintain control over their data. They include a combination of federal, state, and local laws—including the Family Educational Rights and Privacy Act (FERPA)—and institutional policies. Most policies focus on protecting personal information (including information that is important to an individual even if it does not personally identify them) and on regulating data access and use, thereby limiting the emotional, financial, and even physical harm that can result from data privacy breaches. Although privacy considerations are critical, it is also important to understand and honor data ownership: data users must acknowledge that data providers are data owners who consent to the use of their data.

Data privacy policies have evolved in recent years to better reflect that data systems do not “own” data; the people whose lives are represented in them do. In 2018, the European Union’s General Data Protection Regulation took effect, giving European residents the right to know, access, update, erase, and restrict the types of data collected on them. Since 2020, the California Consumer Privacy Act (CCPA) has required businesses (including for-profit education service providers and for-profit universities) to obtain parent or guardian consent before collecting data from California’s children and to delete data upon request, among other requirements (the CCPA has inspired similar laws in other states). A common feature of these laws is that they grant individuals the ability to update, delete, or opt out of all or specific applications of their data at any point during or after collection. Even if not mandated by law, E-W data systems should have a clear process for accepting these requests and clear guidelines for honoring them.
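
As an illustration of what such a process might look like in practice, the hypothetical Python sketch below logs rights requests (access, update, delete, opt out) so they can be verified and fulfilled on a documented timeline. All names and fields are illustrative, not drawn from any specific law or system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Literal

# The rights described above: individuals may access, update, delete,
# or opt out of uses of their data. All names here are illustrative.
RequestType = Literal["access", "update", "delete", "opt_out"]

@dataclass
class DataSubjectRequest:
    requester_id: str
    request_type: RequestType
    details: str = ""
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "received"  # received -> verified -> fulfilled

def log_request(request_log: list[DataSubjectRequest], req: DataSubjectRequest) -> None:
    """Record the request so it can be tracked to completion on a documented timeline."""
    request_log.append(req)

# Example: a student asks that their record be deleted.
request_log: list[DataSubjectRequest] = []
log_request(request_log, DataSubjectRequest("student-123", "delete", "Remove my advising record"))
```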

With growing interest in using student data to drive decision-making and to support the development of new education technology, including artificial intelligence (AI), comprehensive data and AI governance (see Principle #8) is critical to protect students’ privacy. Leaders must implement robust policies so all data users can operate in ways that safeguard privacy, enhance data security, and ensure the ethical use of data and AI in educational settings. Governance should include transparent guidelines for AI use, with continuous assessments to identify and mitigate biases and discriminatory impacts. This could include ensuring that students’ data are not used in training sets for AI tools without their consent and are not input into open-source AI tools or projects, developing quality standards for evaluating the evidence on AI in education, and clarifying how AI can be used in educational settings.
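
For example, one of the guidelines above—keeping students’ data out of AI training sets absent consent—can be made operational with a simple check before a training set is assembled. The Python sketch below is a minimal illustration under assumed field names (such as a `consent_ai_training` flag), not a complete governance solution.

```python
# A minimal sketch of one governance check described above: exclude any student
# record from an AI training set unless consent for that specific use is on
# file. The field names (including the consent flag) are illustrative.
def build_training_set(records: list[dict]) -> list[dict]:
    """Keep only consented records and drop direct identifiers before use."""
    consented = [r for r in records if r.get("consent_ai_training") is True]
    return [
        {k: v for k, v in r.items() if k not in {"name", "student_id"}}
        for r in consented
    ]

records = [
    {"student_id": "s1", "name": "A.", "score": 88, "consent_ai_training": True},
    {"student_id": "s2", "name": "B.", "score": 71, "consent_ai_training": False},
]
print(build_training_set(records))  # only s1's deidentified record remains
```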

The real risks of data breaches

The Government Accountability Office (GAO) discovered 99 data breaches in 281 school districts from July 2016 to May 2020. The breaches affected thousands of students and parents, exposing sensitive data such as special education records, test scores, phone numbers, and Social Security numbers. School staff, students, cybercriminals, and vendors were all responsible for breaches, some intentional and some accidental. Citing the risks to students’ physical, emotional, and financial well-being, the GAO recommended that schools review and follow data privacy laws, provide data security trainings, require vendors to configure data systems in line with the Federal Trade Commission’s “Start with Security” guide, and complete an annual Nationwide Cybersecurity Review self-assessment.

Applying this Principle

Key phases for this principle, with example applications:

Context-setting

Review federal, state, local, or Tribal data privacy laws and policies that apply. Determine whether you need memoranda of understanding, data-sharing agreements, or consent to collect or share data.

Planning

Develop a list of data elements to collect and any linked data sets, as well as how you will store data, who will have access to data, how you will use data and for how long, and what you will do with the data after analysis is complete.
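
One lightweight way to make such a plan concrete and reviewable is to record it in a structured form before collection begins. The sketch below is a hypothetical Python illustration; every element, role, and retention rule shown is an assumption to be replaced with your own plan.

```python
from dataclasses import dataclass

# A hypothetical structure for documenting each data element's purpose,
# storage, access, linkages, and retention before collection begins.
@dataclass(frozen=True)
class DataElementPlan:
    element: str
    purpose: str
    storage: str
    authorized_roles: tuple[str, ...]
    retention: str                        # what happens after analysis is complete
    linked_datasets: tuple[str, ...] = ()

plan = [
    DataElementPlan(
        element="course_grades",
        purpose="Evaluate advising program outcomes",
        storage="Encrypted institutional database",
        authorized_roles=("analyst", "registrar"),
        retention="Delete 1 year after final report",
        linked_datasets=("enrollment",),
    ),
]
```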

Collection

Communicate data privacy and security processes when collecting data. Seek informed consent even if not required. Only collect data that are necessary and have been approved.

Access

Store data in a secure location that is only accessible to authorized users. Ensure storage systems have the proper protections (such as locks, encryption, and passwords). If you share data, ensure they are transmitted through secure methods. Train those with access to data on relevant laws and best practices. Practice data minimization: only give users access to the minimum necessary data elements and data sets. Ensure individuals who provide data can access, update, and delete their data upon request. Upon project completion, discard or return data as directed or previously established by the individuals who provided them.
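
Data minimization can also be enforced directly in code. The Python sketch below illustrates one simple approach, filtering each record down to the fields a given role is authorized to see; the roles and fields are illustrative assumptions, not a standard.

```python
# A minimal sketch of data minimization: each role sees only the fields it
# needs. The roles and fields are illustrative assumptions.
ALLOWED_FIELDS = {
    "advisor": {"student_id", "gpa", "credits_earned"},
    "researcher": {"gpa", "credits_earned"},  # no direct identifiers
}

def minimized_view(record: dict, role: str) -> dict:
    """Return only the fields this role is authorized to see."""
    allowed = ALLOWED_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"student_id": "s1", "name": "A.", "gpa": 3.4, "credits_earned": 60}
print(minimized_view(record, "researcher"))  # {'gpa': 3.4, 'credits_earned': 60}
```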

Analysis and reporting

Maintain confidentiality of participants in reporting. Do not name individuals without permission, share a combination of data points that could lead to an individual being identified, or report data on very small sample sizes that could risk identification. Delete data when no longer in use for the intended purposes.
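
The small-sample-size caution above is often implemented as small-cell suppression. The Python sketch below withholds any reported count beneath a minimum group size; the threshold of 10 is a common convention, not a universal rule, so follow the policies that govern your data. Note that suppressing a single cell is not always enough: if row or column totals are also published, complementary cells may need to be suppressed so the hidden value cannot be recovered by subtraction.

```python
# A minimal sketch of small-cell suppression: withhold any reported count
# below a minimum group size. The threshold of 10 is a common convention,
# not a universal rule; check the policies that apply to your data.
MIN_CELL_SIZE = 10

def suppress_small_cells(counts: dict[str, int]) -> dict[str, object]:
    """Replace counts below the threshold so small groups cannot be identified."""
    return {group: (n if n >= MIN_CELL_SIZE else "suppressed")
            for group, n in counts.items()}

print(suppress_small_cells({"Group A": 214, "Group B": 4}))
# {'Group A': 214, 'Group B': 'suppressed'}
```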

Reflection Questions

  • Beyond federal data privacy laws such as FERPA, which state, local, or Tribal data privacy laws or policies apply to you?
  • What procedures have you established to enable individuals to access, update, or delete their data, if requested?
  • If many people opt out of data collection, why have they done so? How can you use their feedback to inform and redesign data collection efforts to minimize conflict and harm?
  • What will you do with the data after analysis and reporting? Can you share the data back with communities? How can the individuals who provided their data inform your decision?

 Be On The Lookout

When sharing data, both parties must formally consent to sharing data (such as through a memorandum of understanding or data-sharing agreement), transmit the data securely, and clearly track the data lineage—where the data came from and where they are going. Data should always be deidentified to protect individuals’ privacy. Parties must never share data with third parties (whether businesses, researchers, law enforcement, or other government agencies) or use the data for other purposes without permission.
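
Deidentification can take many forms; the Python sketch below illustrates one common first step, replacing each direct identifier with a keyed pseudonym so shared records remain linkable without exposing who they describe. The key and field names are illustrative, and keyed hashing alone does not eliminate reidentification risk: quasi-identifiers such as birth date and zip code may still single people out.

```python
import hmac
import hashlib

# A minimal sketch of one deidentification step before sharing: replace each
# direct identifier with a keyed pseudonym and drop the original. The secret
# key must stay with the data owner; hashing without a secret key is NOT
# sufficient, because identifiers can be guessed and re-hashed.
SECRET_KEY = b"keep-this-key-out-of-shared-data"  # illustrative only

def pseudonymize(student_id: str) -> str:
    """Derive a stable pseudonym so linked records can still be joined."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"student_id": "s1", "name": "A.", "test_score": 88}
shared = {"pid": pseudonymize(record["student_id"]), "test_score": record["test_score"]}
print(shared)  # direct identifiers removed; records remain linkable via 'pid'
```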

Additional Resources

  • Roadmap to Safeguarding Student Data. This Data Quality Campaign implementation road map for state education agencies overviews relevant data privacy laws and best practices for transparency, governance, and data protection procedures.
  • A Path to Social Licence: Guidelines for Trusted Data Use. Data Futures Partnership offers eight guidelines for data use related to data value, protection, and choice. Although some of the guidelines are specific to New Zealand and its Tribal communities, many are universally applicable.
  • A Toolkit for Centering Racial Equity Through Data Integration. The chapters on “Racial Equity in Data Collection” and “Racial Equity in Data Access” by Actionable Intelligence for Social Policy address positive and problematic policies related to data privacy and cite brief case studies.
  • Envisioning a New Future: Building Trust for Data Use. This resource, developed by the Urban Institute for the Data Funders Collaborative, describes approaches to building trust for collection and use of data, such as ways to expand and control data access and improve systems for consent and transparency. It includes a list of additional resources for data use and integration.
  • Digital Promise: Artificial Intelligence in Education. Digital Promise’s AI in Education initiative offers recommendations for a human-centered approach to AI use, including discussions of AI and digital equity and AI and safety, along with additional resources.
  • Massive Data Institute Privacy-Enhancing Technologies. Georgetown University’s Massive Data Institute offers trainings, resources, and research for implementing privacy-enhancing technologies, or PETs, when working with education data.

References

The framework's recommendations are based on syntheses of existing research. Please see the framework report for a list of works cited.