Safe Data Journeys: Governance and Ethics

December 3, 2025

Most campuses have a growing list of data projects in motion. New dashboards launch, vendors pitch AI tools, and someone eventually asks, “Can we do this with student data?” or “Is this even allowed?” When the room is not sure, momentum stalls or, worse, people move ahead without a clear guardrail. The gap is straightforward: if you’ve been using previous installments of our Field Guide series, you’ve learned how to ask better data questions and design better charts, but fewer people have had the time and structure to talk through what safe and responsible use actually looks like in practice.

Part 5 is designed to help you find that safe path. You will learn how to protect student and employee privacy, how to think about access and sharing, and how to spot common risk points before they become a problem. Instead of memorizing every rule, the goal is to give you a few repeatable habits you can carry into any project: pause to ask who is in the data, who will see it, and what could go wrong.

Our approach here is specific to Evisions, but it is grounded in wider work you may already know. The U.S. Department of Education’s Student Privacy Policy Office describes FERPA as the federal law that protects the privacy of student education records across both K-12 and higher education. In Europe and for some international students, the GDPR defines personal data broadly as any information that relates to an identified or identifiable person. EDUCAUSE has also highlighted data governance and data classification as foundations for safe handling, especially when multiple regulations overlap. Finally, research in learning analytics emphasizes the importance of privacy, transparency, and a commitment to student benefit rather than surveillance.

If earlier Field Guide parts helped you define shared language, clarify questions, and tell better data stories, Part 5 adds the rails that keep that work safe. The goal is not to slow you down. The goal is to give your institution confidence that your use of data for enrollment, student success, and planning is grounded in care, clarity, and respect for your community. 

What you will learn 

Over the course of this post, we’ll cover how to: 

  • Build a simple mental model of FERPA and personal data, so you know when you are working with protected information.  
  • Apply a practical approach to privacy and small-n suppression that works across dashboards.  
  • Explore access control options, including role-based access and data classification tiers.  
  • Understand the basics of consent and anonymization, especially for learning analytics and vendor tools.  

Privacy Basics and Small-n Suppression 

You do not need to be a lawyer to work responsibly with student data, but you do need a basic sense of the playing field. FERPA is the main U.S. student privacy law. It gives eligible students rights to inspect their education records and restricts disclosure of those records without consent, with some exceptions. Education records can include almost any record that is directly related to a student and maintained by the institution. That means your retention dashboards, advising notes, and LMS activity reports are often in scope. 

Outside the U.S., or when you work with international students, you may also need to think in GDPR terms. Under GDPR, personal data means any information that relates to an identified or identifiable person, including student IDs, email addresses, and even some combinations of demographics. The exact rules differ, but the shared principle is simple. If you can reasonably re-identify a person, you should treat the data with care.  

In earlier Field Guide parts, we talked about small-n thresholds as a trust check. In Part 2: Asking the Right Data Question, we used small-n to flag when a slice was too thin to support a confident conclusion. In Part 3: Metrics that Matter, we asked you to write your small-n rule into each KPI spec and footer so people could see when a metric was stable and when it was not. In Part 5, we reuse that same habit with a slightly different lens. The small groups that make a percentage noisy are often the same groups that are easier to re-identify. 

One of the simplest tools you have is small-n suppression. Instead of showing counts or percentages for very small groups, you hide or aggregate those rows. Many institutions use a threshold such as “suppress any cell where n < 5” to reduce the chance that someone can guess who is in a subgroup, especially when multiple filters stack. When you treat that rule as part of governance, not just “good analysis hygiene,” it becomes a shared guardrail instead of a personal preference. 

In practice, that means: 

  • Agree on a suppression threshold and document it in both your KPI specs and your dashboard footers.  
  • Watch out for combinations of filters that can bring n below that threshold, even if the base report looks safe.  
  • Remember that percentages with very small denominators are both risky for interpretation and higher risk for privacy.  
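To make the habit concrete, here is a minimal sketch of small-n suppression applied to an already-aggregated result set. The threshold, field names, and data shape are illustrative assumptions, not an Evisions feature or API; adapt them to whatever rule your governance group agrees on.

```python
SUPPRESSION_THRESHOLD = 5  # e.g., the shared rule "suppress any cell where n < 5"

def suppress_small_cells(rows, n_key="n", threshold=SUPPRESSION_THRESHOLD):
    """Mask counts and derived percentages when n falls below the agreed
    threshold, so small subgroups are hidden rather than shown."""
    result = []
    for row in rows:
        if row[n_key] < threshold:
            # Hide both the count and any percentage; keep the label so
            # readers can see that a row exists but is masked.
            masked = dict(row)
            masked[n_key] = f"<{threshold}"
            masked.pop("pct", None)
            masked["suppressed"] = True
            result.append(masked)
        else:
            result.append({**row, "suppressed": False})
    return result

# Example: a filtered slice where one subgroup drops below the threshold,
# exactly the "stacked filters" situation the bullets above warn about.
slice_rows = [
    {"group": "First-gen, Part-time", "n": 3, "pct": 66.7},
    {"group": "First-gen, Full-time", "n": 42, "pct": 81.0},
]
result = suppress_small_cells(slice_rows)
```

Because the threshold lives in one named constant, documenting it in your KPI specs and dashboard footers stays in sync with what the code actually enforces.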

You do not have to solve every edge case to make progress. Start by making it easy for anyone who builds or consumes reports to see the small-n rule, understand why it is there, and know who to ask when a use case pushes into a grey area. That continuity from Parts 2 and 3 helps people see governance as an extension of the same shared discipline you are already building around questions and metrics.  

Access Control and Data Sharing You Can Explain 

Good governance is not just a committee or a catalog. It is the way you answer a simple question: “Who can see what, and why?” EDUCAUSE describes data governance as the processes and methods institutions use to manage and use data wisely. For most campuses, that shows up as a small set of access tiers and clear role definitions rather than a thick policy manual no one reads.  

A basic pattern looks like this:  

  • Public or open data, such as institutional factbook summaries and marketing statistics.  
  • Internal data, such as aggregate dashboards for faculty and staff that do not allow drilling to individuals.  
  • Restricted data, such as named student records for advisors, financial aid staff, or registrars.  
  • Highly restricted data, such as detailed HR, conduct, or health information.  

Your job in data literacy is not to invent all the rules from scratch. Your job is to help make them visible, so people know when they are crossing from one tier to another. An Access and Privacy Matrix can help. List a few common roles on your campus, list a few common reports, and mark whether each role should see only aggregates, de-identified records, or named records. Once you have a draft, bring it to your governance or IT security group for refinement.  
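One way to make a draft Access and Privacy Matrix concrete is to treat it as a simple lookup table from (role, report) pairs to a view level. The roles, report names, and tiers below are hypothetical examples to refine with your governance or IT security group, not a prescribed scheme.

```python
# Access tiers, from least to most detailed.
AGGREGATE, DEIDENTIFIED, NAMED = "aggregate", "de-identified", "named"

# Draft matrix: a few common roles crossed with a few common reports.
ACCESS_MATRIX = {
    ("advisor", "retention_dashboard"): NAMED,
    ("institutional_research", "retention_dashboard"): DEIDENTIFIED,
    ("dean", "retention_dashboard"): AGGREGATE,
}

def allowed_view(role, report):
    """Return the most detailed view a role may see for a report.
    Default to the safest tier (aggregates only) when no explicit
    entry exists, so gaps in the draft fail closed rather than open."""
    return ACCESS_MATRIX.get((role, report), AGGREGATE)
```

The useful design choice here is the fail-closed default: anyone not explicitly listed sees only aggregates, which keeps an incomplete draft matrix safe while you gather feedback.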

If you want a deeper dive on how to start those conversations, our on-demand webinar “Building Better Data Governance” walks through best practices for launching a governance effort and shows how Evisions tools can support consistent definitions and safer sharing. 

Consent, Anonymization, and Ethical Use 

As more campuses explore learning analytics, early alerts, and AI, questions about consent and anonymization become harder to avoid. A common misconception is that removing names is enough. Under GDPR and similar frameworks, pseudonymization, where you replace identifiers but retain a key that can link back, is still considered personal data. True anonymization means that individuals can no longer be identified, even indirectly, which is very hard to guarantee in rich educational datasets.  
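The distinction is easier to see in a short sketch. Below, identifiers are replaced with random tokens, but a key table is kept so records can be linked back; that key is exactly why pseudonymized data still counts as personal data. The field names and records are fabricated examples.

```python
import secrets

def pseudonymize(records, id_field="student_id"):
    """Replace identifiers with random tokens, returning both the
    pseudonymized records and the key table. As long as the key table
    exists anywhere, the data can be re-linked to individuals, so under
    GDPR it remains personal data rather than anonymous data."""
    key_table = {}
    output = []
    for rec in records:
        # Reuse the same token for repeated appearances of one student.
        token = key_table.setdefault(rec[id_field], secrets.token_hex(8))
        output.append({**rec, id_field: token})
    return output, key_table

records = [{"student_id": "S001", "gpa": 3.4}]
pseudo, key = pseudonymize(records)
```

Deleting the key table moves you toward anonymization, but only if the remaining attributes (program, demographics, timestamps) cannot be combined to single someone out, which is the hard part in rich educational datasets.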

In learning analytics, researchers and practitioners have highlighted three themes that show up again and again: transparency about how data will be used, a focus on student benefit rather than surveillance, and meaningful opportunities for students to ask questions or opt out where possible. You do not need a full ethics board to start living out these principles in your day-to-day work. You can:  

  • Add a short “purpose and use” note when you introduce a new dashboard to advisors or faculty.  
  • Avoid building reports that are purely about monitoring without a clear support action tied to them.  
  • Document when and how you are combining data sources in ways that might surprise students, such as LMS traces plus card swipes. 

Our one-page Safe Data and Ethical Use Checklist for Dashboards and Reports can remind teams to ask who benefits, who could be harmed, and whether the students whose data you are using would be surprised by the use. 

How To Get Started 

Download our Access and Privacy Matrix and Safe Data and Ethical Use Checklist and add them as a discussion point for your next data team meeting.  

Consider starting with these steps: 

  1. Complete three rows on the Access and Privacy Matrix and share the draft with your governance or security group for feedback. 
  2. Pick two dashboards and run the Safe Data and Ethical Use Checklist on them.  
  3. Find at least one collaborator who can help you gain traction at your institution. Watch “Building Better Data Governance” with them and use the webinar as a starting point for a campus conversation about where governance work already lives and where you might need to formalize it. 

You don’t have to do everything at once. Start small and remember that perfect is the enemy of good. Give some of your ideas a try and refine as you go. Building small habits now can offer big returns as the academic year continues.  

If this still feels overwhelming or you’d like a little extra support, don’t hesitate to reach out to your Strategic Solutions Manager. Our Client Experience Team (CET) is always here to partner with you and help bring your reporting and data analytics goals to life. 

Where the Field Guide Goes Next 

This is the fifth post in our six-part series on Data Literacy. Building safe data journeys through governance and ethics is one step on the path to building trust in your data so that leaders feel comfortable using it in everyday decisions. While each installment builds on the last, you can use Part 5 on its own and see immediate results. 

Join us in two weeks for Part 6: From Analysis to Action! 

Allen Taylor
Senior Solutions Ambassador at Evisions

Allen Taylor is a self-proclaimed higher education and data science nerd. He currently serves as a Senior Solutions Ambassador at Evisions and is based out of Pennsylvania. With over 20 years of higher education experience at numerous public, private, small, and large institutions, Allen has successfully led institution-wide initiatives in areas such as student success, enrollment management, advising, and technology, and has presented at national and regional conferences on his experiences. He holds a Bachelor of Science degree in Anthropology from Western Carolina University, a Master of Science degree in College Student Personnel from The University of Tennessee, and is currently pursuing a PhD in Teaching, Learning, and Technology at Lehigh University. When he’s trying to avoid working on his dissertation, you can find him exploring the outdoors, traveling at home and abroad, or in the kitchen trying to coax an even better loaf of bread from the oven.

