Using Maturity Models for Data Validation in Higher Education



There are multiple ways of assessing where your institution stands on data validation and data governance. In this discussion, we’ll look at data validation as it relates to maturity models. A maturity model is an assessment of the data governance program against a set of measures. Such measurement enables you to determine what stage the organization is in and how you are progressing over time. By taking this “data validation challenge,” you are laying the groundwork and designing a roadmap for a data governance program at your institution.

A Note About Maturity Models

There are many maturity models that can apply to data governance. A model consists of various stages of development or maturity. It’s important to realize that the model should be customized around the unique goals, priorities, and competencies of your organization.

The Five Areas  

When it comes to maturity models, there are five areas of focus: Formalization, Awareness, Metadata, Data Quality, and Stewardship. For each area, we’ll provide a definition and a list of questions you can use to assess your institution’s data governance program. The measures are subjective: there is no ranking or assignment of values to responses, which makes them well suited to a self-assessment. The responses can be codified for future use, if need be.
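As a rough sketch of what codifying the responses might look like, yes/no answers could be rolled up into a per-area score. The five area names come from this article; the scoring scheme, question counts, and function names below are illustrative assumptions, not part of any formal maturity model.

```python
# Hypothetical sketch: codify yes/no self-assessment responses into per-area scores.
# The five areas come from the article; the scoring scheme is illustrative only.

AREAS = ["Formalization", "Awareness", "Metadata", "Data Quality", "Stewardship"]

def score_area(responses):
    """Return the fraction of 'yes' answers for one area's questions."""
    if not responses:
        return 0.0
    return sum(1 for r in responses if r) / len(responses)

def assess(program):
    """program maps each area name to a list of True/False responses."""
    return {area: round(score_area(program.get(area, [])), 2) for area in AREAS}

if __name__ == "__main__":
    sample = {
        "Formalization": [True, True, False, False],
        "Awareness": [True, False, False],
        "Metadata": [True, True, True, False],
        "Data Quality": [False, False, True, True],
        "Stewardship": [True, True, False, True],
    }
    print(assess(sample))  # prints each area's share of 'yes' answers
```

Even a crude roll-up like this makes it possible to compare assessments over time, which is the point of measuring in the first place.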


Formalization is the extent to which the roles and activities are structured and documented. This is a measure of how formal the data governance program is at the institution. Questions to ask:

  • Are there defined roles/responsibilities within the organization to support data governance?
  • Are there any policies or, more importantly, documents detailing data validation and governance practices?
  • Do data governance meetings occur? Are they documented?
  • Are the policies applied to all systems and data?

Awareness is the extent to which an organization’s members have knowledge of the roles, rules and techniques associated with the data governance program. It is a measure of how much is known about the purpose of data governance and validation. Questions to ask:

  • What do the executives know about the data governance program? What do the knowledge workers know?
  • Do executives and others understand their roles? Are they promoting data governance and validation?
  • Are data governance capabilities (meetings, programs, data collections, data definitions, etc.) being used within the organization?

Metadata describes the data (databases, tables, applications, etc.). It also includes the definitions of specific data elements related to the business processes, which facilitate an understanding of the usage of the data. The definitions are sometimes the hardest part of the program. Questions to ask:

  • Is metadata collected for all systems? Is it defined for all systems (including the critical data elements)?
  • Are the definitions available to all system users?
  • Are the definitions handled via data governance policies and procedures?
  • Have all departments agreed to and approved the data definitions?
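The metadata questions above hinge on having agreed definitions recorded somewhere. As a minimal sketch (the field names and example values are assumptions, not anything the article prescribes), a data-dictionary entry for a critical data element might look like:

```python
# Hypothetical sketch of a minimal data-dictionary entry for one critical data
# element. Field names and example values are illustrative assumptions.
from dataclasses import dataclass, asdict

@dataclass
class DataElement:
    name: str            # column or field name in the source system
    system: str          # source system or database
    definition: str      # agreed business definition
    steward: str         # accountable data owner/steward
    approved_by: list    # departments that signed off on the definition

entry = DataElement(
    name="student_enrollment_status",
    system="SIS",
    definition="Current enrollment state of a student for the active term.",
    steward="Registrar's Office",
    approved_by=["Registrar", "Institutional Research", "Financial Aid"],
)
print(asdict(entry))
```

Capturing the steward and the approving departments alongside the definition is what lets the "have all departments agreed?" question be answered with a record rather than a recollection.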

Data Quality is the process for defining acceptable levels of validated data to meet the needs of the organization. This process includes data entry as well as importing data from disparate systems. Questions to ask:

  • Is data quality regularly tracked and reported? Or is it an ad hoc effort?
  • Are your data quality efforts more than a one-time data cleaning?
  • Is data quality a set of SQL statements?
  • Are you applying data quality testing to both data at rest and data in flight?
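Several of these questions (ongoing checks versus a one-time cleaning, data at rest versus data in flight) come down to having reusable validation rules. A minimal sketch in Python, with hypothetical field names and rules, might apply the same rules to stored records and to records arriving from an import:

```python
# Hypothetical sketch: the same validation rules applied to data at rest (a stored
# table) and data in flight (records arriving from an import). Rules and field
# names are illustrative assumptions.

def valid_record(rec):
    """Apply simple quality rules to one record; return a list of violations."""
    errors = []
    if not rec.get("student_id"):
        errors.append("missing student_id")
    if rec.get("gpa") is not None and not (0.0 <= rec["gpa"] <= 4.0):
        errors.append("gpa out of range")
    return errors

def check_batch(records):
    """Return an exception report: record index -> list of rule violations."""
    return {i: errs for i, rec in enumerate(records) if (errs := valid_record(rec))}

incoming = [
    {"student_id": "A123", "gpa": 3.4},   # clean
    {"student_id": "", "gpa": 2.9},       # missing id
    {"student_id": "B456", "gpa": 4.7},   # gpa out of range
]
print(check_batch(incoming))  # flags records 1 and 2
```

Because the rules live in one place rather than in scattered SQL statements, the same checks can run on a nightly scan of the database and on each incoming feed, and the output doubles as the exception report mentioned under Stewardship below.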

Stewardship addresses accountability. It’s a measure of how well the data governance program has been adopted by the data owners. Questions to ask:

  • Do all data owners participate in the data governance process?
  • Are data owners performing their assigned roles?
  • Is compliance with the documented policies enforced? Are policies around key data elements being enforced?
  • Is data quality reported, and are there data exception reports?

Why Measure?

It is certainly helpful to measure your data governance program, but keep in mind that the driving business need for the program is not just process definition and automation. It is the creation of a system that produces better business decisions. As more and more data is collected and available for reporting and analysis, the process of understanding and controlling that data becomes more important.

Time and Target

While the goal is to achieve high measures or positive responses in all areas, know that reaching such goals may take several years of effort. Many organizations – most, in fact – do not reach high measures in all areas, yet they have successful data governance programs. How? By deciding what’s necessary and important to the organization and then implementing only that.

For example, there are organizations with robust data governance programs that lack comprehensive data quality measures. Another organization may have good data definitions and good data quality procedures, but no specific data governance policies in place.

Data Economy

It is also important to note that the data economy is driving changes in how data is collected, processed and analyzed. These changes include the growth of cloud sources, business intelligence self-service tools, extended enterprises, personal information management, and open data. Data governance must evolve and adapt alongside these innovations affecting business decisions.

Next Steps

Once you’ve taken the data validation challenge, it’s time to think about what comes next. Keeping in mind the results of your assessment, you should focus on deriving data governance objectives from your business objectives. Then, use those objectives to drive governance structures, processes, and criteria. While the procedures and automation of data governance are important, they take a back seat to your institution’s need for a system that helps facilitate better business decisions.

Peter Wilbur is a Strategic Solutions Manager on the Client Experience Team at Evisions. Peter graduated from Northern Arizona University with a computer science degree in 1984. After working in several industries and with numerous companies, he joined Evisions in 2010 working on the support desk before moving to Professional Services, where he eventually came to serve as Professional Services Manager. Peter is a member of the Project Management Institute and a PMP. He enjoys spending time with his German Shepherds.
