Taming "Big Data"

Using Data Analytics for Student Success and Institutional Intelligence

By Stephen G. Pelletier    //    Volume 23,  Number 7   //    Special Issue 2015

Deliberately provocative and a tad spooky, a recent headline in The New York Times read, “Blowing Off Class? We Know.” The author of that piece, op-ed contributor Goldie Blumenstyk, a senior writer at the Chronicle of Higher Education, observed that “the stuff some colleges know right now about their students, thanks to data-mining of their digital footprints, boggles the mind.”

Blumenstyk has a point. Using technology to map student activities and practices online, in classrooms, and even at extracurricular events, some colleges and universities today know a lot more about students than just whether they skipped class. Technology makes it possible to collect and analyze rafts of details about student performance and behavior. And while the ability to collect that information raises profound questions about individual privacy and data security, it also creates a rich seam of data that colleges and universities are starting to mine productively to help students map individual paths to academic success.

At the same time, technology also makes it possible for colleges and universities to collect reams of data about their own fiscal and operational performance—data that can be massaged and scrutinized to reap rich insights that can drive efficiencies and improve performance on the administrative side.

In sum, institutions are learning to extract the truths that are hidden in what’s often called “big data”—insights that experts say can drive better administrative decisions and operational performance and help students succeed academically and complete their degrees. In an era when most colleges and universities need every advantage they can muster, leveraging data may give an institution and its students a competitive edge. In that pursuit, however, a few cautions and guidelines are in order.

Defining Terms

There is considerable fuzziness around the terms people in higher education use to describe their work with data. “Data analytics” has been described as the discovery and communication of meaningful patterns in data, using various techniques and tools to quantify performance and ultimately to describe, predict, and improve it. In higher education, “analytics” can refer to a wide array of functions on both the academic and administrative sides, from improving student performance and instructor effectiveness to managing student enrollment and budgeting strategically. Efforts by colleges and universities to use data more systematically to improve institutional decision making and operational efficiency can be viewed as a version of what the private sector calls “business intelligence.” But many educators eschew that term, perhaps because it has too corporate a feel, opting instead for “institutional intelligence,” “academic intelligence,” or “operational intelligence.”

Finer nuances are often also drawn. “I think it’s important to make the distinction clear between what has been referred to as either academic analytics or institutional analytics versus learning analytics,” says Phillip D. Long, associate vice provost for learning sciences and deputy director of the Center for Teaching and Learning/Continuing and Innovative Education at the University of Texas at Austin. “In the case of institutional analytics, or what EDUCAUSE CEO Diana Oblinger calls academic analytics [emphases added], that’s an interest in trying to move the needle on some higher-order institutional metrics like attrition rates or year-to-year completion rates. These are probably the sorts of things I suspect most trustees would be thinking of.”

On the other hand, Long says, “learning analytics is focused at the level of the individual learner and on giving learners actionable information to make their decisions about study within a given course or set of courses.” The goal, he says, is to provide information that will help learners make optimal choices about “how best to apply their intellectual efforts and make the most effective progress toward their academic goals.”

Whether the goal is to use big data to improve academic or administrative performance—or both—IT experts say colleges and universities need to move from merely collecting data that report past accomplishments to more sophisticated analysis that connects the dots in ways that suggest future action. Crediting the construct to the consulting firm Gartner, Inc., higher education consultants Donald M. Norris and Linda L. Baer, authors of Building Organizational Capacity for Analytics (EDUCAUSE 2013), describe a progression in analytics that moves from hindsight to insight to foresight. The ladder starts with descriptive analytics, which describe what happened, and then progresses to diagnostic analytics, which tell why it happened. The next level, predictive analytics, suggests what will happen and offers insight into how we can make it happen.
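The hindsight-to-foresight ladder Norris and Baer describe can be sketched with a toy example. All the data, feature names, and thresholds below are invented for illustration; real institutional models would be far richer:

```python
# Toy illustration of the descriptive -> diagnostic -> predictive ladder.
# Records and numbers are invented, not drawn from any real institution.

records = [
    {"attendance": 0.95, "logins": 40, "passed": True},
    {"attendance": 0.90, "logins": 35, "passed": True},
    {"attendance": 0.55, "logins": 10, "passed": False},
    {"attendance": 0.65, "logins": 12, "passed": False},
    {"attendance": 0.85, "logins": 30, "passed": True},
]

# Descriptive: what happened? (last term's pass rate)
pass_rate = sum(r["passed"] for r in records) / len(records)

# Diagnostic: why did it happen? (compare attendance of passing vs. failing students)
def avg_attendance(rows):
    return sum(r["attendance"] for r in rows) / len(rows)

passed = [r for r in records if r["passed"]]
failed = [r for r in records if not r["passed"]]
gap = avg_attendance(passed) - avg_attendance(failed)

# Predictive: what will happen? (flag a new student using the observed pattern)
def at_risk(student, threshold=0.70):
    """Flag students whose attendance falls below the historical cutoff."""
    return student["attendance"] < threshold

print(f"pass rate: {pass_rate:.0%}")                # descriptive: 60%
print(f"attendance gap: {gap:.2f}")                 # diagnostic: 0.30
print(at_risk({"attendance": 0.50, "logins": 8}))   # predictive: True
```

The point of the ladder is that each rung reuses the one below it: the predictive flag is only as good as the diagnostic pattern it encodes.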

Important Privacy Questions

From Target to Home Depot to JP Morgan Chase, notable recent breaches of databases underscore that colleges and universities have to worry about keeping data secure. For many institutions, a data warehouse is a crucial tool for making institutional data available for analysis in ways that protect the privacy of individual records.

Typically, a warehouse might collect data from a number of sources of institutional data. The way the warehouse is designed, as well as the policies that govern who has access to which data inside the warehouse and how those data can be used, can help keep personal information secure while also allowing analysis that can assist the institution and its students. For example, looking at numerous records in the aggregate, an institution might see patterns in course enrollment that lead it to make more-strategic decisions about program offerings—without compromising individual privacy.
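That kind of aggregate-only analysis can be sketched with an in-memory database. The table and column names here are hypothetical, and the small-cell suppression rule (hiding groups with fewer than three students) is one common, simple disclosure safeguard, not any particular institution's policy:

```python
# Minimal sketch: the analyst sees course-level counts, never individual rows.
# Schema and suppression threshold are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE enrollments (student_id TEXT, course TEXT, term TEXT)")
conn.executemany(
    "INSERT INTO enrollments VALUES (?, ?, ?)",
    [("s1", "BIO101", "F15"), ("s2", "BIO101", "F15"),
     ("s3", "BIO101", "F15"), ("s4", "HIS210", "F15")],
)

# Aggregate view: enrollment counts per course, suppressing small cells
# so that no individual student can be singled out from the result.
rows = conn.execute(
    """SELECT course, COUNT(*) AS n
       FROM enrollments
       GROUP BY course
       HAVING COUNT(*) >= 3"""
).fetchall()
print(rows)  # only BIO101 is large enough to report: [('BIO101', 3)]
```

Governance then lives in who may run which queries against which views, not in the SQL itself.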

As part of its successful effort to use data to improve student success, for example, Marist College gathers volumes of student information—from statistics culled from admissions applications to records of the digital tracks that a student leaves while using computers, mobile devices, library readings, and other digital channels throughout his or her course of learning—into what the college calls its “learning record store.” Marist analyzes the data to discern patterns of success or failure that can help inform decisions about when the college might need to intervene to help students succeed.

To protect personal information, data imported into the store are “anonymized by a random algorithm that takes the student ID and any identifiable information, gets rid of it, and applies another ID number so that all those records can be kept together,” says William Thirsk, vice president and chief information officer at the college. “Data stays in there and is secured.” If the data show that a particular student needs academic intervention, three different authorized users of the store, including the professor in the class where the student needs help, must each provide digital security keys before that student’s record can be unlocked. That way, Thirsk says, no one person alone can identify a student. Just as important, he adds, is to provide a channel for individuals to opt out of having their data in the mix of information that is being analyzed.
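The pseudonymization step Thirsk describes — stripping identifying fields and substituting a random surrogate ID so that a student's records stay linked without revealing who they belong to — might look roughly like this. The field names and the mapping structure are hypothetical, not Marist's actual design, and in practice the ID map itself would sit behind the kind of multi-person access control Thirsk describes:

```python
# Rough sketch of pseudonymization: drop identifying fields, attach a
# random surrogate ID. Field names are illustrative assumptions.
import secrets

_id_map = {}  # real ID -> surrogate; would be held under separate, strict access control

def pseudonymize(record):
    """Return a copy with identifying fields removed and a surrogate ID added."""
    real_id = record["student_id"]
    if real_id not in _id_map:
        _id_map[real_id] = secrets.token_hex(8)  # random, unguessable surrogate
    safe = {k: v for k, v in record.items()
            if k not in ("student_id", "name", "email")}
    safe["surrogate_id"] = _id_map[real_id]
    return safe

r1 = pseudonymize({"student_id": "123", "name": "A. Student",
                   "email": "a@example.edu", "quiz_score": 88})
r2 = pseudonymize({"student_id": "123", "name": "A. Student",
                   "email": "a@example.edu", "logins": 14})
print(r1["surrogate_id"] == r2["surrogate_id"])  # records stay linked: True
print("name" in r1)                              # identifiers removed: False
```

Re-identifying a student then requires access to the ID map, which is exactly the step Marist gates behind three separate digital keys.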

To help it decide parameters for data privacy and access, Marist initially turned to its Institutional Review Board. “When you’re starting this kind of big data project, you have to remember that every click that you record is a click of human behavior, and that brings with it great responsibility,” Thirsk says. With that in mind, he adds, “the first thing that we did when we started collecting data—and I would recommend that everyone do this, because this is human-subject research—was to go to our Institutional Review Board and definitively define what we were going to do with that data.”

With IRB approval in hand, Marist was able to develop an algorithm that can predict student success or failure about two weeks into a semester, with an 80-percent success rate, and then intervene appropriately. While the data revealed tantalizing patterns that weren’t part of the original inquiry—for example, Thirsk says, “we also started to see students who were succeeding greatly and outachieving their peers and we didn’t have a plan for them”—investigators felt it was imperative to get further permissions before delving any further. “The key is to put a project mandate down about what you’re trying to accomplish and stick to it,” Thirsk says. Even if the data show some other interesting trend, he says, “You can’t just go after it. You have to go back to the IRB and get what you are doing approved by policy. Get some rules wrapped around it, and then proceed.”
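Marist's actual algorithm is not public, so the sketch below shows only the general shape of such an early-alert predictor: score early-semester signals against weights learned from past cohorts, flag likely failures, and leave the intervention itself to a human. Every feature name, weight, and cutoff here is invented:

```python
# Shape of an early-alert risk model, two weeks into a term.
# Weights are invented stand-ins for coefficients fit on historical data.
WEIGHTS = {"lms_logins_per_week": -0.15,   # more engagement lowers risk
           "missed_assignments": 0.9,       # missed work raises risk sharply
           "days_since_last_login": 0.2}    # absence raises risk
BIAS = -1.0

def risk_score(features):
    """Higher score = higher predicted risk of not completing the course."""
    return BIAS + sum(WEIGHTS[k] * v for k, v in features.items())

def needs_intervention(features, cutoff=0.0):
    """Flag the student for faculty follow-up; the model never acts alone."""
    return risk_score(features) > cutoff

# A student with few logins and missed work is flagged:
print(needs_intervention(
    {"lms_logins_per_week": 2, "missed_assignments": 2,
     "days_since_last_login": 5}
))  # True
```

Note that the 80-percent figure Thirsk cites is an empirical validation result, not a property of any particular model form; a real deployment would be tuned and re-validated each term.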

Many institutions report sometimes-paralyzing campus resistance to the use of data analytics based on fears about institutional compliance with the federal Family Educational Rights and Privacy Act (FERPA) and other regulatory requirements. Such concerns are of course valid and must be weighed. But, suggests Norris, who is president of the management consulting firm Strategic Initiatives, Inc., “some people hide behind FERPA when they don’t want to do certain things.”

Given such concerns, institutional pioneers in data analytics have had to wrestle with establishing policies for privacy. Starting in 2010, for example, Mark Yudof, then head of the University of California System, asked a high-level task force of administrators, faculty, and students to explore the appropriate balance between ensuring individuals’ privacy while allowing institutional access to information. Although relevant policies were in place, they lacked coherence, and many were out of date, says Kent Wada, chief privacy officer and director of strategic IT policy for the University of California, Los Angeles (UCLA). Moreover, he says, some policies seemed to work at cross-purposes.

Like many colleges and universities, UC and its campuses had to weigh many competing interests and concerns about data privacy and access. “The institution has to figure out what it wants to do, but, of course, the institution is made up of many different people with many different viewpoints,” Wada says. “So how you come to an agreement can be really challenging, especially when you have competing goals that are all legitimate.”

It is easy, he says, when “we have some kind of an external mandate—a law or some kind of obligation that is imposed on us that says either we must do something, or we can’t do something.” Yet between those two extremes, Wada says, “is a vast, gray space where we have discretion.” It is in that gray area, he suggests, that institutions can develop nuanced policies and practices that meet needs to preserve privacy but also open access to data for analysis.

One crucial understanding that developed from the UC task force’s work was that “privacy can mean fundamentally different things, depending on what community you are caught talking to,” Wada says. Administrators, he continued, typically approach information privacy in ways that track with laws like FERPA and the Health Insurance Portability and Accountability Act (HIPAA). Their focus is on protecting data about people, such as student records, patient records, credit card data, Social Security numbers, and the like, as directed by government edicts.

But other campus audiences—perhaps especially students—might tend to worry about what Wada and colleagues termed “autonomy privacy,” or protections from Big Brother-style surveillance and monitoring of behavior. “The various techniques that we use to protect data about people can be very invasive to autonomy privacy,” he says, because there are tools screening “what people are doing. Who is accessing databases? What are they doing with data? How many times do they access it?”

Says Wada, “There is no clear line” about what is appropriate in some circumstances. “There’s only the line that you draw for yourself. And different institutions choose to draw that line very differently.”

First Things First

While many campus leaders may get excited about the promise of data—whether they are mining information to improve student success or to make strategic business decisions—experts say it is necessary to ensure that certain protocols are in place before data analysis commences. At Hawaii Pacific University (HPU), for example, administrators and the board of trustees have clamored for good data to help inform a strategic-planning process. Although eager to help them get the right information, Sharon Blanton, HPU’s vice president of information technology services, nonetheless had to push back a bit while she helped HPU organize its data operations.

“Right away, people want to jump at the big questions. They want to start right away with predictive analytics,” Blanton says. But first, she says, “You have to go back and ask, what are our data structures? Where does this data live? How are we accessing and analyzing data today? Do we have the right tools? Do we have the right people in place to be able to work on this?”

For institutions that are starting to engage in data analytics and operational/business intelligence, Blanton believes that board members and presidents need to ask a fundamental question: How do you know you can trust your data? “They don’t have to know anything about business intelligence, but college and university leaders should at least be asking that question,” Blanton says. “And then those responsible at the institution should be able to clearly demonstrate rules around data management and governance that show ‘this is how we know.’”

Blanton, who is helping to drive a business-intelligence initiative at HPU and earlier developed one at Portland State University, says institutions also need to make sure their data is protected and secure, and that the right tools are in place to allow just-in-time access to data in ways that can inform decision making. Another imperative, she says, is that institutions “need to invest in actual data scientists” to manage their data analytics.

Blanton also argues that the ability of a college or university to use its data well hinges on a deep understanding of how it conducts its business. “You can’t understand the data if you don’t understand the business processes,” she says. “It’s about building an infrastructure, both human and technical, to have the right tools, providing the right data, at the right time,” Blanton says.

“A lot of times people undertake business-intelligence projects without having strategic questions in mind, and I think that is a mistake,” adds Param Bedi, vice president for library and information technology at Bucknell University. “My advice for a board would be to look at what the strategic questions are that they want to answer at their institution. That’s going to help frame some of the big business-intelligence discussions on the campus.”

Bedi also says that, from an operational perspective, it is vital that campus offices share data broadly—while of course keeping privacy and security protocols in place. That often requires a change in traditional campus practices, in which data is often closely held in siloes “owned” by departments and offices. “Stop looking at data from a student perspective or a finance perspective or a development perspective; look at it from an institutional perspective,” Bedi says. “This is a big cultural shift. We have said that this is not finance data or registrar data, but rather this is Bucknell data.”

Long of UT Austin suggests that a balance between two different mindsets is needed on campus—one for those charged with protecting data and one for those responsible for manipulating it. Both roles are critical, but unless a leader on the campus is balancing those respective responsibilities, they can work at cross-purposes. “Institutions need to decide what their data architecture is going to be,” Long says. Then, to make data analytics work at a functional level, “they need to get people in place who view their responsibilities as enabling the data to be used, as opposed to being protected,” he says. “Then they need to let them alone so they can do their job without political interference, without the issues around data ownership that often plague good data decisions.”

Yet another piece of advice is that institutions do not need to go it alone when it comes to developing a sophisticated data-analytics capacity. “I think there’s a lot of opportunity to share knowledge from the institution with another university, or to join forces with a private firm, as we have done,” says Henry Childers, executive director for University Analytics & Institutional Research at the University of Arizona (UA). UA partnered with Civitas Learning, a private software company, to use predictive analytics to better understand students’ journeys through the institution. (UA is also one of the drivers behind the University Innovation Alliance, formed in September 2014 as a model for inter-institutional collaboration on critical issues like serving low-income students; see the May/June issue of Trusteeship for more on the Alliance.)

Similarly, Phyllis A. Wykoff, director of the business-intelligence center at Miami University in Ohio, says her institution has gained a lot of knowledge about data analytics by sharing notes and experiences with other universities through the Higher Education Data Warehousing Forum (HEDW), which started as an informal meeting of campus-based IT and data staff in 2003 and has since grown into a full-fledged organization whose participants share ideas freely.

The Board’s Responsibility

Linda Baer, a senior fellow at Civitas Learning who earlier served as a program officer at the Bill & Melinda Gates Foundation and as a high-level administrator at several Minnesota public universities, says that board members ought to think about data at a strategic level, as opposed to an operational level. “Trustees need to remember what key questions they should be asking and not get way down into the weeds about things,” she says. “Once data are reported to them,” she says, they should focus on “what actions should the institution be taking given the data?”

Further, it is incumbent on colleges and universities to help their board members develop that understanding. Says Baer, “Make sure that there is enough time for board members to understand the story behind the numbers. Make sure there’s enough time for them to talk with faculty and leadership with that story as the focus, rather than explaining the formulae that led to the numbers.”

Baer suggests that the effective use of data analytics pivots on whether a college or university has developed “a culture of inquiry” that is open to mining the full benefits of what data reveals. The right environment, she says, is one that uses numbers and data for continuous improvement. Nurturing that culture, she says, means “training faculty and staff members to understand the insights they can derive from data.”

Ora Fish, executive director of the program-services office at New York University, is responsible for the university’s data warehouse and business intelligence. Like Baer, she also sees data analytics as a lever for cultural change on the campus. Institutional offices that have ready access to data and the right analytics can shift from being merely operations-focused, she says, to asking “what if” questions that can lead to new practices that can help drive transformational change.

At one level, Fish suggests, data can help answer specific operational questions: “Are we offering classes for which there is a demand? How well are we managing tuition revenue? How can we optimize space utilization?” At an even more practical level, she says, data about operations like help desks and completing purchase orders can help improve student service and campus efficiency.

In terms of learning analytics, Fish says, colleges and universities are starting to be able to collect better data about metrics like student success and degree completion—and act accordingly. But she is quick to add that “we’re all just starting to think” about using those data to address higher-order questions. “At the end of the day, how do you know, really, whether a student learned what they were supposed to learn?” she asks. “How do you measure that? What criteria do you use? How can you intervene while the student is still in the process of learning? That is an exciting area.”

Says Fish, “I think business intelligence today is a necessity [in higher education]. But it is important to establish that overall it is a strategic program. It has to be monitored. You have to establish accountability for its success. Somebody has to be responsible for developing a strategy, defining what success means, and leading implementation.”

Moving Forward

Ultimately, it is important for boards to understand data analytics, advocates say, because in an era in which competition for students is fierce and the economics of running an institution are challenging, data mining and analysis create new tools that can help institutions better serve students and sharpen their competitive edges by making better strategic decisions. Data analytics may also help colleges and universities meet the increasing calls from legislators, government officials, accreditors, and the public for more transparent information about their operations and student outcomes. And data about student performance may help drive institutions to improve teaching and learning in ways that help students achieve better personal results. But grasping that potential can be challenging.

“Data is growing faster than anybody’s technical or intellectual capability to process it or understand it,” says Marist’s Thirsk. “If you’re just getting into this, you have to start small. You have to test your assumptions very carefully.” Noting that “the math and what we’re seeing in the analytics do not lie,” Thirsk cautions that data analytics requires new mindsets and may challenge old assumptions about how various processes work.

Thirsk offers one other piece of advice: Hire expert help to analyze the numbers. “This is far more important than simple reporting,” he says. “Don’t just hope someone’s going to understand analytics at a deep level. You have to get experts, not amateurs. Data scientists are expensive, but trust me, they’re worth it.”

The use and insights of data analytics will take hold in an institution’s life and operations only when boards and presidents make that a priority. Because institutions follow the strategic directions and targeted goals that board members and presidents articulate, Baer says, “if people get the feeling that this isn’t one of the more important things” among all institutional pursuits, “then it’s going to take a backseat and it won’t happen. It takes leadership to make things like this really fly.”

Norris, of Strategic Initiatives, adds that the critical questions are not about data per se. Board members and presidents need to ask, he says, “How do our students succeed? Are we remaining competitive? Do we offer a compelling value proposition? How does our value proposition stack up against other institutions’?” Says Norris, “To get at those questions, you have to get down into the heart of what the institution is doing.”

Advocates believe data analytics can help board members, presidents, and their institutions reach that vital core.

The Early Stages of “Data Mining”

Most of higher education is in the early stages of data mining and analytics. But a few examples show how a handful of pioneering institutions are using data to help improve students’ success:

Researchers at Purdue University developed Course Signals, a software program that combines student data in ways that enable instructors to monitor student performance and predict students’ success in a course—and, when necessary, step in to help students improve their performance. Early intervention helps retain students in courses and can boost their performance by as much as a full letter grade, researchers there have found.

As provost and vice president for academic and student affairs at Austin Peay State University, Tristan Denley (now at the Tennessee Board of Regents) spearheaded development of Degree Compass, a tool that compares a learner’s transcript and other data to those of thousands of other students to help individual students select courses and choose a major best suited to their talents. The education technology company Desire2Learn acquired Degree Compass in 2013.

Georgia State University has developed a robust capacity to analyze data about past student performance in courses and programs in ways that help the university recognize when current students might be in academic trouble. Its system triggers interventions to help the students get back on track. The system helps not just failing students but also those who are pulling Cs in “gateway” courses for programs in which ultimate success typically hinges on better performance.

Marist College developed an “academic early alert system” that mines past student data to predict within two weeks of the start of a course whether a student will complete that course successfully. The system, tested successfully not just at Marist but also at several other colleges, triggers alerts for faculty members, who then intervene to help students succeed.

Similarly, we are also starting to see colleges and universities deploy more sophisticated data analysis to help guide operational/business decisions and improve institutional operations:

Bucknell University has established an award-winning institutional intelligence project that collates data formerly walled off in, for example, the offices of the registrar, admissions, and financial aid to shape strategic decision making campus-wide. First focused on priorities such as student enrollment counts and retention, Bucknell intends to expand data analysis to bolster its effectiveness in such areas as advancement, finance, and human resources.

New York University has made significant progress in tapping data to drive decision making in institutional operations and conduct cost/benefit analysis of academic programs. Next steps include embedding analytics in operational processes.

Miami University of Ohio is developing a business intelligence initiative that supports its recent adoption of responsibility-centered management (RCM). Focused largely on budgeting questions, the work includes models that help the university better understand sources of revenue by division and department, as well as trends in student enrollment and retention.
