Higher Education’s Return on Information Technology

By Eric Denna // Volume 23, Number 7 // Special Issue 2015

Board members are increasingly confronted with difficult choices surrounding investments in information technology. There seems to be an endless stream of proposals for new or upgraded systems for admissions, student information, learning management, financial operations, research administration … the list goes on and on. Each of these proposals promises a variety of returns. Frequently, the costs of these proposals rival those of building-construction projects, running into the tens of millions of dollars.

In 2002, the EDUCAUSE Center for Applied Research published a report by Robert Kvavik, Richard Katz, and others titled “The Promise and Performance of Enterprise Systems for Higher Education.” In it, they reported that a conservative estimate of what had been spent on large administrative systems during the prior 10 years exceeded $5 billion. One can only imagine what has been spent since then—and what is being spent this year or will be spent in the coming years. Unfortunately, Kvavik and Katz reported, many of the projects that contributed to the $5-billion price tag failed, in that they came in over budget, took longer than planned, or did not deliver the expected value. The authors concluded the following in the executive summary of their report:

External forces such as quality of software or consulting were found to be less influential than internal forces. When asked, these institutions revealed that the major obstacles to completion were mostly internal to the institution. They include data issues, cultural resistance to change, and lack of understanding software capabilities. … It’s interesting to discover that the institutions themselves—their cultures, their people, and their historical decisions— are the primary hurdles to clear for a successful implementation, not the technology, the consultants, or the vendors.

Trustees and administrators may be able to take some comfort from the fact that this does not appear to be an outcome specific to higher education. In 2008, IBM published “Making Change Work,” which includes the following results of a study that IBM had conducted of failed IT projects:

  • Only 40 percent of projects met schedule, budget, and quality goals.
  • The best organizations are 10 times more successful than the worst at carrying out IT projects.
  • The biggest barriers to success were listed as “people” factors, including mindsets and attitudes (cited in 58 percent of failed projects), organizational culture (49 percent of projects), and lack of support by senior management (32 percent).
  • Underestimating the complexity of projects was listed as a factor in 35 percent of failed projects.

And a more recent report, published by McKinsey & Co. and Oxford University, included similarly troubling statistics. This 2012 report, “Delivering Large-scale IT Projects on Time, on Budget, and on Value,” drew on a study of 5,400 IT projects “with initial price tags exceeding $15 million” and found that, “On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.” The findings were consistent across industries.

Why do we tolerate such results? With the mounting financial challenges facing every institution of higher education, it is more critical than ever to make sure that investments in information technology result in real, measurable value. As a long-time university administrator of information-technology projects, I know that too often the statements of value in the proposals that university departments submit are vague and hard to measure—either by administrators or by board members responsible for approving major budgetary proposals. Therefore, we need to be clear about what we mean by an improvement when an IT project is proposed. Shouldn’t an improvement mean there are measurable, desirable changes in the quality, timeliness, or cost of what is being done or how it is being done as a result of investing in technology? One of the biggest mistakes we make is accepting loose definitions of “improvement” in an IT investment proposal before approving it for funding.

Given all this, what can trustees do to change what is clearly a longstanding problem of realizing an appropriate return on IT investments in the higher education setting?

The most important recommendation is that boards must make sure the proposed value of a large-scale IT investment is clear and measurable. Within virtually any organization investing in information technology, the potential value falls into one or more of five types, and for each type we can pose specific questions about the potential return on our investments. (A sketch of how these questions might be turned into a simple review checklist follows the list below.)

  • Will the investment improve the value we provide to those we serve? For higher education, IT investments should improve services that directly help students or faculty learn. For example, investments in technology-enabled classrooms, learning-management systems, learning analytics, and the like, might be argued to enable improvements in quality, timeliness, or the costs of student learning. An investment in a new high-performance computer, visualization lab, or digital lab instrument might be argued to improve the quality, timeliness, or costs of faculty learning (research). Whatever the specific investment, its purpose should be clearly related to improving the value to important campus constituencies such as students and faculty.
  • Will the investment improve how the institution’s work is done? These investments typically should focus on improving the speed, quality, or cost of an institution’s administrative processes. Will processes for recruiting and enrolling students result in better students or take less time or cost less per student? Will the institution be able to pay its vendors more quickly or be able to reduce the cost per payment? Will the process of applying for and tracking grants reduce the time spent by researchers on administrative tasks or reduce the costs or time required to submit a grant?
  • Will the investment improve how decisions are made? Certain IT investments in the past have produced more timely reports to decision makers in an effort to enhance the speed, quality, or cost of decision making for the institution. Certain new technologies are touted as providing decision makers with “decision-compelling” information, not just summaries of operational data. Instead of simply providing more summaries of transactions, investments in IT should provide decision makers with analytics that offer insights about what factors should be considered when making a decision. At any university, several key decisions fundamentally shape its success. For example, which students should the institution recruit and/or admit? Which faculty should it recruit and/or promote? Which capital projects are the most important to the future of the institution? Each of these questions, and many more, requires useful information to enhance the likelihood of making the best decisions. Any investment in decision support should result in improvements in the quality, timeliness, or cost of either decision making generally or of specific critical decisions.
  • Will the investment improve how various groups in the institution communicate and collaborate? Increasingly, students, faculty, and staff need to communicate and collaborate with a wide variety of people, not only within their institution but also in many other organizations. Forms of technology services aiding communication and collaboration range from basic telephone service to email and shared calendaring to document sharing, instant messaging, video conferencing, interactive whiteboards, voicemail, and more. Again, these types of investments should improve the quality, timeliness, or cost of communication and collaboration.
  • Will the investment improve how we manage risk (legal, operational, technical, reputational)? The focus here is on investments that concentrate on mitigating, if not eliminating, certain risks. Many risk-management challenges, including identifying, monitoring, and mitigating developing risks, are embedded in or shaped by the prior four types of investments. However, IT investments to manage risk typically introduce new technological risks that must be managed as well. For example, IT investments might include encrypting data when stored or being transported, finding sensitive and regulated data on a network, detecting and preventing network intrusions or malware, or hiring an external firm to try to penetrate sensitive areas of a network. Sometimes this may feel like a vicious circle—invest in technology to manage various risks that result in the need for more technology to manage the increased technological risk . . . resulting in the need for more technology.
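To make “clear and measurable” concrete, here is a minimal sketch, in Python, of how a board or administration might turn these five value types into a proposal-review checklist. The structure, names, and example proposal are purely illustrative assumptions, not a description of any existing system.

```python
from dataclasses import dataclass, field

# The five value types discussed above, used as a simple review rubric.
VALUE_TYPES = [
    "value to those we serve",
    "how work is done",
    "how decisions are made",
    "communication and collaboration",
    "risk management",
]

@dataclass
class ClaimedImprovement:
    """One claimed improvement: which value type it addresses and how it will be measured."""
    value_type: str   # one of VALUE_TYPES
    dimension: str    # "quality", "timeliness", or "cost"
    baseline: str     # current measurement, e.g., "14 days to process an application"
    target: str       # proposed measurable target, e.g., "5 days"

@dataclass
class ITProposal:
    name: str
    estimated_cost: float                              # total cost in dollars
    improvements: list = field(default_factory=list)   # list of ClaimedImprovement

def review(proposal: ITProposal) -> list:
    """Return the questions a board should ask before approving the proposal."""
    issues = []
    if not proposal.improvements:
        issues.append("No measurable improvement is claimed at all.")
    for imp in proposal.improvements:
        if imp.value_type not in VALUE_TYPES:
            issues.append(f"Unclear which value type '{imp.value_type}' maps to.")
        if not imp.baseline or not imp.target:
            issues.append(f"The '{imp.dimension}' claim lacks a baseline or target, so it cannot be measured later.")
    return issues

# Example: a hypothetical proposal whose only claim is too vague to verify after the fact.
proposal = ITProposal(
    name="New admissions system",
    estimated_cost=12_000_000,
    improvements=[ClaimedImprovement("how work is done", "timeliness", baseline="", target="faster processing")],
)
for issue in review(proposal):
    print(issue)
```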

The answers to the overall questions posed above can help administrators and boards determine whether an IT investment will be worth the return. And of course, the results should be measured after the fact to determine the actual return on investment (ROI) of the project.

Let’s look a bit further at each of these drivers of campus investments in IT and how results might be measured.

Improving value to those we serve. At the heart of the value of any institution of higher education is its ability to help individuals, groups, and communities (local, national, or global) learn. What distinguishes those within the community is their maturity as learners. Whether in the lab, the lecture hall, or the library, the focus is on learning, and proposals to invest in information technology to improve learning in higher education have been around for decades.

Sometimes we think technological investments to improve learning are something new, but I was amused recently by an email from a colleague who referred me to a Smithsonian.com article titled “Predictions for Educational TV in the 1930s.” The article included a graphic presenting the possible future of higher education using a new and exciting form of information technology: television.

While we all might chuckle a bit at this reference, it is actually quite descriptive of what we have attempted to do with information technology— regardless of the form of media. The question to ask ourselves is, “Since we have been investing in learning technologies since the 1930s, what are the resulting improvements?”

As an academician who has been involved in the last 30 years of IT investment, I find this to be one of the hardest questions to answer. Have all of these investments improved the quality, speed, or cost of learning in higher education? Some may argue that there is no question that PowerPoint slides are much better than lecture notes handwritten by professors, but that confuses teaching with learning. Improving the quality of faculty presentation slides does not equal improvement in learning.

In 2009, the U.S. Department of Education released a meta-analysis of more than 50 research studies of online education between 1998 and 2008. What they found was that online learners performed better, in terms of learning outcomes, than traditional classroom students. Furthermore, they found that students in hybrid classes, which used both online and face-to-face learning environments, did better than those in either purely online or purely traditional face-to-face environments. Why were online-learning programs found to be more effective? The reasons ranged from the fact that online media allowed for more time on learning tasks, to individualized instruction, to opportunities for collaboration, to opportunities for reflection.

As higher education considers a variety of new investments in learning-management systems, learning analytics, and so forth, some experts are increasingly asking and attempting to answer the questions, “What does it mean to improve learning?” and “How would we measure improvements in learning?” I have long thought that higher education’s lack of sophistication in answering these basic questions constitutes its Achilles’ heel. Many newer, and generally non-traditional, efforts to improve learning are increasingly focused on personalized learning. Such investments may require completely rethinking what our learning objectives are, how we assess whether someone has achieved them, the various learning experiences that could help students achieve them, and the learning resources required to support those experiences. Simply rushing to buy a new learning-management system, predictive analytics tool, or adaptive instruction engine cannot, in and of itself, improve learning.

Improving how work is done. Past research into why some IT projects fail and some succeed has turned up interesting results that are relevant here. In one study, after an analysis of 200 cases, it appeared that when an organization focused first on technology within a functional organizational boundary—say, the student-aid office or the finance department or the purchasing department—the likelihood of success was significantly lower than when an organization focused first on what might be called “enablers of processes” (not bound by organizational boundaries) in an integrated framework. By enablers I mean the following:

  • The work flow;
  • The policies and rules;
  • The motivation for and measurements of various functions involved in the work;
  • The human-resource requirements; and
  • The design of the workspace.

All these “enablers” of a process contribute to the role of information technology. If these enablers are not rethought, along with the role of information technology, then the likelihood of success is much lower. For example, I remember earlier in my career being told that we needed a new admissions system for the university. I suggested we start by first looking at the admissions process, paying particular attention to the enablers listed above. We documented more than 150 steps in the workflow that included 17 different departments (one of which was the admissions department), antiquated and misinterpreted policies and rules, competing or non-existent motivation and measurements, antiquated job descriptions, and poor workplace configuration. When we documented all this with a simple set of tools, we quickly found we could eliminate nearly 100 of the steps without changing any technology at all. We also made a few changes in the admissions system itself to make the process work better. Altogether, we significantly improved the quality, speed, and cost of the admissions process. Students, parents, and the institution were all happier.
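The “simple set of tools” mentioned above can be as basic as a structured inventory of steps. Here is a minimal sketch, in Python, of how such an inventory might be recorded and tallied; the field names and example steps are hypothetical illustrations, not the actual instrument used in that project.

```python
from dataclasses import dataclass

@dataclass
class WorkflowStep:
    """One documented step in an administrative process (e.g., admissions)."""
    description: str
    department: str
    policy_basis: str                 # the rule or policy the step relies on, if any
    adds_value: bool                  # does the step change the outcome for the student or institution?
    requires_technology_change: bool  # does removing or changing the step require modifying a system?

def summarize(steps: list) -> None:
    """Print the totals a process-analysis team would bring to leadership."""
    departments = {s.department for s in steps}
    removable = [s for s in steps if not s.adds_value and not s.requires_technology_change]
    print(f"{len(steps)} steps across {len(departments)} departments")
    print(f"{len(removable)} steps could be eliminated with no technology change")

# Illustrative fragment of an inventory; a real one would list every documented step.
steps = [
    WorkflowStep("Re-key application data into a second system", "Admissions",
                 policy_basis="none", adds_value=False, requires_technology_change=False),
    WorkflowStep("Verify residency status", "Registrar",
                 policy_basis="State residency policy", adds_value=True, requires_technology_change=False),
]
summarize(steps)
```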

I have seen far too often that organizations rush to buy a shiny new technology gadget without first thinking through the process enablers. As a result, they simply make a bad process more expensive and even harder to change.

Why is this important for higher education boards? Given the financial pressures on most institutions, it is important not to waste resources on projects that don’t improve how the institution’s administrative work is done. Process analysis and design are key to improving the success rate of IT investments. If an IT project is housed organizationally inside a particular division, such as budget, facilities, or academic affairs, there are too often political boundaries to what can be considered in analyzing and reengineering processes. Process analysis and design needs an institution-wide perspective, and the board has the ability and responsibility to make sure that the planning and execution of IT projects receive this kind of mandate. Boards can ask the questions of administrative leadership that facilitate process analysis and design across the institution, not merely within arbitrary organizational boundaries that are too often influenced by personality instead of process. This is a huge issue, and attention to it can significantly increase the likelihood of success for a college or university investing in technology.

One of my favorite resources in this area was published 20 years ago, but its lessons generally have not been heeded. Daniel Seymour’s book Once Upon a Campus: Lessons for Improving Quality and Productivity in Higher Education (Rowman & Littlefield Publishers, 1995) recommends deliberate attention to cross-functional design of processes in higher education. He outlines examples of processes that can be improved and ways to do so that can help improve the return on investment we receive from IT. I recently reread the book and found it every bit as relevant today as when I read it the first time.

Improving how we make decisions. The quality, timeliness, and cost of our decision making have a profound effect on every aspect of higher education—how we improve the value we provide those we serve, how we improve management of our work, how we manage risk, and how we communicate or collaborate.

It is hard to read anything about information technology today without a reference to big data, business analytics, or business intelligence. But just buying the latest technology for gathering big data and conducting sophisticated analytics will not solve all of our problems. Critical decisions for a university might include what academic program to start or stop, when to build a new classroom building, whom to target for large potential donations, which faculty members should receive tenure, or whether to approve a research center. Too often, little time is taken to clearly define what decisions need to be supported, what it means to improve the decision-making process or outcomes of decisions, and how the improvement will be measured.

These are questions the board should make sure are asked and answered. Furthermore, when the types of decisions I’ve just enumerated are made poorly, late, or at greater cost than necessary, the consequences can be devastating—the wrong academic programs are continued (or never stopped) while promising ones are never started, overly costly buildings remain in use or unnecessary new buildings are built, potential large donors remain unidentified, a faculty member is promoted who ultimately proves to be more a liability than an asset, or the wrong research center is approved or allowed to continue.

Sometimes we can be a little simple-minded about decision making, thinking that more data equals better decisions. When I worked at IBM early in my career, I remember hearing over and over, “We are data rich and information poor.” IBM was realizing that more data does not mean improved decision making. Whatever the IT investment in supporting decisions, it should result in improvements to the speed, quality, or cost of decision making for the institution.

Improving how we communicate and collaborate. As with any community, success is greatly influenced by how well we communicate and collaborate. Who can question that past investments in information technology have improved the timeliness and cost of communication and collaboration? Today, an institution of even modest size facilitates thousands, if not millions, of email messages every day. Trying to distribute even a small fraction of those messages using paper and “snail mail” would radically increase the cost and time needed to process the messages. Yet for all the past investments, we are only now beginning to ask such questions as, “What does it mean to further improve communication and collaboration? And how would we measure the improvement?”

Without careful attention to these questions, we can simply keep spending in such a way as to decrease the cost and time required to process a message, but without making measurable improvements in the quality of our communication and collaboration. A simple example of this is that too often larger universities will have more than one calendaring and email system on campus. Not only are such redundancies wasteful in terms of spending more than is necessary (thereby taking money away from the core mission of learning), but such redundancies also increase the cost and complexity of collaboration across campus. Rather than doing a very simple calendar search to schedule a meeting, we end up having an army of administrative staff trying to coordinate calendars across departments, colleges, or divisions. The duplication of effort and investment in networks, phone systems, email/calendar systems, video conferencing, and so on, is as wasteful as it is complicating. The board can be incredibly helpful in making sure such is not the case at its institution.

Improving how we manage risk (legal, operational, technical, reputational). Investments in information technology have the potential to improve risk management by helping to keep track of risks and charting progress in appropriately mitigating them. Too often, however, institutions focus more on the other types of IT investments that I’ve just discussed, and they forget about the risk that the technology itself introduces. Recent breaches in information security should have caused every board to take a hard look at whether risks in that realm have been appropriately identified and are actively being mitigated. Clearly, in this realm, an ounce of prevention is worth a pound of cure.

Boards should insist on a formal method of assessing IT risk as part of a larger enterprise risk management approach. Each year there should be a new assessment measuring progress in mitigating known risks and in identifying previously unknown risks. In addition, the board should insist on getting reports on the number and severity of serious security incidents and the costs associated with such incidents. Given the rapidly escalating and dynamic nature of IT risks facing universities, simply having a reasonable idea of what risks a college or university faces is vital. Without knowing and prioritizing such risks, IT investments are simply guesses or are driven by trends in the marketplace. Risk assessment should be included in any IT investment.
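One way to picture such a formal method is a simple risk register. Here is a minimal sketch, in Python, of a register entry and the kind of annual summary a board might request; the fields, severity scale, and example entries are illustrative assumptions, not a prescribed framework.

```python
from dataclasses import dataclass

@dataclass
class ITRisk:
    """One entry in an institutional IT risk register."""
    name: str
    category: str            # "legal", "operational", "technical", or "reputational"
    severity: int            # e.g., 1 (low) to 5 (critical)
    mitigation_status: str   # "not started", "in progress", or "mitigated"
    incidents_this_year: int
    incident_cost: float     # dollars spent responding to incidents tied to this risk

def annual_board_report(register: list) -> None:
    """Summarize the register the way a board might want to see it each year."""
    open_risks = [r for r in register if r.mitigation_status != "mitigated"]
    total_incidents = sum(r.incidents_this_year for r in register)
    total_cost = sum(r.incident_cost for r in register)
    print(f"{len(register)} known risks, {len(open_risks)} still open")
    print(f"{total_incidents} incidents this year, costing ${total_cost:,.0f}")
    for r in sorted(open_risks, key=lambda r: r.severity, reverse=True):
        print(f"  [{r.severity}] {r.name} ({r.category}): {r.mitigation_status}")

# Hypothetical example entries.
register = [
    ITRisk("Unencrypted sensitive data on departmental servers", "legal", 5, "in progress", 1, 250_000),
    ITRisk("No intrusion detection on the research network", "technical", 4, "not started", 0, 0),
]
annual_board_report(register)
```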

Let me conclude with a specific risk needing the attention of every board. Almost every trustee knows about the growing problem of deferred maintenance with respect to the physical campus. What is often unrecognized is the growing deferred maintenance bill associated with the virtual (IT) campus. While the bill for deferred maintenance of IT may not be as large as the one for the physical campus, if it is even 10 or 20 percent of that of the physical campus, it should be of concern. Thus, if the deferred maintenance for a large physical campus is estimated at $500 million, it is not unreasonable to assume the cost of deferred maintenance for the virtual campus is in the area of $50 million to $100 million.

I believe that is a very conservative estimate. When I talk to IT people across higher education and ask about the replacement or upgrading that they feel should be taking place across their IT infrastructures and applications, they rather easily cite figures that amount to much more than 20 percent of the costs of deferred maintenance projects on the physical campus.

What makes this a particular challenge and risk is that the useful life of IT equipment and software is far shorter than that of the physical campus. We often attempt to extend the useful life of facilities with good preventive maintenance, new paint and carpeting, and the like. IT has no such equivalent. When IT equipment fails, it does not do so gracefully: it has two states, working and not working.

Worse yet, while a facility built today might be expected to last 50 or more years, the vast majority of the IT investment should be refreshed every five to six years, with a few minor exceptions such as physical wiring. For a campus with 35,000 to 40,000 students, it is not unreasonable for the network to cost $60 million or more to install. That is like building a new building every five or six years just for the campus network. Not only do you have the campus network but also all the computing and storage equipment, the electronically enabled classrooms, special computer systems for research labs, and administrative systems—you can easily get to a total IT investment of over $200 million for a modest-sized university. If 10 to 20 percent of that historical amount is needed to keep it functioning, an institution might be looking at $20 million to $40 million a year to simply maintain or replace what it already has—with no improvements. The same ratio of 10 percent to 20 percent of total IT investment is fairly generalizable—for institutions large and small.
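For boards that want to test these rules of thumb against their own numbers, here is a back-of-the-envelope sketch, in Python, using the illustrative figures from the preceding paragraphs; every input is an assumption to be replaced with local data.

```python
# Illustrative inputs drawn from the figures cited above.
physical_deferred_maintenance = 500_000_000  # estimated deferred maintenance for a large physical campus
total_it_investment = 200_000_000            # cumulative IT investment: network, classrooms, labs, systems
ratio_low, ratio_high = 0.10, 0.20           # the 10-20 percent rule of thumb

# Deferred maintenance on the "virtual campus," estimated as a share of the physical figure.
it_deferred_low = physical_deferred_maintenance * ratio_low
it_deferred_high = physical_deferred_maintenance * ratio_high

# Annual spending just to keep existing IT working, as a share of historical investment.
annual_low = total_it_investment * ratio_low
annual_high = total_it_investment * ratio_high

print(f"IT deferred maintenance: ${it_deferred_low/1e6:.0f}M to ${it_deferred_high/1e6:.0f}M")
print(f"Annual refresh budget:   ${annual_low/1e6:.0f}M to ${annual_high/1e6:.0f}M per year")
# -> IT deferred maintenance: $50M to $100M
# -> Annual refresh budget:   $20M to $40M per year
```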

Many institutions thus face their IT investments becoming obsolete—and thereby increasingly risky—because there is no disciplined plan for replacing them. The big question for trustees is how the IT investment will be maintained and when the relevant systems will eventually be replaced. If an institution is going to invest in technology—and each one must—the institution also needs a plan for replacements and upgrades, just as is the case with campus facilities. Without such a plan, unwelcome financial surprises are inevitable.

Conclusion

Today’s higher education institution requires an increasingly complicated integration of people, processes, and tools to support learning on the part of faculty and students and to provide an appropriate return on the money invested in the institution by all its constituents. Realizing value from the growing investments in information technology should be a concern for every board: aside from labor and facilities costs, investments in IT are likely to be the largest expense facing a college or university. Colleges and universities can realize specific types of value from investments in information technology. But given the increased costs of IT, boards should be vigilant in asking the important questions that they can best ask before new projects are launched—and in insisting on measurable results as projects are put in place.