Lessons Learned about Student Learning: Eight Test Cases

Trusteeship Article | February 10, 2014 (updated March 7, 2019)

For the past two years, AGB, with the generous support of the Teagle Foundation, has been engaging eight diverse institutions to improve their boards’ oversight of educational quality and student learning. Specifically, the project has had four pillars of focus:

  • Metrics of student learning (direct and indirect student learning outcomes);
  • Board assurance that institutions are engaging their students in high-quality learning experiences;
  • Changes in the work of the board to better focus on student learning and academic quality; and
  • New ways that faculty, administrators, and board members should engage one another.

The eight institutions—Drake University, Metropolitan State University of Denver, Morgan State University, Rhodes College, Rochester Institute of Technology, Salem State University, St. Olaf College, and Valparaiso University—have served as test cases to understand what information can be valuable to the board and how boards can adopt new practices to better oversee student learning. The experiences of each of these eight institutions provide insight into the elements that contribute to successful board engagement in the oversight of student learning and educational quality, as well as potential pitfalls to be avoided. Their progress—and setbacks—have yielded a set of lessons from which others can benefit:

Ensure a sufficient institutional-assessment capacity. The starting point for any institution and board is the capacity to assess student learning and academic quality. Without such institutional capacity—which consists of agreed-upon student learning goals and outcomes, an assessment infrastructure, and an institutional commitment to act on the findings—the board will have little foundation upon which to establish its work. While regional accreditation requires some degree of student learning assessment, not all institutions can provide boards with the necessary, comprehensive information about the institution and its various programs on a regular basis.

The first question boards should ask of academic leaders is: To what extent do we have adequate assessment data? Depending on the answer, the follow-up questions at many institutions may well be: What must happen in order to develop and maintain that ability? And when will this capacity be in place?

Start with what you already have. Because most institutions have made at least some progress assessing student learning outcomes and academic quality, a board would be wise to start by asking the institution what data it currently collects and how it uses it. Drake University in Iowa began its efforts by undertaking an audit to catalogue all the assessment data that it already had. The administration and staff identified 16 different student learning assessments currently in use or recently used, including standardized national tests such as the Collegiate Learning Assessment (CLA) and the National Survey of Student Engagement (NSSE), student licensure examinations in professional fields such as pharmacy, and institutionally developed assessment efforts that already existed and had legitimacy on the campus. That saved the institution from having to simultaneously build, test, and validate new assessment methods.

In addition, all institutions already have data related to student success and academic quality—such as persistence and graduation rates—that they can draw upon to share with the board on a regular basis. This data can be reported by variables important to the institution such as major or field of study, or race/ethnicity and gender.
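Disaggregating such a metric is a simple grouping calculation. The sketch below shows one way to break a graduation rate out by field of study or gender; the student records and field names are hypothetical illustrative data, not figures from any institution in the project.

```python
# Sketch: disaggregating a graduation-rate metric by variables the board
# cares about. All records below are invented for illustration.
from collections import defaultdict

students = [
    {"field": "Nursing",  "gender": "F", "graduated": True},
    {"field": "Nursing",  "gender": "M", "graduated": True},
    {"field": "Business", "gender": "F", "graduated": False},
    {"field": "Business", "gender": "M", "graduated": True},
    {"field": "Business", "gender": "F", "graduated": True},
]

def rate_by(records, key):
    """Graduation rate (%) grouped by the chosen variable (e.g. 'field')."""
    totals, grads = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        grads[r[key]] += r["graduated"]  # True counts as 1, False as 0
    return {k: 100 * grads[k] / totals[k] for k in totals}

print(rate_by(students, "field"))
print(rate_by(students, "gender"))
```

The same grouping function works for persistence rates or any other binary outcome; only the record field changes.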

Alumni surveys can also prove to be a source of valuable information. Rochester Institute of Technology in New York modified a fairly traditional alumni survey to add dimensions of student learning outcomes and educational impact. The survey now asks alumni to note the levels of effectiveness and importance of outcomes such as critical thinking, ethical reasoning and action, oral communication, and creative and innovative thinking.

Make academic quality a priority of the board and institutional leaders. Institutions that made the most progress in the AGB-Teagle project had a strong partnership between the chief academic officer and the chair of the academic affairs committee. The chief academic officer and the academic affairs committee chair can assemble the right working group and create time in busy agendas to identify valuable metrics and collect needed data. Those individuals are central to creating new board processes and restructuring board committee agendas. When both leaders make the board’s oversight of educational quality a priority, progress happens.

Furthermore, the board chair and president need to be publicly committed to the effort. They may not play a direct role, but their blessing is important to keeping efforts on track and ensuring that attention to educational quality remains a priority for the institution and the board.

Successful efforts to engage the board must also rely on assessment staff, faculty leaders, members of the academic affairs committee, and other campus administrators. That is especially the case because board oversight of educational quality is an endeavor that is likely to take more than a year to launch and embed. Some institutions in the project had turnover in key positions that impeded their progress. While boards cannot avoid that, they can work to ensure some stability on the academic affairs committee and in major leadership positions, recognizing that such efforts require many consistent hands.

Attach the effort to other activities. Boards of the eight participating institutions learned that by linking the oversight of educational quality to other priorities or activities, they were able to make more tangible progress. For example, Salem State University in Massachusetts found value in linking to a statewide “Vision Project” led by the Massachusetts Department of Higher Education. Morgan State University in Maryland linked its work on educational quality to its strategic planning work. Similarly, Metropolitan State University of Denver linked educational quality activities to its strategic plan and to a “Performance Contract” signed with the State of Colorado. By tapping the momentum of other efforts, boards and institutions can benefit from assessment work done for other purposes, find synergies, and avoid having to re-create the proverbial wheel.

Educate the board on education. Institutions that participated in the AGB-Teagle project found that they needed to educate board members on academic issues, educational quality, student learning goals, and outcomes assessment. They had to explain how and why they do program review, for instance, and the particulars of high-impact educational practices and the research supporting them. They spent time briefing board members on the language and practices of assessment, as well as the current debate surrounding its application.

Rhodes College in Tennessee sought to educate board members about the concepts of student achievement and educational quality and how these issues are currently thought of across higher education. They wanted boards to understand the topic they were being asked to discuss and the nuances surrounding it. Unlike other issues, such as finance, to which board members often bring deep understanding and personal expertise, academic quality and student learning, in particular, require additional education and information.

Institutions participating in the project took a variety of approaches to helping board members get up to speed. At some institutions, this education was embedded into committee meeting work. Other boards used retreats to convey this information. Rochester Institute of Technology gave Peter T. Ewell’s book, Making the Grade (AGB Press, 2nd edition, 2013), to the education committee and discussed several key questions: What matters when judging academic quality? How does the education committee see its role? What types of indicators does the board want to receive?

8 Ways to Gauge Student Learning

By Maurice C. Taylor

A team from Morgan State University participated in the AGB-Teagle project and, based on our experience, we recommend that boards and senior administrators follow these practices:

1. Know the major institutional assessments due each year. Over the course of the AGB-Teagle project, we at Morgan had two significant assessment initiatives underway: 1) a request that each college and school develop a strategic plan with outcomes metrics, along with a dashboard to benchmark progress towards the goals of the university’s overall strategic plan, and 2) a “Periodic Review Report” to accreditors that included mission-based assessment goals for student learning, academic programs, services, and administrative processes. Those initiatives contributed to the regents’ oversight of student learning outcomes during the project.

2. Provide board members with professional-development opportunities. Boards should ensure that their members attend meetings and engage in other activities focused on educational quality and student learning outcomes. At Morgan, the chair of the academic and student affairs committee participated in the AGB-Teagle project and made sure that other regents were briefed on the university’s efforts to develop metrics on student learning outcomes; the chair also raised related issues and called for reports on academic quality.

3. Include experts on information technology on board task forces. The Morgan team also benefitted from having a member who could translate the project goals of developing board-level metrics on learning outcomes into data that could be routinely gathered. Equally important was that person’s ability to explain to regents the scope and limitations of metrics.

4. Develop university-wide student learning outcomes. While a university-wide report and those for accreditors and legislators are important, they produce far more data and measures than board members need. As a result of the project, we began developing a concise set of measures related specifically to academic quality and student learning outcomes, linked to Morgan’s mission and vision statements.

5. Use metrics to inform board members’ questions. The purpose of reporting data and metrics specifically related to student learning outcomes is to assist board members in raising the right questions about academic quality at the institution.

6. Use meeting agendas effectively. Often board meetings are organized around hot topics that rarely relate to academic quality or student learning outcomes. Instead, they focus on budgets, facilities, athletics, and capital campaigns. Questions about curriculum, academic performance, and student learning outcomes should be a key part of the agenda.

7. Rotate the memberships of the board’s standing committees. Board members are often nominated or selected to serve because they possess a particular skill or expertise. For example, the academic and student affairs committee is often reserved for trustees who work in higher education. But boards should rotate the committee memberships so all board members have some experience with the issues concerning academic performance and student learning outcomes.

8. Take the long view. Board chairs, in particular, should take a view of the institution that extends beyond that of the president and other board members. It is ultimately the chair who is responsible for the board’s meeting agenda, committee assignments, the nature of the metrics the board receives, and whether it gives sufficient attention to the long-term measurement of student learning outcomes.

Maurice C. Taylor is a vice president at Morgan State University in Maryland and a board member at Juniata College in Pennsylvania.

Find the right focus. The challenge at many institutions is not too little data, but rather too much. Institutions have no shortage of data related to student learning and educational quality, ranging from grades in individual courses to student academic portfolios to nationally normed tests to academic program review reports. The challenge is to figure out how to “roll up” that data in a meaningful way so as to allow the board to focus on the right top-line data.

Rochester Institute of Technology has two indicators of student learning outcomes in its strategic plan. They roll up program-level assessment data of student learning outcomes from an annual progress report and provide the board with two core metrics: 1) the percentage of programs that meet or exceed the established benchmarks of student learning outcomes and 2) the percentage of programs that practice data-driven continuous improvement.
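This kind of roll-up is, at bottom, a small aggregation over program-level records. The sketch below illustrates the idea with invented data; the record fields (`meets_benchmarks`, `continuous_improvement`) are hypothetical names for illustration, not RIT's actual data schema.

```python
# Sketch: rolling up program-level assessment results into two top-line
# board metrics, in the spirit of the two RIT-style indicators described
# above. All program records here are invented for illustration.

programs = [
    {"program": "Mechanical Engineering", "meets_benchmarks": True,  "continuous_improvement": True},
    {"program": "Graphic Design",         "meets_benchmarks": True,  "continuous_improvement": False},
    {"program": "Applied Mathematics",    "meets_benchmarks": False, "continuous_improvement": True},
    {"program": "Hospitality",            "meets_benchmarks": True,  "continuous_improvement": True},
]

def rollup(records):
    """Return the two dashboard percentages across all programs."""
    n = len(records)
    pct_meeting = 100 * sum(r["meets_benchmarks"] for r in records) / n
    pct_improving = 100 * sum(r["continuous_improvement"] for r in records) / n
    return pct_meeting, pct_improving

meeting, improving = rollup(programs)
print(f"Programs meeting learning-outcome benchmarks: {meeting:.0f}%")
print(f"Programs practicing data-driven improvement:  {improving:.0f}%")
```

The board sees only the two percentages; the underlying program-by-program detail remains available for the deeper dives discussed below.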

Allow for targeted deeper dives. While the goal is to create high-level metrics for the board, institutions found it beneficial to focus more deeply on some key issues (critical thinking, for example) or on key program areas (graduate education or general education). The opportunities to go more deeply into an issue or a degree program, coupled with the broader, topline overview, helped boards feel comfortable with two levels of oversight.

For instance, the board at Morgan State University focused on its junior writing proficiency exam. This focus helped the board concentrate more intentionally on student learning across the institution. At Metropolitan State University of Denver, the board undertook an intensive investigation into its aviation programs. The provost’s office provided significant data on that program and engaged the board in a discussion of its strengths and areas for growth.

Rhodes College focused its deeper dive on “high impact practices” that have been shown to lead to deep learning. Examples included the percentage of students within each class who have participated in efforts such as learning communities, undergraduate research, study abroad and internships, and senior capstone projects.

At Metropolitan State University of Denver, the board held a retreat that dedicated the entire morning to student learning and educational quality. They created a topline summary report (supported by 70+ pages of appendices) that focused on academic goals, strategies, and measures of success to support the discussion. They also piloted a new academic dashboard to begin to build consistent reports over time. As part of the retreat, they developed a “Jeopardy” game of academic issues to engage their nine board members in creative ways without overwhelming them with data.

Framing Board Work

At St. Olaf College in Minnesota and Valparaiso University in Indiana, board leaders and administrators crafted a discussion around what the work of the academic affairs committee should be. To help frame that conversation, they identified a set of action verbs—for example, manage, oversee, monitor, ensure, approve, facilitate, and review—and topical areas—such as student learning, retention and completion, program quality, academic planning, and the educational environment. They then had the committee work through its charge by defining, discussing, and applying action verbs to content areas. They discussed, for example, whether the board monitors student learning, ensures student learning, or reviews student learning. What does each of those terms mean in relation to the work the board should be doing? In relation to academic quality?

Develop new board processes and use time differently. The oversight of student learning by most boards requires that they do things differently, such as developing new processes and habits. A place to start is with the charge of the academic affairs committee. Valparaiso University, for instance, realized that it needed a new committee charge that reflected an intensified focus on educational quality. (See box above.)

While student learning and academic quality are important, time must be intentionally scheduled in committee and board agendas to sufficiently engage the board. Otherwise such tasks tend to get shortchanged, as boards meet infrequently and often for short periods of time. Complex and nuanced issues and those in which the board has little experience simply require more time.

Institutions also developed the practice of intentionally structuring a 12- to 18-month calendar of topics related to educational quality for their boards to address. For example, at Rochester Institute of Technology, the first and third meetings of the education committee now highlight a particular academic quality practice or issue, such as academic program-level assessment, online education and academic quality, or international programs and global education. During each of these meetings, the committee engages in intentionally structured, focused discussions. The committee’s middle meeting of the year focuses on the academic quality dashboard—the institution’s overall indicators of academic success and student learning. Such intentional scheduling helps embed student learning firmly into busy meeting agendas. It also allows institutions and boards to create a long-term and integrated view of educational quality that can touch upon many elements.

Deepen the engagement of the board with faculty. The boards of the participating institutions were more easily able to oversee academic quality when they and the faculty created new ways to interact. All too often, faculty-board interactions are confined to faculty presentations or “dog and pony shows.” Through this project, institutions experimented with new ways to more deeply expose board members to faculty and to student learning.

For example, at Rhodes College, the president initiated “The President’s Common Table,” an informal working group of three board members, three faculty members, one staff member, and one student that serves as a conduit between the board members—who charged the group with strategic questions and tasks—and the internal college community. The president then, in response to board requests, created nine additional cross-functional common tables of faculty members, students, and staff that further discussed strategic issues related to educational quality. The college developed a structured way to engage various constituencies, including the faculty, in strategic conversations important to the board.

At Drake University, board members participated in “Mini-College,” an experience in which select board members took short, interactive courses built around high-impact pedagogies. Board members got to experience cutting-edge education and then debriefed the faculty on their experience during a lunch meeting.

Conclusion: Still Incomplete

The work of the eight teams yielded many insights and helpful materials that other boards might use to engage constructively with academic quality and student learning. Yet, the teams of board members, administrators, and faculty leaders found that progress also raised new and often more difficult questions. Two particularly challenging ones that surfaced and will need attention were:

  • How should institutions balance the competing goals of assessment for accountability and assessment for improvement? These two goals easily come into conflict: assessment findings that reveal areas needing improvement may not be those the institution wants made public.
  • How can institutions demonstrate the value-added of the education they provide? Most assessments focus on a level of demonstrated student proficiency. While that is important, institutions may be better served by understanding how much students learn and the approaches through which they learn the most. Correspondingly, they should know the areas in which students learn the least.

The institutions in the project made tremendous progress in the oversight of educational quality, but all would clearly acknowledge that their work continues. Even those institutions that started the two-year project with robust assessment efforts and growing board engagement would admit that they are only beginning to engage the board in the right way on student learning and educational quality.

Indeed, the work to engage the board appropriately in student learning and educational quality will be a long and complex journey for most colleges and universities. Those that find the work straightforward are probably not asking the necessary questions.

Valparaiso University’s Revised Academic Affairs Committee Charge (an excerpt)

As its overarching responsibility, the Committee shall foster such policies that contribute to the best possible environment for students to learn and develop their abilities, and that contribute to the best possible environment for the faculty to teach, pursue their scholarship, and perform public service, including the protection of academic freedom.

To that end, the Committee is responsible for the following areas:

Academic Programs. The Committee shall review and recommend to the Board approval of significant academic program changes or administrative changes established in conjunction with such programs that have substantial impact upon either the mission or the financial condition of the university. Such changes might include (a) creation of new academic programs, (b) significant revision of existing academic programs, and (c) discontinuation of academic programs. The Committee shall receive and may endorse reports on other academic program changes.

Academic Organizations. The Committee shall review and recommend to the Board approval of significant academic organizational changes that have substantial impact upon either the mission or the financial condition of the university. Such changes might include (a) the establishment of new academic organizations (e.g., campuses, institutes, colleges or schools), (b) significant changes to existing academic organizations, and (c) the discontinuation of academic organizations. The Committee shall receive and may endorse reports on other academic organizational changes.

Academic Relationships. The Committee shall monitor the policies and practices that govern the many different kinds of academic relationships between the University and other entities, such as joint ventures or contractual relationships with other academic institutions.

Assessment. The Committee shall periodically review the University’s practices in assessing the performance of its academic programs and practices and receive reports of such assessments.

Accreditation. The Committee shall monitor the University’s participation in all accreditation processes.
