What’s Your Viewpoint (Survey)?

By Lesley McBain | April 24, 2024 | Blog Post

Opinions expressed in AGB blogs are those of the authors and not necessarily those of the institutions that employ them or of AGB.

Boards and presidents are increasingly being asked by external stakeholders to provide data about their students’ viewpoints on a variety of political and social issues, as well as on their perceptions of the value of their education. (In some cases, this is due to legislative mandate.) A common means of gathering such data is survey research.

In 2000, one political science scholar called surveys “a fundamental data collection method for the social sciences,” which, when introduced in the 1940s, “provided the gold standard for measuring citizen opinions that are at the heart of democratic deliberation.”1 Today, surveying has evolved to the point that seemingly every interaction with a website, service, or purchase is coupled with a consumer survey, increasing survey fatigue among potential respondents. Further, online surveys have enabled a cottage industry of freelance respondents who are paid to complete them.

These issues make accurate and actionable data-gathering more difficult, but not impossible. This AGB Research blog outlines existing national data sources and uses a hypothetical example as a cautionary lesson in how not to construct a survey question regarding students’ viewpoints.

Existing National Surveys May Already Meet the Need for Viewpoint Information

National surveys have asked students for decades not only about their academic engagement and learning, but also about their viewpoints on social, personal, and political issues. Their results can provide aggregated data to participating institutions or systems. If the institution or system also participates in faculty or staff surveys, those data can round out the picture. Further, the survey administrator may be able to produce custom analyses of an institution or system’s data. Even if an additional fee for analysis is involved, this will likely be faster and cheaper than developing a new survey in-house. It can also forestall questions about internal surveys’ objectivity or security.

For example, the Higher Education Research Institute (HERI) Cooperative Institutional Research Program (CIRP), run cooperatively by UCLA and the American Council on Education (ACE) as of 2023, conducts surveys of faculty, staff, and students. The annual CIRP Freshman Survey (TFS) dates back to 1966 at ACE and to 1973 at UCLA. In addition to questions about academic backgrounds and aspirations, it asks about students’ self-characterized political views (ranging from “far left” to “far right”), their level of agreement with a rotating set of statements (including “My political views closely resemble those of my parent(s)/guardian(s)”), and their life objectives (including “influencing the political structure” and “keeping up to date with political affairs”).2

Other current TFS questions ask about students’ activities in the previous year, including whether they talked about politics and publicly expressed opinions about causes (e.g., via a petition). In addition, students are asked to rate their strengths relative to “the average person their age” in areas such as thinking critically, “ability to discuss and negotiate controversial issues,” and “openness to having my own views challenged.”3 Participating institutions can request customized reports of aggregated data.

Another example is the National Survey of Student Engagement (NSSE). It focuses on “first-year and senior students’ participation in programs and activities that institutions provide for their learning and personal development.”4 NSSE measures both the time students report putting into their academic and extracurricular pursuits (e.g., studying for class or participating in college sports) and what they say their coursework and instructors emphasize (e.g., analysis, instructor feedback, writing). As with the HERI surveys, NSSE offers companion surveys (for faculty and for entering students), and customized analyses of participating institutions’ data are available for a fee.

NSSE questions ask how often students incorporated “diverse perspectives (political, religious, racial/ethnic, gender, etc.) in course discussions or assignments” or “had discussions with people with political views other than your own” during the “current school year.”5 In addition, the instrument asks students to gauge how much their institutional experience has contributed to their development of critical thinking and job-related skills, as well as “understanding people of other backgrounds (economic, racial/ethnic, political, religious, nationality, etc.).”6

As just these examples show, a wealth of national-level data exists not only on academic and co-curricular experiences but also on student viewpoints. Even if an institution or system does not participate in a particular survey, published research can be consulted for broad trend data.

Garbage In, Garbage Out: Specific Design Pitfalls in Viewpoint Surveys

If campuses or systems choose or are legally required to develop an in-house viewpoint survey rather than using third-party surveys, the old phrase “garbage in, garbage out”7 should be heeded. This begins with survey design. While boards and presidents should avoid going deep into the weeds of survey construction, they should be critical consumers of survey data as part of their fiduciary duties.

Bad Question Design = Bad Data

Take the following hypothetical viewpoint-survey question designed to be sent to a campus community of faculty, administrators, staff, and students: “Dogs should be allowed everywhere on campus to ensure student success” (Strongly Agree, Agree, Disagree, Strongly Disagree).

Regardless of one’s viewpoint on dogs in general or on campus, the question and its possible answers are constructed as a blanket statement devoid of nuance. The presumption is that dogs everywhere will ensure student success. This ignores the reality of allergic and/or phobic students who would not succeed in a dog-centric environment. There is no option to remain neutral. Further, there is no room for respondents to provide additional input. Some hypothetical input might include:

  • “This is a waste of money! No wonder the public doesn’t value higher education anymore!”
  • “I’m severely allergic and would require an ADA accommodation allowing me to teach online full-time.”
  • “My dorm-mates can barely look after themselves. I’d be scared for the poor dogs!”
  • “Love it in theory. In reality, I wouldn’t get any work done. Especially with my goofball dog, lol.”
  • “A doggy daycare allowance for employees would be better. Some people are afraid or allergic.”
  • “No fake ‘emotional support dogs’ with fake Internet ‘doctors’ notes.’ Real service dogs only.”
  • “No! Professor Z’s nasty smelly dog is bad enough! It bit my TA, but they’re scared to tell on the beast!”

As these hypothetical responses show, once individual viewpoints are actually solicited, the framing of the original question and its answer options is much too limited. Thus, even if results showed that 99 percent of respondents selected “strongly agree” or “agree,” the question’s design flaws would make those data unreliable. In addition, major considerations the question does not address include, but are not limited to:

  • Liability (both organizational and individual dog owner);
  • Other legal issues (e.g., animal-control laws, ADA, workplace-safety regulations, sanitation, etc.);
  • Cultural sensitivities (potential offense to other cultures or religions represented on campus);
  • Budget impacts (increased cleaning, insurance, and landscaping costs, to name a few).
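
To make these pitfalls concrete, here is a minimal sketch of how such an item might be restructured, assuming Python; the wording, field names, and validation rule are hypothetical illustrations, not drawn from any instrument discussed above. The revised item narrows the claim, adds a neutral midpoint, and leaves room for written input.

```python
from dataclasses import dataclass, field

# A 5-point scale with a neutral midpoint, unlike the 4-point
# forced-choice scale in the hypothetical dogs-on-campus question.
LIKERT_5 = [
    "Strongly Disagree",
    "Disagree",
    "Neither Agree nor Disagree",  # the neutral option the original item lacked
    "Agree",
    "Strongly Agree",
]

@dataclass
class SurveyItem:
    prompt: str
    choices: list = field(default_factory=lambda: list(LIKERT_5))
    allow_comment: bool = True  # free-text box for the nuance shown above
    allow_skip: bool = True     # respondents may decline to answer

    def validate_response(self, choice: str | None, comment: str = "") -> bool:
        """A response may skip (if allowed), pick a scale point, and/or comment."""
        if comment and not self.allow_comment:
            return False
        if choice is None:
            return self.allow_skip
        return choice in self.choices

# Hypothetical rewrite of the flawed item: a narrower claim with room for nuance.
item = SurveyItem(prompt="Designated outdoor areas of campus should allow leashed dogs.")
print(item.validate_response("Neither Agree nor Disagree"))  # True
print(item.validate_response(None))                          # True: skipping allowed
print(item.validate_response("Maybe"))                       # False: not on the scale
```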

Survey Insecurity Can Compromise Survey Data

Without proper security measures, surveys are vulnerable to manipulation by external stakeholders. For example, the link to the hypothetical dogs-on-campus survey question, if not properly secured, could be circulated to any or all of the following off-campus parties who could be assumed to support the idea and thus skew the data:

  • Local and national dog shelters/rescue groups;
  • Task platforms where dog-walkers are hired;
  • Local and national businesses catering to dogs;
  • Others, possibly including freelance survey-takers.

The same could be said for circulating the survey link to unconnected parties assumed to oppose the proposal; that could skew the data in the opposite direction. Either way, a survey that is not properly secured, in terms of both external manipulation and respondent protection, risks violating Institutional Review Board (IRB) rules for protecting respondents and producing unusable data.
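
As one minimal illustration of such a safeguard, the sketch below, assuming Python and a hypothetical respondent roster, issues each invited respondent a signed token so that a bare link forwarded off campus cannot be used to submit responses. A production system would also need token expiration, single-use tracking, and IRB-compliant data handling.

```python
import hashlib
import hmac
import secrets

# Kept server-side and never embedded in the survey link itself.
SECRET_KEY = secrets.token_bytes(32)

def issue_token(respondent_id: str) -> str:
    """Create a per-respondent token to embed in that person's unique survey link."""
    return hmac.new(SECRET_KEY, respondent_id.encode(), hashlib.sha256).hexdigest()

def verify_token(respondent_id: str, token: str) -> bool:
    """Reject submissions whose token does not match an invited respondent."""
    expected = issue_token(respondent_id)
    return hmac.compare_digest(expected, token)

# A circulated link without a valid per-respondent token is refused.
link_token = issue_token("student-0042")          # hypothetical roster ID
print(verify_token("student-0042", link_token))   # True: invited respondent
print(verify_token("outsider-999", link_token))   # False: forwarded or guessed link
```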

Political Viewpoint Questions: Illustrative Examples

With surveys of political viewpoints, careful question design is crucial. Examples from large survey-research institutes illustrate the level of nuance survey designers put into this type of measurement. One such instrument, the General Social Survey (GSS), has been conducted since 1972 by the National Opinion Research Center (NORC) at the University of Chicago and is currently funded by the National Science Foundation (NSF). It covers many aspects of US society, including political views, and is conducted in person.8

The current GSS sequence of political viewpoint questions, should respondents choose to answer, is rendered in list form below.9

1) Political Party Identification
   → Republican or Democrat → Strong or Not Very Strong
   → Independent → Closer to Republican or Democratic Party

2) Voted in 2016 U.S. Presidential Election
   → Yes → Candidate Voted For
   → No → Candidate Respondent Would Have Voted For

3) Voted in 2020 U.S. Presidential Election
   → Yes → Candidate Voted For
   → No → Candidate Respondent Would Have Voted For

4) Political Identification (7-point scale from “Extremely Liberal” to “Extremely Conservative”)

Source: Adapted from GSS 2022 Ballot 1 English, 2023. Arrows indicate question flow by response.

The Pew Research Center asks a similar multiple-part political viewpoint question. It begins with, “In politics today, do you consider yourself a Republican, Democrat, Independent, or something else?” As in the GSS, those who answer “Independent” or “Something else,” along with those who refuse to answer, are asked whether they lean more toward the Republican or Democratic Party. Those who initially answer either “Republican” or “Democrat” are asked whether they identify “strongly” or “not strongly” with that party.10
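
To show how this kind of branch-by-response routing can be encoded, here is a minimal sketch, assuming Python; it paraphrases the GSS and Pew flows described above rather than reproducing either instrument’s exact wording or routing rules.

```python
# Map each first-stage answer to its follow-up question, mirroring the
# branching in the GSS and Pew party-identification items.
PARTY_FLOW = {
    "question": "In politics today, do you consider yourself a Republican, "
                "Democrat, Independent, or something else?",
    "branches": {
        "Republican": "Do you identify strongly or not strongly with that party?",
        "Democrat": "Do you identify strongly or not strongly with that party?",
        "Independent": "Do you lean more toward the Republican or Democratic Party?",
        "Something else": "Do you lean more toward the Republican or Democratic Party?",
        None: "Do you lean more toward the Republican or Democratic Party?",  # refusals
    },
}

def next_question(flow: dict, answer: str | None) -> str:
    """Return the follow-up question for a given first-stage answer."""
    return flow["branches"][answer]

print(next_question(PARTY_FLOW, "Independent"))
print(next_question(PARTY_FLOW, None))  # refusals are also routed to the lean question
```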

Finally, some respondents are more interested in answering politically than truthfully, and others have signed up to take surveys for financial or other rewards; either type can influence results.11 This should be considered when reviewing data.

Campus and National Expertise Is Readily Available to Assist Boards and Presidents

While viewpoint research can be delicate, pitfalls can be avoided by consulting available expertise. On campus, institutional researchers and faculty are valuable resources who can advise on constructing appropriate surveys, if necessary, and on preventing them from being compromised. They can also help interpret data and provide research expertise and data that do not involve surveying, including expertise on the continually evolving field of AI.

For example, natural language processing can be used to identify patterns in large data sets (see Doan and Gulla12 for a technical discussion specific to political opinion). The rise of AI presents other challenges, as AGB has recently discussed in Trusteeship Magazine (see “Artificial Intelligence and the Future of Higher Education, Part 1”). However, if carefully vetted and used ethically with the assistance of experts in the field, AI tools can help mine data already on hand.
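
As one illustration of the general idea, the sketch below, assuming Python with scikit-learn and a handful of made-up comments, groups open-ended responses into rough themes; this is a generic topic-modeling pass, not the method of Doan and Gulla.

```python
# pip install scikit-learn
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical open-ended responses like those in the dogs-on-campus example.
comments = [
    "I'm severely allergic and would need an accommodation.",
    "Allergies and phobias make dogs everywhere a bad idea.",
    "Liability and insurance costs would rise sharply.",
    "Who pays for the extra cleaning and insurance?",
    "Love dogs, but I'd never get any studying done.",
    "Dogs would distract me from my coursework.",
]

# Convert comments to TF-IDF features, then factor them into three rough themes.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(comments)
model = NMF(n_components=3, random_state=0)
model.fit(tfidf)

# Print the top terms for each discovered theme.
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(model.components_):
    top = terms[weights.argsort()[::-1][:3]]
    print(f"Theme {i + 1}: {', '.join(top)}")
```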

Nationally, in addition to the previously mentioned sources, the Association for Institutional Research (AIR) has a wealth of resources on data, analytics, and survey-research methodology for institutions, systems, and foundations. The federal National Center for Education Statistics (NCES), while not a source of viewpoint-survey data, has many other surveys whose data can add context and nuance.

Takeaways

  • Student, faculty, and staff viewpoint data may already exist for an institution or system. Reviewing existing data from external research institutes that have spent decades refining survey instruments according to best practices is a cost-effective start.
  • Badly designed survey questions lead to bad data and bad decision-making. Nuance is key to understanding viewpoint surveys in particular. Without going into the weeds, boards and presidents should be alert to potential data pitfalls that may be traced back to badly designed survey instruments.
  • Institutional researchers (at institutional, system, or foundation level and at national research institutes), as well as faculty, have a wealth of empirical expertise to offer in responsible and effective surveying and data interpretation.


Lesley McBain, Ph.D., is AGB’s director of research.


Notes

1. Henry E. Brady, “Contributions of Survey Research to Political Science,” PS: Political Science & Politics 33, no. 1 (2000): 47–58, https://doi.org/10.2307/420775.

2. Higher Education Research Institute (HERI)/Cooperative Institutional Research Program (CIRP), American Freshman 2019 (2020), https://www.heri.ucla.edu/monographs/TheAmericanFreshman2019-Expanded.pdf; HERI/CIRP, “2022 CIRP Freshman Survey Data Tables,” (2023), https://heri.ucla.edu/wp-content/uploads/2023/10/DATA-TABLES-TFS-2022.pdf; American Council on Education (ACE), “ACE and UCLA ED&IS Announce Partnership on HERI,” (July 26, 2023), https://www.acenet.edu/News-Room/Pages/ACE-UCLA-HERI-Partnership.as.

3. HERI/CIRP, “2022 CIRP Freshman Survey Data Tables,” (2023), https://heri.ucla.edu/wp-content/uploads/2023/10/DATA-TABLES-TFS-2022.pdf.

4. National Survey of Student Engagement (NSSE), “What Does NSSE Do?”, n.d., https://nsse.indiana.edu/nsse/about-nsse/index.html.

5. National Survey of Student Engagement (NSSE), “NSSE 2024 U.S. English Version,” Questions 2 and 8, 2023, https://nsse.indiana.edu/nsse/survey-instruments/us-english.html.

6. National Survey of Student Engagement (NSSE), “NSSE 2024 U.S. English Version,” Question 18, 2023, https://nsse.indiana.edu/nsse/survey-instruments/us-english.html.

7. Generally attributed in principle first to Charles Babbage and in exact phrasing first to William Mellin. See Robert Hanna, “Babbage-In, Babbage-Out: On Babbage’s Principle,” unpublished manuscript, n.d., https://www.academia.edu/101462742/Babbage_In_Babbage_Out_On_Babbages_Principle_May_2023_version_ and William Mellin, “Work with new electronic ‘brains’ opens field for army math experts,” The Hammond Times, 1957, 10, 66.

8. NORC at the University of Chicago, “About the GSS,” n.d., https://gss.norc.org/About-The-GSS.

9. NORC, “GSS2022_Ballot1_English,” (January 2023), https://gss.norc.org/Get-Documentation/questionnaires, see “English Questionnaires, 2022 Cross-Section Ballot A,” pp. 134–137 of PDF.

10. Scott Keeter, Anna Brown, and Dana Popky, “Who Are You? The Art and Science of Measuring Identity,” Pew Research Center—Methodological Research, February 12, 2024, https://www.pewresearch.org/methods/2024/02/12/who-are-you-the-art-and-science-of-measuring-identity/?activeAccordion=accordion-6.

11. See Brian F. Schaffner and Samantha Luks, “Misinformation or Expressive Responding: What an Inauguration Crowd Can Tell Us About the Source of Political Misinformation in Surveys,” Public Opinion Quarterly, Volume 82, Issue 1 (Spring 2018), 135–147, https://doi.org/10.1093/poq/nfx042. See also Courtney Kennedy, Nick Hatley, Arnold Lau, Andrew Mercer, Scott Keeter, Joshua Ferno, and Dorene Asare-Marfo, Assessing the Risks to Online Polls from Bogus Respondents (February 18, 2020), Pew Research Center, https://www.pewresearch.org/methods/2020/02/18/assessing-the-risks-to-online-polls-from-bogus-respondents.

12. Tu My Doan and Jon Atle Gulla, “A Survey on Political Viewpoints Identification,” Online Social Networks and Media, Volume 30 (July 2022), https://doi.org/10.1016/j.osnem.2022.100208.
