Dispatches From the Dark: Student Perceptions of a University’s Response to COVID-19
In March 2020, everything changed for students enrolled in U.S. universities due to the COVID-19 pandemic. At one regional comprehensive university in the Mid-South, classes were suspended beginning March 16 for two weeks, allowing faculty a brief window in which to move classes and advising appointments online. Beginning March 20, all faculty and staff began working from home, while all classes and student services moved online. This scenario was repeated, with variations, throughout the American higher education landscape in spring 2020 (Usher et al., 2024).
The regional university in question, the University of Arkansas-Fort Smith (UAFS), faced challenges that were also experienced by institutions across the country. An institution of approximately 6,000 students located on Arkansas's western border, UAFS varied in its programs' readiness for fully online delivery. While some programs had long been delivered online and others were web-enhanced or hybrid, some programs had no web presence, and a number of faculty had no training in the use of learning management systems or teleconferencing software. Fewer than 10 programs were offered in the online format before March 2020. Results from the spring 2020 semester at UAFS were mixed: Some students managed the transition to online learning and successfully completed coursework; other students failed courses, earned poor grades, dropped classes, or withdrew from the institution. In addition to academic challenges, students faced financial, health, and emotional/social challenges that affected their learning.
In this respect, students at UAFS appear to have had experiences similar to those of students at other U.S. universities during spring 2020. Piotrowski and King (2020) identified a number of challenges for students in remote university operations, including the following: dealing with faculty inexperienced in online delivery; poor instructional materials; lack of in-person assessments such as live performances or laboratory experiences; online proctoring of exams; no prior experience of learning in an online environment; lack of computer skills, technology, or reliable internet; and, for working students, the stress of unemployment or of working in risky environments. In a May 2020 study, students were found to suffer more exhaustion, more cynicism, and lower professional efficacy during the COVID-19 pandemic than students in earlier studies of burnout (Gonzalez-Ramirez et al., 2021). Usher et al. (2024) described disruptions in students’ lives and social relationships that led to behavioral and psychological changes associated with motivation; students also reported an increased workload, a feeling that they had learned less in most of their classes, and less certainty about career plans or continued educational plans.
COVID-19 forced a move to distance learning for which many of the UAFS students, faculty, and staff were not prepared. In a period of two weeks, faculty members moved all advising opportunities and all instruction to online delivery in compliance with state mandates. Student services staff members sought to provide social engagement through technology. This study investigated student perceptions of the resulting online learning opportunities, interactions, and challenges during the COVID-19 outbreak in Arkansas.
The purpose of this qualitative case study was to examine student perceptions of and attitudes toward academic and social engagement in an emergency online university environment. The overarching research question was this: How did students perceive their experience of higher education during a pandemic? Additional questions guiding the study included the following: 1) How did students perceive changes in faculty engagement? 2) How did students perceive changes in engagement with student support staff? 3) How did students react intellectually and emotionally to their experience?
This study is a portion of a larger survey study that examined student perceptions of and reactions to course delivery, technology tools, online student services, faculty engagement, and social engagement on the UAFS campus during COVID-19. In this project, the team focused on the survey’s open-response questions to learn more about students’ emotional and intellectual reactions to their experience during the pandemic, reasoning that motivation and academic regulation are major factors (Piotrowski & King, 2020; Gonzalez-Ramirez et al., 2021; Usher et al., 2024) in academic success and degree completion. Transactional distance theory (Moore, 1972, 1997) posits that in a distance-learning situation, students experience a transactional distance from their instructor and peers. Transactional distance includes “the impact distance can have on understanding and perceptions, which can affect motivation. Physical separation leads to psychological and communication gaps, which create a space of potential misunderstanding between the instructor and the learner” (Stavredes, 2011, p. 69). The research team wanted to understand the barriers to persistence that students faced in this challenging environment. Faculty members, administrators, and student services professionals at universities throughout the country may be able to learn which strategies were useful and which were unsuccessful in reaching students during an emergency that required a shift to online learning.
Literature Review
Transactional distance theory was first introduced in 1972 (Moore, 1972) as a way of delineating pedagogical best practices for university correspondence education. The creator of this theory, Professor Michael Moore, sought to provide a framework that would define all aspects of distance learning needed to meet the needs of learners outside the traditional classroom setting (Moore, 2007). A basic assumption of transactional distance theory is that the pedagogical skills of the instructor have a greater impact on distance learning than the time and space that separate the instructor from the learner (Gorsky & Caspi, 2005). This assumption is key in the current research, which investigates students’ perceptions of instructors’ online teaching skills.
In transactional distance theory, distance is defined as a psychological separation that is influenced by the pedagogical constructs of structure, dialogue, and autonomy. These aspects of the theory are flexible enough to meet the needs of all programs that identify as having geographic distance as a distinguishing feature (Moore & Kearsley, 2005; Reyes, 2013). The pedagogical constructs of structure, dialogue, and autonomy must each be considered in the design of distance-learning programs.
Transactional distance, as defined by Moore and Kearsley (2005), is the “gap of understanding and communication between the teachers and learners caused by geographic distance that must be bridged through distinctive procedures in instructional design and the facilitation of interaction” (p. 223). Transactional distance is directly affected by the negative correlation between course structure and instructor/student dialogue: High levels of structure with low levels of dialogue increase the levels of transactional distance (Moore, 1991). Increasing the levels of dialogue between the instructor and student then becomes a major tenet in online course design: Increasing student interaction with both the instructor and with other students can positively affect a student’s perception of the instructor and decrease transactional distance (Dockter, 2016). However, levels of transactional distance may be acceptable for those students with high levels of autonomy; the construct of autonomy can be viewed as both the student’s ability to succeed due to individual independence levels and to the autonomous features of the learning materials (Benson & Samarawickrema, 2009).
In the context of the COVID-19 pandemic, students have experienced greater transactional distance from their professors than in their previous traditional classroom settings. This theoretical framework can provide necessary structure for understanding online learning best practices: For example, mobile devices are increasingly used by students as a supplemental or primary means of accessing learning, and instructors may need to restructure course offerings to be more compatible with mobile formats (Awadhiya & Miglani, 2016). Today’s distance learners are “a more mobile, more heterogeneous, and more geographically dispersed group when compared to most campus-based student cohorts” (Cross et al., 2019, p. 224).
Situating the Researchers in the Site
The researchers are faculty and administrators at UAFS housed in the College of Business and Industry and the School of Education. The team includes an academic associate dean (Voelkel), a professor of education (Henehan), a senior instructor of organizational leadership (Buck), an associate professor of electrical engineering technology (Han), and a senior instructor of office management technology (Bracken). Voelkel and Henehan have been research partners since 2014, combining workforce development with teacher education concepts. All five authors are part of a research accountability group. During spring 2020, this team frequently discussed the implications of COVID-19 for students; these discussions brought to light many similar struggles they all faced in maintaining student motivation and persistence. They felt that while some students continued to perform well, other students were lost and struggling. Consequently, they became interested in learning student perspectives on and perceptions of learning during COVID-19. The team expected students who were already in online programs to handle the transition well and anticipated that students in face-to-face courses would find the transition to online learning more difficult. They hoped to identify strategies that would be helpful in maintaining strong connections between students and their professors. Further, the team imagined sharing the findings with university administration in an effort to standardize expectations of online delivery across campus.
Potential bias
Because the research team came primarily from one of UAFS’s academic colleges, they had a similar focus on student success and considerable experience in online delivery of courses before the pandemic. As it was clear that different students had different experiences during the emergency online event, the team members had a tendency in their discussions to ascribe problematic program and class delivery to academic units other than their own. In other words, they tended to assume that their own classes were appropriately delivered online and that these classes satisfied student needs while classes in other academic colleges at UAFS were less successful. Researchers also looked at the emergency online event through the lens of faculty or administration rather than from the points of view of the students.
To overcome potential bias, the team members were careful to look at the statements provided by the students in the open-ended responses and to include examples that disagreed with the team members’ preconceived notions, as well as examples that supported those ideas. For example, the team believed that students in lower socioeconomic statuses would struggle most with access to technology. Furthermore, the team believed that generational poverty, which is common in the UAFS student body, would create added emotional and physical stress for students in that demographic. The team found that their preconceived notions did not always explain the data. For example, while the team’s assumptions about access to technology were confirmed, students in technologically advanced programs also struggled with access to computers with the appropriate processing, memory, and video capabilities to successfully complete work. Additionally, students in rural areas struggled to find access to reliable internet connections. While students who came from situations of generational poverty suffered emotional and physical stress, so did students from all demographics.
Methods
This qualitative case study is part of a larger study of student perceptions of the experience of learning in an emergency online environment. A case study approach was chosen to study the open-ended response data. A case study should be bounded, should focus on a single or collective case, and should focus on an event, process, program, or individual (Creswell & Poth, 2018). This study focuses on the experiences of students at UAFS, particularly on the event of emergency online learning in spring 2020. The data are bounded by the focus on the single institution, come only from participants enrolled in spring 2020, and center only on the events of online learning during the emergency COVID-19 shutdown.
The parent study was a 47-item survey sent to students enrolled at UAFS in summer 2020. This study was approved by the University of Arkansas-Fort Smith Institutional Review Board (IRB UAFS 20-043), based on compliance with the National Institutes of Health (NIH) Office of Extramural Research standards for Protecting Human Research Participants.
Data collection
The survey was sent electronically via Survey Monkey to students who voluntarily chose to participate. Links to the survey were also provided on several Facebook pages. Informed consent information was included in the email along with the survey. The survey consisted of four sections: demographic information, technology use before and after COVID-19, academic engagement before and after COVID-19, and social engagement before and after COVID-19. While most survey questions were closed-response, there were opportunities for open responses. This case study is based on analysis of the written responses to four open questions on the survey: 1) How have instructors increased engagement? 2) How have instructors decreased engagement? 3) How do you want your instructors to engage with you? and 4) Anything else you would like to share?
A total of 1,147 individuals accessed the survey disclosure; of those, 1,131 (98%) opened and responded to the first question regarding the purpose and risks associated with the survey. Sixteen individuals did not consent, and their participation in the survey ended. Additionally, students under age 18 and students who were not enrolled in spring 2020 were removed. Responses from 557 individuals composed the data set. The drop from the initial 1,147 students may be attributed to two factors. First, the list of potential participants was created from student enrollment at UAFS in spring 2020, which included a large number of high-school concurrent students, many of whom are under the age of 18. Second, the link to the survey was provided on Facebook, where it could be accessed by those under 18 as well as those who were not enrolled at UAFS in the spring semester. Since not every respondent answered every question, individual questions often had fewer than 557 respondents. Of the 47 questions, only three had fewer than 500 responses (a brief sketch of the screening steps follows the list below):
- Q17. Is there another type of technology that you would prefer to use?
- Q43. What virtual activities would you like to see/do to engage socially with XXXX?
- Q45. What are some reasons you were more or less involved?
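The participant screening described above amounts to three simple exclusions: no consent, under age 18, and not enrolled in spring 2020. The following is a minimal, hypothetical sketch of that logic in Python; the column names (consented, age, enrolled_spring_2020) are illustrative assumptions, not the actual Survey Monkey export fields, and this is not the team's actual workflow.

```python
# Hypothetical sketch of the three screening exclusions; column names are assumed.
import pandas as pd

def screen_participants(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop non-consenting respondents, minors, and those not enrolled in spring 2020."""
    screened = raw[raw["consented"]]                       # 16 respondents did not consent
    screened = screened[screened["age"] >= 18]             # remove under-18 (e.g., concurrent students)
    screened = screened[screened["enrolled_spring_2020"]]  # remove those not enrolled in spring 2020
    return screened

# Toy frame standing in for the survey export
raw = pd.DataFrame({
    "consented": [True, False, True, True],
    "age": [19, 22, 17, 34],
    "enrolled_spring_2020": [True, True, True, False],
})
print(len(screen_participants(raw)))  # -> 1 row survives all three screens in this toy data
```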
Participants
The gender breakdown of respondents is as follows: Among the 556 respondents who reported their gender, 71% (392) chose female, 28% (158) chose male, and 1% chose other. Respondents’ academic classification in the spring 2020 semester was as follows: 12.39% freshmen, 28.73% sophomores, 34.29% juniors, 20.83% seniors, 0.72% graduate students, and 3.05% other (such as lifelong learners, 60+, high-school concurrent students, and postgraduates). The race/ethnicity breakdown of respondents is as follows: White (74.10%), Hispanic (7.91%), Asian (5.94%), American Indian (4.50%), Black (2.16%), Multiple Ethnicity or Not Listed (3.24%), and Prefer Not to Reply (2.16%).
Data on employment and dependents were also collected. COVID-19 affected the employment of 63% of respondents; 36.33% were essential workers, 8.99% began working from home, and 18.35% became unemployed, while 24.82% of respondents were not working prior to COVID-19.
Data analysis
In this study, the open responses from four survey questions were analyzed: 1) How have instructors increased engagement? 2) How have instructors decreased engagement? 3) How do you want your instructors to engage with you? and 4) Anything else you would like to share? Voelkel compiled lists of the open responses from each question, separating them into four documents—one for each question. Four team members (Henehan, Buck, Bracken, and Han) individually coded the data: Each coder examined the individual questions and then organized their codes into categories, looking holistically at all four questions. First, the team looked for answers pertaining to instructor and social engagement; next, they evaluated the open responses for evidence supporting or not supporting the theoretical framework of transactional distance; then the team looked for language that represented student intellectual and emotional responses. The team also used “in vivo” coding, taking words and phrases from the participants’ responses to describe their intellectual and emotional reactions to their educational experiences during COVID-19.
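As a purely illustrative aside, the step of separating open responses into one document per question and tallying recurring participant phrases could be sketched as below. The question keys, file layout, and phrase list are assumptions made for the example; the team's actual compilation and in vivo coding were done by hand.

```python
# Illustrative only: not the team's actual process, which was manual.
from collections import Counter
from pathlib import Path

# Hypothetical keys for the four open-ended questions
QUESTIONS = ["increased_engagement", "decreased_engagement", "desired_engagement", "anything_else"]

def split_by_question(responses: list[dict], out_dir: str = "open_responses") -> None:
    """Write one plain-text document per open-ended question (mirroring the four documents described above)."""
    Path(out_dir).mkdir(exist_ok=True)
    for q in QUESTIONS:
        answers = [r[q].strip() for r in responses if r.get(q)]
        Path(out_dir, f"{q}.txt").write_text("\n\n".join(answers), encoding="utf-8")

def tally_candidate_phrases(answers: list[str], phrases: list[str]) -> Counter:
    """Count how often candidate participant phrases (e.g., 'teach myself') recur across answers."""
    counts = Counter()
    for answer in answers:
        lowered = answer.lower()
        for phrase in phrases:
            if phrase in lowered:
                counts[phrase] += 1
    return counts
```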
To show how codes and categories led to the study themes, consider one student’s response to the question “How have instructors decreased engagement?”:
I can’t even count the number of emails I have sent with absolutely no response. This semester (summer II) we were notified that our professor had schedule conflicts and we would not be receiving any lectures or advanced communication. She has not been flexible or compensating in any way. In the spring, we were dumped with 150 hours of virtual clinical assignments that teachers forgot about until the last two weeks of class. I’ve had one teacher since COVID that has been accommodating and kept a schedule. I feel incredibly let down by people that are supposed to be my first line support. Accountability and communication would be a good place to start.
The passage was coded as follows by the four coders:
| Passage | Coder A | Coder B | Coder C | Coder D |
|---|---|---|---|---|
| I can’t even count the number of emails … | Lack of communication; Lack of compassion | Decreased engagement; Teach myself/fend for myself | Teach myself; Less caring | Self taught; Lack of communication |
The compiled list of codes included Lack of Communication, “Teach Myself” (an example of “in vivo” coding taken directly from the students’ words), and Lack of Compassion.
The group compared their individual coding frameworks and how their codes led to individual categorizations. Voelkel did not participate in the initial coding but served as the reviewer, comparing the individual coding schemes. Voelkel found that while the coding schemes were expressed differently, the codes identified were largely congruent. Once a set of codes was established, each coder reviewed their coding scheme and suggested several themes in the data. Using teleconferencing software, the team met virtually three times to compare coding schemes and to decide on a final list of codes and themes. The group looked at codes from all four questions and suggested themes from across the individual questions. For example, the four coders each offered suggestions for themes that were then negotiated into a list of final themes, shown in the table below (a simple sketch of one way such congruence could be checked follows the table):
Suggested Themes

| Coder A | Coder B | Coder C | Coder D | Final Themes |
|---|---|---|---|---|
| Handled well and transparent | Different experiences for different students | How students feel (emotion) | No themes suggested | Raw emotions |
| Financial concerns: disgruntled by tuition, fees, parking | Loss of connection | Technology and tools | | Perceived injustice |
| Health concerns | Grievance | Financial | | “Confused and in the dark” |
| Online proctor issues | “Fend for myself” | Instructors and teaching method | | Ghosted |
| Lack of structure from instructors | Technology issues | | | |
| In the dark: quality of education had declined and grades were suffering | | | | |
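As a hypothetical illustration of the kind of congruence check Voelkel performed by inspection, one could normalize each coder's code labels and compute pairwise overlap. The code labels below are stand-ins drawn from the single coded passage shown earlier, not the full coding schemes, and this computation was not part of the study's method.

```python
# Hypothetical illustration of checking congruence between coders' code lists;
# the study's actual review was done by reading and comparing the schemes directly.
from itertools import combinations

def normalize(codes: list[str]) -> set[str]:
    """Lowercase and trim code labels so superficially different labels can match."""
    return {c.strip().lower() for c in codes}

def pairwise_overlap(schemes: dict[str, list[str]]) -> dict[tuple[str, str], float]:
    """Jaccard overlap between every pair of coders' code sets (1.0 = identical sets)."""
    overlaps = {}
    for (a, codes_a), (b, codes_b) in combinations(schemes.items(), 2):
        set_a, set_b = normalize(codes_a), normalize(codes_b)
        union = set_a | set_b
        overlaps[(a, b)] = len(set_a & set_b) / len(union) if union else 0.0
    return overlaps

# Stand-in code lists taken from the single passage coded above
schemes = {
    "Coder A": ["Lack of communication", "Lack of compassion"],
    "Coder B": ["Decreased engagement", "Teach myself/fend for myself"],
    "Coder C": ["Teach myself", "Less caring"],
    "Coder D": ["Self taught", "Lack of communication"],
}
print(pairwise_overlap(schemes))  # e.g., Coders A and D share "lack of communication"
```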
The final coding scheme was selected to focus on the strong emotions displayed in the open student responses.
Issues of trustworthiness
While reliability and validity have been the traditional measures of trustworthiness in quantitative studies, the methodology literature has suggested multiple perspectives on how best to assess trustworthiness in qualitative studies (Bloomberg & Volpe, 2018; Creswell & Poth, 2018). In this study, the team used Creswell and Poth’s (2018) approach to validation: “Use of a validation process for assessing the accuracy of the findings as best described by the researcher and the participants” (p. 255). Creswell and Poth suggest choosing at least two from a number of validation strategies representing the lenses of the researcher, the participants, and the reader. For this study, the team used “prolonged engagement and persistent observation in the field” (p. 262) and a “peer review” or “debriefing” (p. 263). In other words, the team used its experience teaching in spring 2020 as one check on the accuracy of the data. The researchers used the team member who did not participate in coding (Voelkel) as a peer reviewer. Voelkel reviewed the codes from each researcher, looked for commonalities, and facilitated the meetings in which the final themes were selected.
Results
Four major themes were identified from the research: a) raw emotions; b) perceived injustice; c) confused and in the dark; and d) ghosted.
Raw emotions
“Raw emotions” was the first theme identified in the data. Students exhibited a variety of emotions in their survey responses. These ranged from positive (“My instructors we’re[sic] amazing in helping me manage last semester”) to negative (“Due to COVID, my classes sucked. I got nothing out of them”) to scared (“I am terrified. Though I struggle with some classes online, I feel safer being at home doing classes online”). Each student viewed the change to online learning through a different lens, and hence each expressed different emotions. The only constant was that each student exhibited an unfiltered honesty in their responses, most likely because the survey was anonymous. The bluntness of the remarks revealed that student emotions were raw and that the shift to online learning was a sensitive subject.
Some students were grateful for how the university handled the pandemic. One student stated, “My online classes were difficult but fair considering I was able to put my health in priority while also getting an education.”
Another remarked,
My professors and everyone working at UAFS that I have gotten emails from have done such a great job with everything. A number of my teachers have reached out and we have kept in touch. Thank you so much for being so attentive and helpful during this time, and always. I love my school!
Others were frustrated and angry at not having a choice, being overwhelmed, and the lack of compassion they felt from instructors. One student commented,
I believe we should get a choice to go sit in the classroom or move online. Rather than the 5 different options that we don’t actually get to choose…We are paying big money to attend college and it being switched online has affected me negatively mentally. I need engagement with others (I’m an extrovert) and I’m being so deprived of that, and I am struggling because of it.
Another student said, “The struggle to get things completed has been a challenge with all of the changes going on with my work.” Students also made several comments wishing their instructors would be more compassionate and understanding when it came to issues and deadlines. When asked, “How do you want your instructors to be more engaged?” one student answered “[being] more lenient regarding the software we may or may not have access to.” Another student stated,
I had a class that allowed me to complete all the classwork at my own pace and that was great but I’ve already received an email from a professor this semester telling us students if we get coronavirus we still have to do our work :/ [Emoji entered by student] No thank you, if I’m exposed and my lungs stop working I simply cannot do my classwork as a ghost—I refuse.
Perceived injustice
The second major theme that emerged was “perceived injustice.” As discussed in the first theme, students had many raw emotions during the pandemic. These emotions boiled over into perceived injustices. Students felt as if their education had been ruined and that they were “cheated of their money, educational performance [sic], and experiences that could be beneficial to our career.” Students’ perceived injustice centered on two main areas: money and technology.
The first perceived injustice was money. Students were upset that they had paid fees for parking, the gym, the library, and numerous other amenities and resources that they no longer had access to. One student stated, “I understand that finances have to flow for nice things to happen and be had on campus, but I don’t really understand why we have to pay certain fees if we don’t/can’t actively use the benefits[sic].” Students also remarked, “Activities fees should be waved because there will not be as much if any activities that students can attend”; many mentioned wanting a refund on their tuition.
The second perceived injustice concerned technology and technology issues. Understandably, some students were unequipped and unprepared to move to a completely online format. A large portion of student grievance related to the use of an online proctoring system called Examity. Some students were blunt, stating, “I HATE examity.” Others provided explanations for their frustrations with the proctoring system, stating, “The proctors are extremely rude and it slows down my computer where I can’t even take the test at times” and “Examity is very difficult when you are not in a setting like the library. It is very tedious and marks things I have no control over.” Students also expressed issues with technology in general, commenting that they had internet issues that made it difficult to complete work and that they lacked access to some equipment and software needed to complete projects and assignments. One student summed up the access issue by saying that “online resources should be … expanded, and the university should look into assisting those with limited resources for online classes.”
Confused and in the dark
The third major theme that appeared in the research was “confused and in the dark.” This student quote—“It is a week until school starts and I haven’t heard anything from any of my professors. I am confused and in the dark about how things are going to work”—provides an overview of how some students felt when their educational journey was set on a new path by COVID-19.
Students complained that their instructors were unorganized and provided little guidance on deadlines and assignments. One student stated,
The only deadline I had was to turn everything in by the end of the semester, so there was no structure and I didn’t know how well I was absorbing the knowledge because I didn’t have periodic assignments to get feedback on.
Another student said, “Lack of structure led to less or inefficient engagement.” Although it might be easy to think that students are happier without deadlines or regularly scheduled work, most students shared that they actually craved structure and clear communication in their classes, especially online classes. This is evident in another student’s statement: “Professors need to provide more structure when it comes to online teaching and supplemental materials as well as increase communication with students.” Only a few students did not express a need for more structure and communication in their classes. These were the students who also indicated that their “professors all pretty much stepped up after the pandemic hit” and that they stayed “very connected to my professors through email before and during.”
Ghosted
The final but likely the most telling theme is “ghosted.” According to Darxil (2018) on Urban Dictionary, ghosting can mean “the shutdown/ceasing of communication with someone without notice.” Sadly, students felt as though they were ghosted by their instructors during the pandemic. Student comments such as “I had a professor fail to contact us until the middle of finals week” and “I can’t even count the number of emails I have sent with absolutely no response” were common among the responses.
Students were upset not only by their instructors’ lack of response to their emails but also by ghosting that extended to information in the online class regarding assignments and deadlines. Students remarked that “little to no coursework was taught, and we did not know the material to complete the final project,” and “No explanation of assignments. Using the opportunity of an online class that NEEDS to be in person to leave the student to fend for themselves.”
Discussion
The current study investigated student perceptions of instructor engagement during the move to online education in spring 2020. The results of this analysis gave the team an understanding of how students were affected from the students’ own point of view rather than from the presumptions of faculty and administration. The researchers’ preconceived notions regarding poverty and lack of access to technology were supported; however, the results revealed that all students struggled in one way or another. The unfiltered responses given by students bring a unique aspect to the study, allowing faculty to draw on the students’ point of view to determine strategies for improving engagement.
The findings are congruent with recent studies of higher education students during COVID-19; however, while other studies relied on psychological testing or scales, the UAFS study captured the students’ own thoughts in describing their experiences. The raw emotion and dramatic language in the responses allowed students to express themselves more conversationally than a scale or predetermined response options would have. For example, had students relied only on a scale to express their emotions, feedback would not have been received on specific areas for improvement such as “I had a professor fail to contact us until the middle of finals week” and “I can’t even count the number of emails I have sent with absolutely no response.” A scale would not have provided this detail, and the need to investigate strategies for helping students feel less “ghosted” would have been missed.
Globally, students described strong emotions in response to the pandemic, which aligned with the theme of raw emotions. Kostina et al. (2021) described students experiencing a high state of anxiety, as measured by scores above 7 on a scale of 1–10. In a Turkish study of first-year students (Güner, 2021), word frequency analysis showed that participants mentioned the word Sadness 43 times, Anxiety 29 times, Bitterness 22 times, Confused 11 times, and Panic 10 times. Students reported feelings of loneliness due to isolation that led to negative behaviors like impulsivity and laziness (Popovic & Lim, 2020). Polujanski et al. (2020) also found, in a study of German medical students, that positive emotions such as happiness, pride, curiosity, and gratitude were associated with online learning during COVID-19: “Despite the COVID-19 situation, medical studies were more often associated with positive than negative emotions” (p. 3). All these studies support the current finding that students experienced strong emotions regarding online education during COVID-19. The UAFS study added to the discussion of emotions by allowing for more uncensored word choices. Word frequency analysis was not part of the coding of raw emotions in this study, and no single emotion dominated; instead, the coding revealed student emotions spread across the spectrum from positive to negative, giving the impression that students’ experiences were highly individual and that the only common factor was that students wanted their emotions to be expressed and heard.
The second theme of the current study, “perceived injustice,” is also supported by other recent studies. In Güner’s (2021) study of first-year students in Turkey, students reported shock and bitter feelings; they feared falling behind and not being able to perform well in their programs. In a study of American psychology students, Usher et al. (2024) found that students perceived online assignments as unimportant or “busy work,” that work was assigned just for points without contributing to learning objectives, and that a lack of scheduled work had reduced motivation. In another American study of education and business students, Gonzalez-Ramirez et al. (2021) reported that typical sources of frustration stemmed from the difficulty of self-regulating work and study time, navigating unclear instructions in the online environment, a perceived imbalance in work required during group projects, and unequal commitment to coursework among peers. Additionally, moving abruptly to remote learning brought about multiple changes connected to students’ learning environment, finances, social connections, motivation, and healthy habits. These findings seem to support the UAFS study’s finding of perceived injustice, although the methodology of the other studies did not invite the sort of aggrieved personal commentary the current open-ended questions invited. Again, the open-ended responses provided value in differentiating this study: Money and the lack of resources provided outside of teaching were mentioned repeatedly in students’ perceptions of injustice. Frustration over payment for services not received and over lacking access to what was needed for success appeared at a higher rate in the current study than in those reviewed globally. This indicates that, in students’ perceptions, what is taught is not the only important factor; students also need other services the university provides, such as its infrastructure. One student captured this injustice by stating, “Not all of us can afford good enough technology to take classes fully online. The ‘financial aid’ some of us get doesn’t help either.”
One of the more descriptive themes is “confused and in the dark,” a phrase that comes directly from a student’s open response in the data. Findings from recent studies support this descriptive approach. The phrase describes students who felt bemused or confused about instructions, course requirements, and future directions. Popovic and Lim (2020) describe confusion regarding technical requirements as well as a lack of persistence, self-preparedness, and self-engagement. In their study of psychology students, Usher et al. (2024) quote a participant: “I have never felt more stupid than I do right now. I feel like I’m teaching myself, and that’s very much the blind leading the blind” (p. 9). This quotation closely mirrors the UAFS student participants who felt as if they were teaching themselves. The current study added value to recent work in this area by highlighting that students need structure and guidance. Other studies found similar information, but the many open responses in this study that specifically named “structure” tell the researchers that structure and delivery are key to enhancing student perceptions.
Finally, participants in the UAFS study described a feeling of abandonment by their instructors, recounting experiences that ranged from infrequent communication to perceived lack of help to no communication at all from the instructor. This feeling was described as “ghosted.” Güner (2021) described a similar phenomenon, characterizing curiosity as akin to confusion: Students felt anxiety and questioned how courses would be taught, indicating a lack of communication. “The lessons started remotely, and I tried to learn and manage problems on my own, but I just couldn’t. I even cried” (Güner, 2021, p. 151). Again, most contemporaneous studies did not allow the kinds of open responses the UAFS team used to elicit reflection, so the language in other studies is not as dramatic, nor did it appear as frequently. The fact that so many students in the study felt “ghosted” may indicate that professors were struggling during this time as well, and that this struggle had a negative effect on student learning and perceptions. One student response regarding lack of communication also showed that students recognized their professors might be struggling too: “…I also hate having to wait on emails and responses! No one can help that, however, it’s just the anxiety tends to settle in when you don’t hear from professors…I focus on being understanding and patient instead of bombarding the professors who are surely doing their best just like everyone else.”
Although the UAFS research team acknowledges the limitation of convenience sampling of the institution’s own students during a very specific moment in time, the large number of respondents allowed rich textual analysis and “in vivo” coding that emphasized the actual words of the participants as they emerged from the qualitative data. Furthermore, the student population largely comprises first-generation students eligible for need-based financial aid, which may skew perceptions; however, UAFS student data are comparable to national averages for race, gender, and the proportions of traditional and non-traditional students. Therefore, the team feels that the results could still be transferable to the broader population of institutions of higher education, as strategies for increased student engagement could be delineated.
The area of emergency online education is ripe for further exploration. This analysis reviewed part of the data collected from a larger survey; as the team moves forward, they hope to look more closely at the information provided, such as differences in majors, organizational units, instruction, and class rankings, to see where students were most at risk and where they were most successful. The same methodology could be applied with the data separated by these categories, allowing researchers to see whether the dramatic responses came from a specific group or not. The language and detail provided by students make this study distinct, so exploring this avenue within set categories would allow researchers to dive deeper into the information without losing its uniqueness.
With this particular study, the research team has continued to analyze the quantitative data from the original survey to learn more about differences by gender and in responses to specific technology solutions. The UAFS team anticipates further quantitative and qualitative analysis to pinpoint and further study the individual differences that students experienced. Additional ideas for research include repeating the study with the same qualitative analysis to see whether perceptions have changed or improved since the beginning of the pandemic, when these experiences were not the norm; using the results to implement strategies noted by students; and using further qualitative or quantitative research to study the outcomes of those strategies. Given the rich descriptions and feedback gained from the open responses, opportunities for students to share anonymously and in an open format should be a part of future studies of student attitudes.
References
Awadhiya, A. K., & Miglani, A. (2016). Mobile learning: Challenges for teachers of Indian open universities. Journal of Learning for Development, 3(2), 35–46. doi: https://doi.org/10.56059/jl4d.v3i2.145
Benson, R., & Samarawickrema, G. (2009). Addressing the context of e-learning: Using transactional distance theory to inform design. Distance Education, 30(1), 5–21. doi: https://doi.org/10.1080/01587910902845972
Bloomberg, L. D., & Volpe, M. F. (2018). Completing your qualitative dissertation: A road map from beginning to end (4th ed.). Sage.
Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry & research design: Choosing among five approaches (4th ed.). Sage.
Cross, S., Sharples, M., Healing, G., & Ellis, J. (2019). Distance learners’ use of handheld technologies: Mobile learning activity, changing study habits, and the ‘Place’ of anywhere learning. International Review of Research in Open and Distributed Learning, 20(2), 223–241. doi: https://doi.org/10.19173/irrodl.v20i2.4040
Darxil. (2018, February 12). Ghosting. Urban Dictionary. https://www.urbandictionary.com/define.php?term=Ghosting
Dockter, J. (2016). The problem of teaching presence in transactional theories of distance education. Computers and Composition, 40(2), 73–86. doi: https://doi.org/10.1016/j.compcom.2016.03.009
Gonzalez-Ramirez, J., Mulqueen, K., Zealand, R., Silverstein, S., Reina, C., BuShell, S., & Ladda, S. (2021). Emergency online learning: College students’ perceptions during the COVID-19 crisis. College Student Journal, 55(1), 29–46.
Gorsky, P., & Caspi, A. (2005). A critical analysis of transactional distance theory. The Quarterly Review of Distance Education, 6(1), 1–11.
Güner, H. (2021). Examining of the emotional mood about their online education of first-year students beginning their university education with distance education because of COVID-19. Higher Education Studies, 11(1), 148–159. doi: https://doi.org/10.5539/hes.v11n1p148
Kostina, L. A., Abdullaev, S. S., Tarkhanova, N. V., Sergeeva, M. A., & Kubekova, A. S. (2021). Assessment of the psychoemotional sphere in students during the coronavirus pandemic. Ilkogretim Online - Elementary Education Online, 20(5), 1369–1372.
Moore, M. (1972). Learner autonomy: The second dimension of independent learning. Convergence, 5(2), 76–88.
Moore, M. (1991). Editorial: Three types of interaction. The American Journal of Distance Education, 5(3), 1–6.
Moore, M. (1997). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22–38). Routledge.
Moore, M. G. (2007). The theory of transactional distance. In M. G. Moore & W. Diehl (Eds.), Handbook of distance education (4th ed., pp. 89–105). Erlbaum.
Moore, M. G., & Kearsley, G. (2005). Distance education: A systems view. Thomson Wadsworth.
Piotrowski, C., & King, C. (2020). COVID-19 pandemic: Challenges and implications for higher education. Education, 141(2), 61–66.
Polujanski, S., Schindler, A. K., & Rotthoff, T. (2020). Academic-associated emotions before and during the COVID-19-related online semester—A longitudinal investigation of first-year medical students. GMS Journal for Medical Education, 37(7), 1–9. doi: https://doi.org/10.3205/zma001370
Popovic, B., & Lim, F. (2020). The mental health and well-being of university students during the COVID-19 pandemic. Journal of Pain Management, 13(4), 319–322.
Reyes, J. (2013). Transactional distance theory: Is it here to stay? Distance Learning, 10(3), 43–50.
Stavredes, T. (2011). Effective online teaching. Jossey-Bass.
Usher, E. L., Golding, J. M., Han, J., Griffiths, C. S., McGavran, M. B., Brown, C. S., & Sheehan, E. A. (2024). Psychology students’ motivation and learning in response to the shift to remote instruction during COVID-19. Scholarship of Teaching and Learning in Psychology, 10(1), 16–29. doi: https://doi.org/10.1037/stl0000256