QUESTION 1: How does intentional use of instructional technology influence student motivation and engagement in online college courses?
#1
Martin, F., Sun, T., & Westine, C. D. (2020). A systematic review of research on online teaching and learning from 2009 to 2018. Computers & Education, 159, 104009. https://doi.org/10.1016/j.compedu.2020.104009
Summary
Martin, Sun, and Westine synthesized online teaching and learning research published between 2009 and 2018 to identify common themes, research foci, and patterns in the field. Their review examines how course design decisions, instructional strategies, and technology-enabled practices are studied across a large body of literature, highlighting how specific technology uses are linked to learning processes and outcomes in online environments.
Evaluation
A major strength of this study is its broad scope and its ability to identify repeated instructional and design patterns across a decade of scholarship. Because it is a systematic review of published studies (rather than a single intervention), its conclusions are strongest when interpreted as “what the field emphasizes” and “what evidence trends suggest,” not as proof that any one tool causes motivation or engagement by itself.
Reflection
This review is useful for framing “intentional technology use” as a design-and-teaching decision rather than a tool-selection decision. It supports using technology to implement evidence-informed practices (e.g., interaction structures, feedback mechanisms, and active learning strategies) that plausibly strengthen motivation and engagement in online higher education. That framing aligns directly with my dissertation interest in technology-supported teaching that improves adult learner engagement and persistence.
#2
Hu, J., & Xiao, W. (2025). What are the influencing factors of online learning engagement? A systematic literature review. Frontiers in Psychology, 16, 1542652. https://doi.org/10.3389/fpsyg.2025.1542652
Summary
Hu and Xiao conducted a systematic literature review of 55 empirical studies (January 2020–July 2023) to identify factors associated with online learning engagement. They report that multiple theories are commonly used to explain engagement (including Community of Inquiry and Self-Determination Theory) and organize influencing factors into learner-related and environment-related categories. They also synthesize strategies suggested across the literature for improving engagement, including clear goals, instructor support, and designing supportive digital learning conditions.
Evaluation
The review’s value is its structured synthesis of recent empirical work and the way it aggregates “what repeatedly shows up” as engagement-related factors during the post-2020 expansion of online learning research. A limitation is that the included studies vary in how engagement is defined and measured, which constrains direct comparison and makes it harder to treat engagement as a single unified construct.
Reflection
This source strengthens my ability to justify that motivation, instructor support, interaction quality, and digital platform conditions work together; technology matters most when it is used intentionally to support psychological needs and learning behaviors. It also helps me translate broad engagement findings into practical instructor actions (e.g., goal clarity, structured interaction, and support) that can be enacted through LMS features and synchronous tools.
#3
Akpen, C. N., Asaolu, S., Atobatele, S., Okagbue, H., & Sampson, S. (2024). Impact of online learning on student’s performance and engagement: A systematic review. Discover Education, 3, 205. https://doi.org/10.1007/s44217-024-00253-0
Summary
Akpen and colleagues conducted a systematic review (PRISMA-guided) of peer-reviewed studies published from 2019 to 2024, selecting 18 studies for analysis. They report mixed effects of online learning on engagement and performance: some studies show benefits related to flexibility and accessibility, while others emphasize reduced interaction, isolation, and engagement challenges. The authors highlight that outcomes appear influenced by the quality of digital tools and internet access, as well as learner motivation and instructor–student interaction, and they emphasize interactive elements (e.g., discussion forums and multimedia) as engagement supports.
Evaluation
This review is helpful for presenting a balanced evidence narrative: online learning can improve outcomes, but engagement is not guaranteed without intentional design. Its practical strength is connecting engagement outcomes to specific implementation conditions (tool quality, interaction structures, and support). One limitation is that the included studies span varied contexts and disciplines, which can blur conclusions about which technology uses are most effective for motivation and engagement in online college courses specifically.
Reflection
This article supports a central dissertation argument I want to make: technology is not inherently motivating; motivation and engagement rise when tools are used to create interaction, clarity, and support. It also reinforces that instructor actions (enabled by technology) are critical to counteracting isolation and strengthening student persistence in online higher education.
QUESTION 2: In what ways do synchronous and asynchronous tools contribute to students’ sense of presence and connection in online learning environments?
#4
Ratan, R., Miller, W., et al. (2022). How do social presence and active learning in synchronous and asynchronous online classes relate to students’ perceived course gains? Computers & Education, 191, 104621. https://doi.org/10.1016/j.compedu.2022.104621
Summary
This study examines how online class format (synchronous vs. asynchronous) relates to social presence, active learning activities, and students’ perceived course gains. The work focuses on how different interaction conditions and instructional activities align with students’ sense of social connection and perceived learning benefits, emphasizing the role of social presence and active learning as key explanatory mechanisms.
Evaluation
A strength of this article is that it does not treat modality as the only factor; it explicitly connects modality to social presence and active learning, which are more instructionally actionable than “sync vs. async” alone. A limitation, typical of modality comparisons, is that perceived gains and presence measures can be influenced by course context and implementation quality, so modality effects should be interpreted as conditional rather than universal.
Reflection
This source helps me argue that presence and connection can be designed through pedagogy: technology is the delivery channel, but the real driver is whether tools are used for active learning and interpersonal interaction. It supports framing synchronous sessions as a presence amplifier when they are built around interaction rather than lecture replication.
#5
Presley, R. G., Cumberland, D. M., & Rose, K. (2023). A comparison of cognitive and social presence in online graduate courses: Asynchronous vs. synchronous modalities. Online Learning, 27(2), 245–264. https://olj.onlinelearningconsortium.org/index.php/olj/article/view/3046
Summary
Presley, Cumberland, and Rose report an action research study comparing two online course designs, one fully asynchronous and one that included weekly synchronous meetings, and examine cognitive presence, social presence, and student perceptions alongside learning outcomes. The study is grounded in modality differences and investigates how real-time interaction opportunities may shape the presence experience in graduate-level online learning.
Evaluation
The action research framing is a strength because it reflects realistic teaching conditions and directly compares instructional approaches that instructors commonly choose. At the same time, action research is often context-specific (course, instructor, student population), which can limit generalizability; the most defensible use is to treat findings as practice-relevant evidence that informs design decisions rather than as a universal modality rule.
Reflection
This article is directly aligned with my teaching and dissertation interests because it links modality choices to presence outcomes in a way that can inform course design. It supports designing asynchronous spaces for cognitive processing while using synchronous touchpoints strategically to strengthen social connection and teaching presence, an approach that fits well with my work in live online sessions and LMS-supported instruction.
#6
Hung, C.-T., Wu, S.-E., Chen, Y.-H., Soong, C.-Y., Chiang, C.-P., & Wang, W.-M. (2024). The evaluation of synchronous and asynchronous online learning: Student experience, learning outcomes, and cognitive load. BMC Medical Education, 24, 326. https://doi.org/10.1186/s12909-024-05311-7
Summary
Hung and colleagues compared synchronous (live Webex lectures) and asynchronous (YouTube lecture videos) formats with 170 undergraduate medical students, assessing learning outcomes (pre-, post-, and retention tests), satisfaction, and cognitive load. Both formats showed improved learning outcomes and high satisfaction, while the synchronous condition showed significantly lower cognitive load than the asynchronous condition.
Evaluation
A key strength is the inclusion of multiple outcome types (achievement, satisfaction, and cognitive load), which provides a richer view of “experience” than grades alone. A limitation is that students selected their preferred modality, which may introduce self-selection effects; nonetheless, the findings are still highly useful for design implications, particularly around cognitive load management in online instruction.
Reflection
This study is especially helpful for my dissertation framing because it adds “cognitive load” to the presence/connection conversation. It suggests that synchronous tools may reduce perceived mental load in certain contexts, which can indirectly support connection and persistence by lowering friction. It reinforces the need to intentionally design asynchronous experiences with scaffolds, structure, and interaction supports so that flexibility does not come at the cost of overload or isolation.
QUESTION 3: How can technology-supported teaching practices improve both student success and instructor effectiveness in online higher education?
#7
Banihashem, S. K., Noroozi, O., van Ginkel, S., Macfadyen, L. P., & Biemans, H. J. A. (2022). A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educational Research Review, 37, 100489. https://doi.org/10.1016/j.edurev.2022.100489
Summary
Banihashem and colleagues reviewed research on how learning analytics (LA) is implemented to improve feedback practices in technology-mediated higher education. The review maps how LA is used to support feedback processes for educators and students and synthesizes the state of practice for analytics-informed feedback approaches in higher education learning environments.
Evaluation
A major strength is the focus on feedback as the bridge between analytics and learning improvement; this keeps the discussion grounded in teaching practice rather than dashboards alone. The systematic review approach supports identifying patterns in how LA-enabled feedback is designed and used. However, LA implementations can vary widely by context, data quality, and instructor adoption, so effectiveness depends heavily on integration into instructional workflows and the clarity of the feedback action path.
Reflection
This article directly supports a dissertation argument about instructor effectiveness: when analytics are translated into actionable feedback routines, instructors can intervene earlier, target support, and improve course management. It also helps justify studying technology-supported teaching practices not just as “tools,” but as systems that change instructor decision-making and student self-regulation through feedback loops.
#8
Liu, Y., Wang, W., et al. (2025). The effectiveness of learning analytics-based interventions in enhancing students’ learning effect: A meta-analysis of empirical studies. SAGE Open. https://doi.org/10.1177/21582440251336707
Summary
Liu and colleagues conducted a meta-analysis of empirical studies on learning analytics-based interventions, quantitatively estimating the overall impact of these interventions on student learning outcomes. The study reports a moderate overall effect, indicating that analytics-driven interventions can improve learning outcomes when implemented as purposeful instructional supports rather than passive reporting.
Evaluation
A core strength of this work is that it synthesizes evidence at the intervention level and provides an effect estimate that is useful for dissertation justification and programmatic decision-making. As with most meta-analyses, heterogeneity across interventions, settings, and outcome definitions can complicate “what works best” conclusions, but the results still strongly support the general value of analytics-based intervention design.
Reflection
This meta-analysis helps me connect student success and instructor effectiveness: analytics become valuable when they drive timely, targeted instructor actions (nudges, feedback, scaffolds) that influence learner behavior and outcomes. It also supports my focus on technology integration that amplifies human teaching practices, especially in large online courses where instructors need scalable ways to identify who needs support and what kind.
#9
Cabı, E., & Türkoğlu, H. (2025). The impact of a learning analytics based feedback system on students’ academic achievement and self-regulated learning in a flipped classroom. International Review of Research in Open and Distributed Learning, 26(1), 175–196. https://doi.org/10.19173/irrodl.v26i1.7924
Summary
Cabı and Türkoğlu investigated a learning analytics-based feedback system implemented in a flipped learning environment, focusing on student academic achievement and self-regulated learning (SRL). The study positions LA as a practical teaching support that can shape learner behaviors by providing feedback aligned to engagement and performance indicators within a technology-mediated instructional model.
Evaluation
A strength of this article is its applied, practice-facing focus: it evaluates LA in a feedback system rather than treating analytics as an abstract concept. It also links outcomes to SRL, which is critical in online learning success. A likely limitation for broader dissertation generalization is that results may depend on the specific course design (flipped structure), how feedback was framed, and how consistently students used the feedback, all of which can vary across institutions and disciplines.
Reflection
This study is a strong fit for my dissertation direction because it demonstrates a concrete pathway for improving student success (achievement and SRL) while also improving instructor effectiveness (more targeted, data-informed feedback). It supports researching “technology-supported teaching practices” as feedback-enabled workflows that increase the precision and timeliness of instructional support in online higher education.