Abstract:
This case study explores the integration of generative AI (Gen-AI) tools, specifically XIPU AI, to enhance student engagement with formative feedback in two EAP courses: a Year 1 Advanced EAP course and a Year 2 Business EAP course. The intervention involved training students to craft effective prompts and structuring coursework to incorporate AI tools throughout the feedback engagement process. Despite potential drawbacks such as overreliance on Gen-AI and inaccurate suggestions, the approach showed promise in improving feedback comprehension and application. To maximize these benefits, teachers should provide guidance that ensures balanced use of AI and emphasizes independent revision and ethical considerations. The pedagogical intervention highlights the need for AI literacy and suggests strategies for responsible Gen-AI integration in academic settings.
Key Words: XIPU AI, Feedback Engagement, EAP
Introduction
Formative assessments are designed to measure learning and provide feedback throughout a course to foster growth and improvement. According to Biggs (2003), this type of feedback helps students understand their grasp of course concepts, identify gaps in their knowledge and learn ways to enhance and deepen their understanding. It is a vital component of the learning process. However, effective student engagement with formative feedback remains a significant challenge (Lee, 2008; Winstone et al., 2017). Many students fail to act on feedback, often repeating the same mistakes in subsequent tasks. Even when feedback is comprehended, students may not consistently apply it across different assignments.
Key barriers that prevent students from engaging effectively with feedback include their learning goals and beliefs, language proficiency and feedback literacy. According to Han (2017) and Han and Hyland (2015), students' goals and beliefs about learning can affect how they perceive and use feedback; for example, a student focused on grades might overlook constructive criticism aimed at improving their skills. Zhang and Hyland (2018) highlight that students with lower language proficiency may struggle to understand the nuances of feedback, especially if it includes complex language structures or unfamiliar terminology. Additionally, Han and Xu (2021) note that many students have difficulty decoding feedback and lack knowledge of effective strategies for using it, which includes understanding specific comments and knowing how to implement the suggestions in their work.
Addressing these barriers is essential to foster deeper learning and improve writing outcomes. In this context, integrating generative AI (Gen-AI) tools into the revision process offers promising solutions (İpek et al., 2023). Specifically, tools like XIPU AI can simplify complex feedback, enhance understanding and promote actionable engagement in the following ways:
Example-based learning: Gen-AI can create custom examples and explanations tailored to each student’s needs, helping them better understand feedback.
Translation and paraphrasing: For students who struggle with decoding feedback or with lower linguistic proficiency, Gen-AI can translate and paraphrase teachers’ comments into simpler language, making them easier to comprehend.
Generating complex and deep answers: Gen-AI can provide sophisticated answers and explanations, helping students grasp difficult writing concepts and apply feedback effectively.
Grading and assessment: Gen-AI can assist in the grading process, helping students reevaluate their work and allowing them to track their writing progress.
Recognizing these advantages, this case study explores the integration of Gen-AI tools, specifically XIPU AI, as a means to address barriers in feedback engagement within EAP modules. By embedding Gen-AI into the feedback process, we aim to enhance students’ understanding and application of feedback while fostering critical thinking and writing skills. This study highlights best practices for incorporating Gen-AI into academic curricula through a structured approach that includes training students to craft effective prompts and utilize Gen-AI for interactive feedback engagement. The following sections detail the process, outcomes, and reflections from implementing this approach in two EAP modules: a Year 1 Advanced course and a Year 2 Business course, offering practical insights for educators seeking to optimize feedback practices with Gen-AI.
Integrating Gen-AI into EAP Coursework: The Process and Practice
To explore the potential of Gen-AI in enhancing student engagement with feedback, we implemented XIPU AI in two EAP modules: a Year 1 Advanced course and a Year 2 Business course. Both cohorts consisted of students at a B2+ level on the CEFR scale. The Year 1 students collaborated in groups of five to produce a 1,000-word discursive essay on workplace dress codes, while the Year 2 students worked individually on a 300-word argumentative essay about leadership styles.
To maximize the benefits of Gen-AI, we designed a structured approach that incorporated AI into three main stages of the feedback process:
Step 1: Before Class – Feedback Checklist
Before the in-class activities, students received formative feedback on their writing through a detailed checklist. This checklist, aligned with the marking criteria, highlights areas where improvement is needed, such as structure, language or argument development, without providing explicit solutions. For instance, the checklist may include items such as:
‘Is there a brief outline given in the thesis statement?’
‘Does the author mostly follow the academic style of writing?’
Below is a part of the sample checklist used for the discursive essay in Y1 EAP.
This feedback is designed to encourage students to reflect on their writing and identify areas to address during the revision process. By providing this feedback ahead of time, students can come to class prepared to actively engage with Gen-AI tools to address their specific challenges.
Step 2: In Class – AI-Assisted Revision
During the class session, students used XIPU AI to interpret the formative feedback provided in Step 1 and make revisions to their writing. The following elements support this process:
• Basic Guidance on Using XIPU AI:
Students were provided with a short introduction on how to interact with the AI, including basic instructions such as logging in, navigating the interface and understanding its primary features. This ensures they all have access to XIPU AI and understand how to communicate their needs to the tool.
• Crafting Effective Prompts:
The quality and relevance of AI-generated responses depend heavily on how prompts are designed. To ensure students could engage effectively with XIPU AI, we provided training on crafting structured and precise prompts. An incremental, dialogic approach is recommended: rather than posing a single complex question in a lengthy prompt, it is more effective to start with simple questions and gradually increase their complexity (Bager, 2023). The prompt structure should follow the scheme proposed by Coursera (2024), which includes the components below (see the illustrative sketch after the list):
• Assigning a role to the AI (defining who or what is being simulated),
• Defining the goal of the prompt,
• Providing instructions or assistance (outlining what to do and in what order),
• Specifying application examples or content (detailing exactly what it concerns),
• Including useful details (such as secondary conditions or restrictions to consider),
• Identifying the recipient of the text (clarifying for whom the text is written).
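To make the scheme concrete, the following is a minimal, purely illustrative sketch in Python; it is an assumption of this write-up rather than part of the classroom activity or of XIPU AI itself. Students interact with XIPU AI through its chat interface, so the sketch does nothing more than assemble the six components into a single prompt of the kind shown in the worked example below.

# Illustrative only: assembling the six Coursera (2024) prompt components
# (role, goal, instructions, content, details, recipient) into one prompt.
# The function name and wording are assumptions for demonstration;
# no call is made to XIPU AI, which is used via its chat interface.
def build_prompt(role, goal, instructions, content, details, recipient):
    """Combine the six prompt components into a single prompt string."""
    return (
        f"I'm {role}. "                       # role/context set up at the start
        f"{goal} "                            # goal of the prompt
        f"{instructions} "                    # what to do and in what order
        f"{details} "                         # secondary conditions or restrictions
        f"This text is for {recipient}.\n\n"  # intended recipient of the writing
        f"{content}"                          # the material the AI should work on
    )

prompt = build_prompt(
    role="a Year 1 college student working on a discursive essay on workplace dress codes",
    goal="Can you review the following paragraph?",
    instructions="Identify informal expressions and offer suggestions for formal alternatives.",
    details="Please keep all citations intact and focus only on language adjustments.",
    recipient="my EAP teacher to review",
    content="[copy and paste the paragraph here]",
)
print(prompt)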
For example, in our Year 1 EAP module, for teacher feedback such as ‘some of the expressions in the essay are not academic’, we taught students to write a similarly structured prompt:
“I’m a Year 1 college student working on a discursive essay on workplace dress codes. Can you review the following paragraph, identify informal expressions and offer suggestions for formal alternatives? Please keep all citations intact and focus only on language adjustments. This essay is for my EAP teacher to review. [copy and paste the paragraph]”
Below is a screenshot demonstrating the application of the prompt above using XIPU AI.
Although students were encouraged to generate their own queries based on their specific needs, we also provided a set of sample prompts as examples to guide students in refining specific aspects of their writing. Detailed sample prompts were paired with corresponding checklist items. The following screenshot shows some sample prompts for the Y2 checklist items.
• Comprehending Feedback and Hands-On Revision:
Students used XIPU AI to comprehend the teacher’s feedback and revised their work using the suggestions provided by Gen-AI, engaging in a process of trial and reflection to determine which changes were appropriate. They were encouraged to critically evaluate the AI’s recommendations rather than accept them uncritically.
The following screenshots show the application of XIPU AI in helping Y2 EAP students better comprehend teacher feedback on their argumentative essays and implement changes from different perspectives.
The use of XIPU AI in understanding certain terms:
The use of XIPU AI in providing detailed examples:
The use of XIPU AI in correcting grammar errors:
Step 3: In Class – Reflection and Feedback
The final stage of the process involves a critical reflection session and teacher-led general feedback:
Critical Reflection on Gen-AI Use:
Students participated in a guided discussion to reflect on their experience with XIPU AI. Key reflection questions might include:
• “Which AI suggestions were most helpful, and why?”
• “Which AI-generated suggestions did you reject or accept, and why?”
• “What challenges did you face when using XIPU AI, and how did you overcome them?”
This reflection helps students evaluate the strengths and limitations of AI tools, fostering a critical perspective on their use in academic work.
Teacher Feedback Presentation:
Finally, the teacher provided general feedback on the class’s overall performance, addressing common issues observed during the revision process. By presenting this feedback in a collective format, the teacher ensures that all students can benefit from insights beyond their individual work.
Challenges and Possible Solutions
While Gen-AI tools like XIPU AI have great potential to enhance student engagement with feedback, their integration into academic writing contexts is not without challenges. These challenges could hinder the effectiveness of AI in supporting learning if left unaddressed. Below, we share common difficulties students faced when using Gen-AI to engage with teacher feedback, along with potential strategies to mitigate these issues.
1. Limited Familiarity with Gen-AI Tools
Some students were unfamiliar with using generative AI tools, including XIPU AI, and found the technology intimidating or overwhelming. This lack of familiarity can prevent them from fully leveraging AI for feedback engagement and revision.
Proposed Solutions:
Integrate AI Literacy into Regular Teaching: With AI playing an increasingly important role in academic and professional settings, module teams can consider integrating it more fully into curriculum design. For example, using Gen-AI tools to support feedback engagement, as discussed in this context, allows students to interact with feedback more effectively and develop a deeper understanding of academic writing. By embedding AI-enhanced activities, students can gain a better understanding of the functions and potential applications of AI in academic settings and use it effectively to enhance their learning.
Provide Basic Guidance: Before assigning tasks that involve XIPU AI, provide students with clear, step-by-step instructions on how to use the tool. This may include basic instructions such as logging in, navigating the interface and understanding its primary features.
Demonstrate Through Examples: Teachers can showcase practical examples of how to interact with AI, including crafting effective prompts and interpreting AI-generated feedback.
2. Technical Issues
A few students encountered technical problems, such as slow response times, connection issues or platform outages, which disrupted their workflow and caused frustration.
Proposed Solutions:
Backup Platforms: Ensure that alternative AI tools or offline activities are available to mitigate the impact of technical issues. For instance, if XIPU AI is unavailable, students can use other platforms such as Kimi or Doubao, or engage in manual peer feedback.
Technical Support: Provide a troubleshooting guide or access to technical support to help students resolve minor issues independently.
3. Difficulty in Crafting Precise Prompts
Effective interaction with Gen-AI relies on students’ ability to ask clear and specific questions. Many students struggled with formulating precise prompts, which can result in irrelevant or unhelpful responses from the AI.
Proposed Solutions:
Provide Sample Prompts: Offer a set of well-constructed example prompts that address common feedback issues.
Teach Prompting Strategies: Encourage students to break down complex queries into simpler, incremental steps. For example, instead of asking for a complete revision, they can start by requesting feedback on a specific sentence or paragraph.
Task Context in Prompts: Train students to include contextual details in their queries, such as the essay topic or audience, to guide the AI’s responses effectively.
4. Challenges in Evaluating AI-Generated Responses
Not all AI-generated suggestions are accurate or relevant. Students may struggle to determine the validity of the feedback, particularly if it contradicts teacher instructions or course materials.
Proposed Solutions:
Critical Reflection Sessions: Include class activities where students critically analyze AI-generated suggestions, comparing them with lesson materials and teacher feedback. This encourages students to approach AI output with a critical mindset. Students can also be encouraged to evaluate how they used Gen-AI in their writing tasks, considering questions such as: “What AI suggestions were most helpful?” or “Were there any suggestions you chose not to apply, and why?” These reflections not only promote critical thinking but also help students recognize the limitations of AI tools. By reinforcing the need for balance and independence, educators can ensure that students develop confidence in their own abilities while benefiting from the capabilities of Gen-AI.
General Feedback from Teachers: After students use AI tools, teachers can provide overarching feedback in presentation form, addressing common pitfalls in AI suggestions and reinforcing key learning objectives.
Consultation of External Resources: Encourage students to cross-check AI-generated responses with trusted resources, such as textbooks, lesson slides or teacher-provided guides. For instance, in the Year 2 EAP module, students were required to follow the XJTLU Harvard Reference System for citation and referencing. However, AI tools often adhere to the general Harvard system and may not provide institution-specific guidance. In such cases, students should be directed to consult the XJTLU Harvard Reference Guide to ensure compliance with university-specific requirements.
Broader Concerns and Ethical Considerations
Beyond these operational challenges, broader concerns such as overreliance on AI, academic integrity and fairness in education were also raised.
One significant concern is the potential for students to become overdependent on AI-generated answers. While AI tools can enhance the revision process, there is a risk that students may rely too heavily on these suggestions, losing the ability to write independently. To mitigate this, teachers can adopt a balanced approach, asking students to first attempt revisions independently and then consult AI, so that AI feedback serves as a supplementary tool rather than a primary resource. Another concern is that not all suggestions offered by AI are correct. Teachers can therefore advise students to evaluate AI output by consulting teachers or other external resources, such as lesson slides or workbooks, especially when the AI’s answers contradict what teachers instruct or what the marking descriptors indicate.
Academic integrity is another important concern when integrating AI into academic work. Using AI tools raises questions about originality and authorship, particularly if students rely on AI-generated content for their assignments. In addition, AI tools can provide plausible but sometimes inaccurate or incomplete information, which students may unknowingly present as their own work, leading to ethical dilemmas. To uphold academic integrity, teachers need to establish clear guidelines on the responsible use of AI, emphasizing the importance of maintaining originality in students’ work. Reflective practices, such as asking students to evaluate how they used AI in their writing tasks, can further encourage critical engagement with the technology, helping students recognize its limitations while building confidence in their independent abilities. To safeguard academic integrity, it is also recommended that teachers ask students to disclose their use of Gen-AI when submitting their essays.
Another critical concern is the issue of fairness and equity in the use of AI tools. Not all students have equal access to AI technology or possess the same proficiency level, creating disparities in academic outcomes. Students with greater AI literacy or better access to technology may gain a significant advantage while others struggle to keep up. To address this, educators can provide training sessions to improve AI literacy and ensure all students are equipped to use these tools effectively. Additionally, integrating AI literacy into the curriculum and establishing clear guidelines for ethical AI use can promote responsible and equitable application.
Conclusion
Gen-AI tools like XIPU AI offer significant potential for enhancing student engagement with formative feedback. By integrating AI-assisted activities into the feedback process, educators can address persistent barriers in feedback literacy, improve students’ understanding and application of teacher feedback, and foster a stronger sense of ownership over their learning. However, the successful adoption of these tools requires addressing key challenges, including unfamiliarity with Gen-AI, difficulties in crafting precise prompts, technical limitations and variability in the quality of AI-generated responses. A structured approach that incorporates AI literacy, practical training and guidance on ethical and effective use can mitigate these issues. By addressing these challenges and promoting the ethical and thoughtful use of Gen-AI, educators can create an environment where technology enhances academic development while preserving the importance of critical thinking and independent problem-solving. This balance ensures that AI serves as a tool to empower students, helping them grow into reflective, confident and autonomous learners.
References
Bager, J. (2023) ‘Instruieren und verifizieren: Tipps und Tools, mit denen Sie Sprachmodelle produktiv nutzen’ [Instructing and verifying: tips and tools for using language models productively], c’t, 21, pp.26-19.
Biggs, J. (2003) Teaching for quality learning at university. Maidenhead: Open University Press.
Coursera (2024) How To Write ChatGPT Prompts: Your 2024 Guide. Available at: https://www.coursera.org/articles/how-to-write-chatgpt-prompts [Accessed 6 June 2024].
Han, Y. (2017) ‘Mediating and being mediated: Learner beliefs and learner engagement with written corrective feedback’, System, 69, pp.133-142.
Han, Y. and Hyland, F. (2015) ‘Exploring learner engagement with written corrective feedback in a Chinese tertiary EFL classroom’, Journal of Second Language Writing, 30, pp.31-44.
Han, Y. and Xu, Y. (2021) ‘Student feedback literacy and engagement with feedback: A case study of Chinese undergraduate students’, Teaching in Higher Education, 26(2), pp.181-196.
İpek, Z.H., Gözüm, A.İ.C., Papadakis, S. and Kalogiannakis, M. (2023) ‘Educational Applications of the ChatGPT AI System: A Systematic Review Research’, Educational Process: International Journal, 12(3), pp.26-55.
Lee, I. (2008) ‘Ten mismatches between teachers’ beliefs and written feedback practice’, ELT Journal, 63(1), pp.13-22. doi:10.1093/elt/ccn010.
Winstone, N.E., Nash, R.A., Parker, M. and Rowntree, J. (2017) ‘It’d Be Useful, but I Wouldn’t Use It: Barriers to University Students’ Feedback Seeking and Recipience’, Studies in Higher Education, 42(11), pp.2026-2041. doi:10.1080/03075079.2015.1130032.
Zhang, Z. and Hyland, K. (2018) ‘Student engagement with teacher and automated feedback on L2 writing’, Assessing Writing, 36, pp.90-102.