The COVID pandemic caused a number of problems for Higher Education Institutions (HEIs) worldwide, and Xi'an Jiaotong-Liverpool University (XJTLU) was no exception. There was uncertainty over whether the campus would remain open to students for any length of time, and even when it was open, some students could not return due to local government and University regulations. As computer programming teachers, we understand how vital it is for students to get sufficient hands-on practice in writing code. But with these restrictions, our ability to interact directly with our students during on-site programming Labs was severely hampered. We identified three key challenges posed by this context: overcoming the language barrier, assessing programming assignments, and ensuring student engagement. We designed HyFlex solutions to address each of these challenges. Our solutions helped improve student retention, engagement, and participation across our courses and can be adapted to various blended learning contexts.
Some students in a computer lab session, with Teaching Assistants (TAs) and module leader support. Such sessions were rarely possible during the pandemic.
XJTLU is a Transnational HEI. As such, our students come from diverse backgrounds, and for the vast majority of them, English is not their native language. This makes learning computer programming a challenging task, since the discipline comes with a very specific terminology that students must learn and master. In addition, most modern, and indeed not-so-modern, programming languages use a number of common English words and phrases in their syntax.
Common language mistakes made by students
To overcome this barrier, we maintain an online interactive glossary of technical terms with English definitions, updated weekly on the Learning Mall, our University's Virtual Learning Environment (VLE). This ensures that students are learning the language of programming at the same time as they learn the programming language. The technology used allows teachers and students to co-create and co-edit the glossary content anytime and anywhere. We also created English language activities using Moodle Quizzes and H5P Activities to test students' knowledge. One example below asks students to find words relevant to a particular lecture in a letter grid puzzle game. These activities show students that learning languages can be fun!
A letter grid where students had to find words relevant to a particular lecture
In modern computer programming courses, we teach students to write computer code on a computer. This includes teaching them to enlist the help of supportive tools such as Integrated Development Environments (IDEs) and visualizers. It therefore seems ironic that we should then assess their programming skills by asking them to write code by hand on a piece of paper. But that is what happens in a traditional paper-based exam. In such assessments, students cannot undo a mistake, erase incorrect answers, or insert a new line, all of which are easy to do on a computer. They cannot employ the automatic formatting or typo-checking functions they are used to when working on their machines. As a result, the answer papers they produce can be unintelligible and very hard to grade. We have encountered examples of students' hand-written code where we did not even know where to begin!
The whole process becomes even more complex when we have a large class, something which is very common in our University. Imagine having to grade 1200 hand-written exam papers! There could be hundreds of variations of algorithms, some partially correct, which we have to mark proportionally. We have to act as both a compiler and a grader, parsing each answer, figuring out how correct it is, and deciding how to allocate marks. Sometimes we also need to act as a calculator, adding up the points we award according to a rubric, which is itself prone to mistakes. Because of this, giving feedback on formative assessments became prohibitively time-consuming, even though routine programming exercises are essential for a beginner programmer's growth.
An example of marking, or attempting to mark, a student's hand-written code. Imagine needing to mark 1200 of these!
To overcome these challenges, we started to use a Moodle plugin called CodeRunner in the Learning Mall three years ago. This plugin automatically grades code and gives feedback based on test-case results. Students submit their code online and receive instantaneous feedback on whether it passes the test cases. According to questionnaire comments, our students felt a real sense of achievement when the system reported that they had passed all the test cases (displayed in large green blocks as below).
The student's code has passed all tests! Imagine the sense of accomplishment felt!
CodeRunner provides flexibility in many ways: it supports a variety of programming languages and grading rubrics. Another useful feature is that changes to an existing CodeRunner question's rubric are retroactively applied to student submissions. Partial grades can be awarded for passing a subset of the test cases, with no manual computation required. We also used this auto-grader for exams conducted in a computer lab.
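To make the grading model concrete, here is a minimal sketch, in Python, of how an auto-grader can award partial credit by running a submission against a list of test cases. This is an illustration of the general technique, not CodeRunner's actual implementation; the function names and the example question ("sum the even numbers in a list") are our own invention.

```python
# Illustrative sketch of test-case-based partial-credit grading.
# Not CodeRunner's internals; names here are hypothetical.

def grade_submission(func, test_cases):
    """Run `func` on each (args, expected) pair; return the fraction passed."""
    passed = 0
    for args, expected in test_cases:
        try:
            if func(*args) == expected:
                passed += 1
        except Exception:
            pass  # a crashing submission simply earns no credit for that case
    return passed / len(test_cases)

# A hypothetical student submission for "sum the even numbers in a list".
def student_answer(numbers):
    return sum(n for n in numbers if n % 2 == 0)

tests = [
    (([1, 2, 3, 4],), 6),
    (([],), 0),
    (([5, 7],), 0),
    (([2, 2, 2],), 6),
]

score = grade_submission(student_answer, tests)
print(f"Passed {int(score * len(tests))}/{len(tests)} tests; grade = {score:.0%}")
```

Because the grade is simply the fraction of test cases passed, changing the test cases (the "rubric") and re-running all submissions through the same function is what makes retroactive regrading cheap.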
When COVID happened shortly after, bringing the problems discussed above, we were so grateful that we had been prepared with this technology! We used it for online programming exams, during which students could use their usual IDEs and other supportive tools. More importantly, we used it to conduct weekly programming labs and to provide plentiful weekly coding exercises as formative assessments. Supported by the TAs in online teaching, we could still give further feedback in an online Forum on the Learning Mall after privately viewing students' code. Once the scores were published, students could view all of their feedback together with code solutions, helping them understand their performance with a much quicker turnaround.
Given the practical advantages outlined above, now that we employ blended learning in our University, we still use CodeRunner for students doing Labs in person or remotely. There is one drawback: students have become so used to doing everything online that we now need to motivate them to return to the labs!
When the pandemic hit, one of the most common concerns teachers had revolved around student engagement: were students even listening to our lectures as we talked to a computer screen? Did they understand anything that we were saying? What topics did we need to spend more time on?
Our proposed solution uses JazzQuiz ( https://moodle.org/plugins/mod_jazzquiz ), a polling technology available in our Learning Mall. We use this tool to poll students "live" during a lecture or lab session, allowing us to check their understanding of key concepts. We can also track engagement by seeing how many students respond to the polls. This is incredibly powerful when we need crucial feedback on students' progress, allowing us to make adjustments to our pace of delivery, repeat certain important points, and so on, as necessary.
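The two signals described above, understanding and engagement, can both be read off a single poll. The sketch below, in Python, shows one way to tally live responses: the answer distribution reveals misconceptions, while the response rate measures engagement. This is our own illustrative summary logic, not part of the JazzQuiz plugin; the function name and data shape are hypothetical.

```python
# Hypothetical sketch: tallying live poll responses to gauge both
# understanding (answer distribution) and engagement (response rate).
from collections import Counter

def summarise_poll(responses, enrolled, correct_answer):
    """responses maps student_id -> chosen option (e.g. 'A'..'D')."""
    counts = Counter(responses.values())
    response_rate = len(responses) / enrolled
    correct_rate = counts[correct_answer] / len(responses) if responses else 0.0
    return counts, response_rate, correct_rate

# Example: 8 of 10 enrolled students answered a multiple-choice question.
responses = {f"s{i}": ans for i, ans in enumerate("AABACAAD")}
counts, response_rate, correct_rate = summarise_poll(responses, 10, "A")
print(dict(counts))
print(f"engagement: {response_rate:.0%}, correct: {correct_rate:.0%}")
```

A low response rate suggests disengagement, while a high response rate with a low correct rate suggests the topic needs to be revisited, which is exactly the adjustment-of-pace decision described above.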
Example of a JazzQuiz question
However, there's no denying that setting up 5-10 JazzQuiz questions before each lecture is a lot of work. Overloaded teachers cannot help but ask: is it worth the time and effort? Would students appreciate it? Could it have a positive impact on the active participants' grades? To explore these questions, we conducted action research on the use of JazzQuiz, gathering feedback from students via questionnaires and surveys. For example, many students felt that JazzQuiz helped remind them to pay more attention during class: "It forces distracted students to focus during lectures", said one. International students who did not have access to on-site learning found it very helpful for increasing interactive opportunities with the teacher and peers: "I was most involved when I participated in [a] quiz but gave the wrong answer and the teacher explained it to me […] classroom interaction is very necessary". Some students considered JazzQuiz a practical route to self-actualization: "Whether or not my grades get improved is less important, at least I think I have a deeper understanding […] it is a sense of accomplishment". Others found it helpful for building knowledge: "When you are more engaged in learning, you will learn more actively because the problem you solve is your own problem".
Overall, our research showed that students who better understood the educational value of JazzQuiz were more comfortable using the technology and achieved better academic performance in their studies. For more details, please read this article:
Na Li, Erick Purwanto*, Xiaojun Zhang, Feng Cao, Kok Hoe Wong, and Xiangru Chen (2022). Understanding the perceived pedagogical value of JazzQuiz in interactive hybrid learning among university students: a technology acceptance analysis. Interactive Learning Environments. https://doi.org/10.1080/10494820.2022.2129393.