Application of Competitive Activities to Improve Students’ Participation
Virginia Francisco, Pablo Moreno-Ger, and Raquel Hervás
Abstract—Making students become intrinsically motivated to
participate in daily class activities is an open challenge that has
been addressed in different ways. In this paper, we evaluate the
impact of an educational innovation project, named TrivialCV, in
terms of student motivation, engagement, and learning outcomes.
We analyze the impact of two types of activities: multiplayer
team-based competitions created for delivery during a live class
session, and single player competitions created to be played
asynchronously from home. We deployed these activities in two
different computer engineering courses (programming fundamentals and operating systems) and used questionnaires, grades,
and activity tracking in the virtual campus to evaluate their
impact. After the analysis, we can assert that the use of TrivialCV
activities was useful for providing additional motivation to the
students and improving their engagement with the courses in
which they were deployed. TrivialCV activities were very well
received by the students, who considered them fun, engaging,
and useful.
Index Terms—Competitive and collaborative learning experiences, engagement, participation, learning outcomes, motivation,
single player activities, team-based competition activities.
I. INTRODUCTION
The interest in new teaching methodologies that focus on
improving students’ individual work and class participation
has increased in the framework of the European Higher
Education Area. The main objective is to move away from the
“master class” model, with unidirectional instruction followed
by a final examination, to models where students must actively
participate in class and perform individual activities [1].
However, increasing class participation and capturing student attention is not easy since students tend to be passive
agents in the classroom. There are various ways of involving
students, ranging from punitive approaches (students either
take part in or do not pass the course) to reward-based
approaches (positive points are given if students are involved
in in-class activities). Making students become intrinsically
motivated to participate in daily class activities is an open
Manuscript received November 04, 2020; revised August 13, 2021 and
October XX, 2021; accepted TBD. Date of publication TBD; date of current
version October XX, 2021. This work was partly supported by Universidad
Complutense de Madrid through its Innovation and Improvement of Teaching
Quality Projects (PIMCD) program under Grants PIMCD-2011-161 and
PIMCD-2012-72. (Corresponding author: Pablo Moreno-Ger.)
P. Moreno-Ger was with the Department of Software Engineering and Artificial Intelligence, Universidad Complutense de Madrid, 28040 Madrid, Spain.
He is now with the School of Engineering and Technology, Universidad Internacional de La Rioja, 26006 Logroño, Spain (e-mail:
[email protected]).
V. Francisco and R. Hervás are with the Department of Software Engineering and Artificial Intelligence and with the Instituto de Tecnología del
Conocimiento (ITC), Universidad Complutense de Madrid, 28040 Madrid,
Spain (e-mail: [email protected]; [email protected]).
challenge which has been addressed by different authors in
the academic field [2]–[4].
Among the approaches to improving education, there is a
growing interest in the possibility of using video games for
academic purposes. Since the turn of the century, different
authors have consistently defended this approach. Their arguments are based on the importance of using activities that
motivate people and the benefits provided by games regarding
engagement, participation, and appropriation [5]–[14].
A. Harnessing Games in the Classroom
When considering what kind of elements make video games
so attractive and motivating, different authors identify some
of the characteristics that make them especially engaging for
players [15]. Two of the most commonly referenced features
are collaboration and competition [16]–[19]. Therefore, a good
approach is to use activities based on digital games (or activities which incorporate some kind of gamified experience). This
could stimulate collaboration among students and make use of
students’ competitive nature to increase their motivation.
In fact, the education literature frequently cites collaboration and competition as ways to increase intrinsic motivation among students. To make the most of these approaches,
many authors consider the possibility of organizing structured
competitive and collaborative activities in the classroom [20]–
[22], as well as the addition of these activities in the evaluation
process [16], in a positive light. In particular, competitive
and/or collaborative mechanics designed to be played among
peers in a classroom environment have demonstrated significant effects in terms of motivation and engagement [23]–[27].
They can also add value in the form of social interaction,
an element which is often missing in game-based learning
approaches [28], [29].
In practice, however, organizing collaborative or competitive
activities using games or other tools is often in conflict with the
reality of everyday teaching activity. These approaches always
demand a significant amount of extra work for instructors, who
tend to already be overloaded [30].
Fortunately, the actual workload in delivering these activities can be alleviated in digital learning scenarios. In past
years, universities have adopted models of blended learning
[31], which combine regular face-to-face classroom activity
with the use of virtual learning environments (VLEs), such as
Moodle, Sakai, or BlackBoard. Usually, this adoption is not
merely the implementation of an isolated system; rather, these e-learning systems are integrated into the universities’ essential
services, thereby creating so-called virtual campuses (VCs).
In this way, it is common to use these tools for communicating with students, collecting work, or assigning grades.
Therefore, it is possible to use these digital platforms to
simplify the management and delivery of different forms of
engaging activities, as well as for storing and processing
grades and other learning outcomes obtained through these
activities.
This idea is in line with the interest that comes from using
a VLE as a connection between regular classroom activity and
other activities supported by information technologies, such as
the use of virtual worlds [32] or educational video games [33].
In fact, these e-learning environments are among the most
affordable platforms for deploying these types of activities.
We have joined these three notions (the interest in gamified competitive or collaborative activities, the challenge of
integrating them in the classroom, and the availability of
VLEs) in the development of an educational innovation project
named TrivialCV (CV being the Spanish abbreviation for campus
virtual, “virtual campus”). The idea behind this project is
to increase participation and engagement using simple gamification principles. To do so, we intend to help teachers
prepare competitive and collaborative learning experiences
in which students are asked questions based on the course
syllabus while reducing the teachers’ workload as much as
possible.
B. The TrivialCV Project
TrivialCV is designed based on the mechanics of the popular
Trivial Pursuit game, where students answer preformulated
questions. As in Trivial Pursuit, some elements of the game are
due to chance. For example, questions are selected randomly
from a group of categories. This kind of Trivial Pursuit design
offers the advantages of being easy for students to understand
and, at the same time, the excitement of competitive mechanics
in a game [34].
The TrivialCV system, however, is not merely limited
to showing questions and answers. It is also designed to
facilitate the instructors’ task by providing a visual editor to
create question groups and, most importantly, to be integrated
with a virtual learning environment to manage student lists,
team creation and the storage and management of learning
outcomes.
The project proposes, in turn, two types of activities: multiplayer team-based competitions created for delivery during
a live class session and single-player competitions created
to be played asynchronously from home. As we describe in
detail, each tool serves a different purpose when deployed in
a specific course. In short, the multiplayer version of TrivialCV serves to encourage class participation, while the single-player
version encourages individual work in complementary
activities.
C. Research Questions
In this work, we present our analysis of the impact of
both approaches in terms of student motivation, participation
and engagement, and learning outcomes. We have focused
on students enrolled in a computer engineering program, an
often-targeted audience for serious game approaches [35]. In
particular, we deploy the activities in two specific courses: a
first-year course on programming fundamentals (PF) delivered
in the first semester and a more advanced course on operating
systems (OS) delivered in the fifth semester.
Both courses have a strong technical part, which is taught
and graded using practical programming assignments. However, both courses also incorporate a share of theoretical
content. This is especially the case in the operating systems
course, which is graded using a two-part examination which
includes short questions about theoretical concepts and open-ended questions about practical solutions. These courses are
nontrivial and are chosen as a vehicle for answering the
following three research questions:
RQ1. Are these tools effective in motivating students? This
research question studies whether these games and
their competitive/collaborative approaches are effective in motivating students by providing an attractive
experience. Are they fun to play? Do students spend
time improving their results? Do they recommend
that their fellow students participate?
RQ2. Does the application of these tools result in increased
participation and engagement? We approach engagement as a measure of whether the introduction of
these experiences results in better and more active
student participation in the day-to-day activities of
the course. Typical students tend to decrease their
class participation as the course progresses and the
examination period approaches; they tend to postpone participation in these activities in favor of preparing for
the final examination, an attitude which goes against
the principles of progressive skills development.
RQ3. Is there an increase in student performance after
using these tools? Higher motivation and better student engagement are important goals on their own,
although it is desirable for educational intervention
to result in a better transfer of knowledge. To answer
this research question, we take a threefold approach.
The four main notions covered by these research questions
form a cycle (see Fig. 1), in which providing fun and attractive
activities should result in higher motivation. In turn, this motivation should result in higher engagement with the content,
which is manifested as a higher level of student participation in
day-to-day activities. This higher participation should result in
increased performance, which in turn can boost motivation in
students (even if it is only a perceived increase in motivation).
The rest of this article is structured as follows: In Section
II we describe the developed gamified tools, as well as how
we structure different teaching and research activities, aimed
at answering our three research questions. In Section III, we
present the results of the different activities and then discuss
their implications in Section IV. Finally, in Section V, we
summarize our main conclusions and present some further
steps in this line of work.
Fig. 1. Relationship among the proposed research questions.
II. METHODS
We established the aforementioned research questions according to typical assessment approaches for serious games
[36], with a special focus on the least intrusive methods, especially after-action questionnaires and indirect measurements
such as time spent interacting with the games, comparison of
activity in the virtual campus or comparison of actual grades.
Aiming for simplicity and low-impact interventions, we
opted for direct questions as opposed to using formal evaluation instruments, an approach which is consistent with similar
case studies in the literature [36]–[38].
A. Materials
To explore the effect that competitive activities can have on
student participation and motivation, a multiplayer team-based
competition using TrivialCV was carried out in three scenarios:
a laboratory session in the programming fundamentals (PF)
course and two theoretical sessions in the operating systems
(OS) course. Students were also provided with a single-player
version of the game to prepare for their examinations at home.
After the competitions and after using the single-player version
of TrivialCV, the students answered different questionnaires.
1) TrivialCV: The TrivialCV tools were developed as part
of a series of teaching innovation projects, funded by the
Universidad Complutense de Madrid (UCM), aimed
at exploring new approaches to engage students and improve
their day-to-day involvement as opposed to focusing on a final
examination. Two different tools were created (see Fig. 2).
• TrivialCV multiplayer version. In this version, students
are divided into four teams and receive points for correct
answers to preformulated questions. This version was
designed to be used during a classroom session, streamlining the preparation, management, and development of
competitive activities in the classroom. In addition to this,
the instructors’ work is facilitated through integration
with an existing virtual campus. The system downloads
lists of students from the virtual campus, prepares groups,
manages competitions and, finally, uploads the results
into the same virtual campus as an additional activity
to be included in evaluations.
• TrivialCV single-player version. In this version, the player
competes against the stopwatch instead of another team
of students. Instead of being given a specific time limit
for answering each question, the player has 3 minutes
to answer as many questions as possible. As the player
answers questions, bonuses (for correct answers) or penalties (for incorrect ones) are added to the time counter.
The initial time, bonuses, and penalties
are tuned for a total game time of approximately 10
minutes. The students are expected to play at home, at
their own pace, as many times as they wish. In addition to
this, the tool can publish rankings on the virtual campus
using the scores obtained by all students, tapping into the
competitive nature of some students.
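The time-bank mechanic described above can be sketched as follows. This is a minimal illustration, not the actual TrivialCV implementation: the 3-minute budget comes from the text, while the specific bonus and penalty values and the `play_session` helper are hypothetical.

```python
# Minimal sketch of the single-player time-bank mechanic.
# The 3-minute budget is taken from the text; BONUS and PENALTY
# values are assumed for illustration only.

INITIAL_TIME = 180  # seconds (3 minutes, per the paper)
BONUS = 10          # seconds gained per correct answer (assumed)
PENALTY = 15        # seconds lost per incorrect answer (assumed)

def play_session(events):
    """Simulate one session. `events` is a list of (seconds_spent, correct)
    tuples, one per question attempted. Returns (score, time_left)."""
    time_left = INITIAL_TIME
    score = 0
    for spent, correct in events:
        if time_left <= spent:
            break  # the session ends when the time bank runs out
        time_left -= spent
        if correct:
            time_left += BONUS
            score += 1
        else:
            time_left -= PENALTY
    return score, time_left
```

For example, a player who answers one question correctly in 20 s and then misses one after 30 s ends with a time bank of 180 − 20 + 10 − 30 − 15 = 125 s and a score of 1.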
The tools were also created with a special focus on reducing the effort required from instructors, who are already overloaded by the
new methodologies required in the European Higher Education
Area. TrivialCV’s questions editor, which generates question
sets compatible with both tools, is available to the teachers
for these courses. For these experiments, they created two
question databases to be displayed in TrivialCV, one per course
(programming fundamentals and operating systems).
2) Data collection: The collection of data was aligned with
the evaluation of each specific activity, rather than having separate interventions for each research question. The three main
approaches to data collection were the use of questionnaires,
the statistical analysis of access logs to the virtual campus, and
the analysis of participation and performance results during the
final examinations of these courses.
Two questionnaires were designed by the researchers: one
Fig. 2. Screenshots of the TrivialCV tools. (a) Multiplayer version for in-class team competitions. (b) Single-player version designed to compete against the
clock.
for the in-class multiplayer team competition and the other for
the single player competition.
In the questionnaire for the in-class multiplayer team competition, students were asked about their perceptions of the
competition, the tools and their overall experience. The questions, detailed in Table I, all used a 5-point Likert-scale format,
where 1 meant “strongly disagree” and 5 meant “strongly
agree.” In addition to this, information was collected on
whether or not the team to which the student belonged was the
winner of the competition. A blank space was also provided
for students to include additional suggestions for improvement.
In the questionnaire for the single player competition,
students were asked about their perceptions and usage of the
tool. This questionnaire is shown in Table II. Students rated
most questions in a 5-point Likert-scale format, where 1 meant
“strongly disagree” and 5 meant “strongly agree.” Exceptions
were questions Q2.24 and Q2.25, which were answered in
a true-false format. In addition to this, information about the
time the student was using the tool was collected (Q2.20), and
a blank space was available for students to provide additional
feedback or suggestions for improvement.
Students filled out both questionnaires anonymously.
B. Participants
Participants were students in the programming fundamentals
(PF) and operating systems (OS) courses from the computer
engineering program at Complutense University of Madrid.
Participation in the experiments was voluntary, and students
provided their informed consent.
Regarding the in-class multiplayer team competition, the PF
session was attended by 35 students (7 females and 28 males)
ranging in age from 18 to 23 years, and the OS session was
attended by 25 students (5 females and 20 males) ranging in
age from 20 to 32 years old. Participation in PF was above
the average class attendance, which was usually between 20
and 30 students, while in OS, it was in line with regular class
attendance.
In the single player competition, which was only available
in the OS course, 33 students participated (8 females and 25
males), ranging in age from 20 to 32 years old.
Given that the experiments were conducted during computer
engineering courses, all participants were university students
with an above-average proficiency in using computer systems.
C. Procedure
In one academic year, we performed different teaching and
research activities around the two versions of the TrivialCV
tool. These activities were scheduled and designed prioritizing
a coherent approach for students and teachers, focusing on
their learning experience over rigid experimental restrictions.
Therefore, rather than setting up different experiments for
each research question, we leveraged each teaching activity
to collect data potentially pertaining to different research
questions. More specifically, we performed two different types
of activities.
1) In-class multiplayer team competitions: Two in-class
team competitions with the multiplayer version were organized: one with the students of the programming fundamentals
(PF) course in the last week of the first semester (this is a
two-semester course) and another with the students of the
operating systems (OS) course, halfway through it.
At the beginning of these sessions, students accessed the
laboratory and were asked to organize themselves in teams of
4 to 8 students. Before starting, they were informed of the
experimental nature of the session and that at the end of the
experience, they would be asked for their opinion through an
anonymous questionnaire.
The game session lasted approximately an hour, and after
finishing the competition but before leaving the laboratory, all
students completed the questionnaire for the in-class multiplayer team competition shown in Section II-A2.
2) Single-player competitions: After the team competition
and until the day of the examination, the students in the operating systems course could access the single player version
deployed in the virtual campus of the UCM. It was configured
so that students could play as many times as they wanted, and
the virtual campus would keep the highest score obtained so
far. In addition to this, a public ranking was regularly updated
in the virtual campus with the scores obtained by all students
TABLE I
CONTENT OF QUESTIONNAIRE 1, WHICH WAS COMPLETED BY STUDENTS CONCERNING THE MULTIPLAYER VERSION OF TRIVIALCV

         Question                                                          RQ
Q1.1     The activity was fun                                              RQ1
Q1.2     The activity was more fun than a problem-solving class            RQ1
Q1.3     I liked the format of the activity (team competition)             RQ1
Q1.4     I prefer individual competitions                                  RQ1*
Q1.5     I prefer noncompetitive activities                                RQ1*
Q1.6     I liked the mechanics of the game                                 RQ1*
Q1.7     I liked the appearance of the game                                RQ1*
Q1.8     I would prefer a more colorful appearance                         RQ1*
Q1.9     I would prefer a more serious appearance                          RQ1*
Q1.10    The activity was more useful than a problem-solving class         RQ2
Q1.11    This activity should be done in future editions of this course    RQ2
Q1.12    I would like to do this activity again during this course         RQ2
Q1.13    I would like to do this activity in other courses                 RQ2
Q1.14    The activity was effective as a review exercise                   RQ3

The RQ column indicates the research question associated with each questionnaire question. The mark * designates specific questions about game design and usability.
to explore their competitive nature. In this stage, statistical data
on access were collected. In addition to this, a message board
was opened on the virtual campus to discuss the problems and
limitations of the tool with the students.
Students were informed that it was a voluntary activity and
that it would have no impact on their final grade. Even so,
many students used the application and participated very actively in the message board, providing comments, suggestions,
and bug reports.

TABLE II
CONTENT OF QUESTIONNAIRE 2, WHICH WAS COMPLETED BY STUDENTS CONCERNING THE SINGLE-PLAYER VERSION OF TRIVIALCV

         Question                                                                         RQ
Q2.1     The activity was fun                                                             RQ1
Q2.2     The activity was more fun than a theory review class                             RQ1
Q2.3     The activity was more fun than an hour of theory study                           RQ1
Q2.4     The activity was more fun than a team competition in class                       RQ1
Q2.5     The tool was easy to use                                                         RQ1*
Q2.6     I liked the format of the activity (individual competition with public ranking)  RQ1*
Q2.7     I prefer team competitions                                                       RQ1*
Q2.8     I prefer private rankings                                                        RQ1*
Q2.9     I liked the mechanics of the game                                                RQ1*
Q2.10    I liked the appearance of the game                                               RQ1*
Q2.11    I would prefer a more colorful appearance                                        RQ1*
Q2.12    I would prefer a more serious appearance                                         RQ1*
Q2.13    I would prefer to play only the multiplayer version                              RQ1*
Q2.14    I would prefer to play only the single player version                            RQ1*
Q2.15    I would prefer that both versions of the tool were offered                       RQ1*
Q2.16    This activity should be done in future editions of this course                   RQ2
Q2.17    I would like to do this activity in other courses                                RQ2
Q2.18    The activity was more effective than an hour of theory study                     RQ2
Q2.19    I have studied more than I would have studied without the tool                   RQ2
Q2.20    How long do you think you have been playing with the single player version
         in the virtual campus?                                                           RQ2
Q2.21    The activity was effective as a review exercise                                  RQ3
Q2.22    The activity was more effective than a theory review class                       RQ3
Q2.23    The activity was more effective than a team competition in class                 RQ3
Q2.24    Has the tool helped you study?                                                   RQ3
Q2.25    Has the tool helped you refresh your knowledge?                                  RQ3

The RQ column indicates the research question associated with each questionnaire question. The mark * designates specific questions about game design and usability.
Immediately before the examinations, students who had
used the single player tool were asked to complete the questionnaire for the single player competition shown in Section
II-A2.
D. Data Analysis
All the data collected from the questionnaires were analyzed, calculating for each Likert-scale question the mean, the
standard deviation, the median, and the quartiles, together with
the frequency of each possible Likert value.
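The per-question summary just described can be computed with Python's standard library alone. The sketch below is illustrative, not the authors' analysis code, and assumes responses coded 1–5:

```python
import statistics

def likert_summary(responses):
    """Summarize one Likert-scale question: mean, sample standard
    deviation, median, quartiles, and frequency of each value 1-5."""
    # "inclusive" treats the data as the whole population of responses,
    # interpolating between observed values for the quartiles
    q1, med, q3 = statistics.quantiles(responses, n=4, method="inclusive")
    return {
        "n": len(responses),
        "mean": statistics.mean(responses),
        "std_dev": statistics.stdev(responses),
        "median": med,
        "lower_quartile": q1,
        "upper_quartile": q3,
        "quartile_range": q3 - q1,
        "freq": {v: responses.count(v) for v in range(1, 6)},
    }
```

For instance, `likert_summary([5, 5, 4, 5, 3, 4, 5, 5])` yields a mean of 4.5 and a median of 5.0 (the made-up data stand in for one question's responses).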
We were also interested in whether having won the team
competition would influence the assessments, which we analyzed
using the Wilcoxon two-sample test. To study
whether the time spent with the single player version of
TrivialCV was related to the answers given by the students to
Questionnaire 2, we used Spearman correlation coefficients.
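Spearman's coefficient, used here and in Section III, is simply the Pearson correlation applied to ranks. A self-contained sketch with average ranks for ties follows; it is illustrative only, not the original analysis code (which could equally use a statistics library):

```python
def _ranks(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the group of values tied with values[order[i]]
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone pair of variables yields a coefficient of 1.0 (or −1.0 if decreasing), regardless of whether the relationship is linear.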
These questionnaires and the other data sources contributed
in different ways to answering each of the three research
questions, as detailed below:
1) RQ1—Assessing student motivation: To study motivation, we analyzed responses to questions Q1.1 to Q1.3 from
Questionnaire 1 and Q2.1 to Q2.4 from Questionnaire 2. We
gathered data from both courses (programming fundamentals
and operating systems) separately since different course features could alter student perceptions.
In addition to this, questions Q1.4 to Q1.9 from Questionnaire 1 and Q2.5 to Q2.15 from Questionnaire 2 (marked with
an * in both tables) were used to validate the specific game
approach, including aspects such as game mechanics, usability,
and other design choices within the game. These perceptions
are also related to student motivation since an adequate and
attractive experience can also be a driver for motivation.
2) RQ2—Assessing student participation and engagement:
We measured participation and engagement by studying responses to questions Q1.10 to Q1.13 from Questionnaire 1
and to Q2.16 to Q2.20 from Questionnaire 2.
In addition to this, we collected access logs to the virtual
campus during the course and compared them with access data
from the previous academic year. An increase in access rates
would indicate a higher engagement. To determine whether
the difference between the access rates in both years was
significant, we used Student’s t-test.
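The year-over-year comparison rests on the pooled-variance (Student's) t statistic, which can be sketched in plain Python; the p-value would then be looked up in a t distribution with n1 + n2 − 2 degrees of freedom. This is an illustrative sketch with made-up sample data, not the authors' code:

```python
import statistics

def students_t(sample_a, sample_b):
    """Pooled-variance two-sample t statistic (Student's t-test)."""
    n1, n2 = len(sample_a), len(sample_b)
    m1, m2 = statistics.mean(sample_a), statistics.mean(sample_b)
    v1, v2 = statistics.variance(sample_a), statistics.variance(sample_b)
    # pooling assumes both groups share a common underlying variance
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    se = (pooled * (1 / n1 + 1 / n2)) ** 0.5
    return (m1 - m2) / se
```

Applied to the access-count vectors of the two academic years, the statistic and its degrees of freedom give the significance level reported in Section III.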
3) RQ3—Assessing transfer of knowledge: To assess the
transfer of knowledge, we analyzed responses to questions
Q1.14 from Questionnaire 1 and to Q2.21 to Q2.25 from
Questionnaire 2.
In addition to this, we checked whether students with a high
performance while playing the game obtained better grades
in their examinations using Spearman correlation coefficients.
This approach is merely indicative, since such a correlation does
not necessarily imply causation.
Finally, we compared the average grades of students from
the study year with the grades from the previous academic
year, checking for significance using Student’s t-test.
III. RESULTS

In this section, we present the results of the different
questionnaires, tests, and evaluations used during our study.
They are presented focusing on each individual activity, while
their relevance in answering the different research questions
is discussed in Section IV.

A. Questionnaire Results

Questionnaire 1 (see Table I) gathered student opinions
about the attractiveness of the multiplayer team competition
and the overall game experience, usability, and effectiveness.
The responses, a total of N = 35 for the programming
fundamentals course and N = 18 for operating systems,
were generally positive and are detailed in Tables III
and IV, respectively. The distributions of the responses are
presented in Figs. 3 and 4.

We also analyzed whether there were any correlations
between winning the team competition and the responses to
Questionnaire 1. To do this, we used the Wilcoxon two-sample
test. The results are shown in Table V. The only relevant
correlation appeared in the responses to Q1.14 in the
programming fundamentals course: students who did not win
the competition tended to consider the exercise more useful
as a study activity than those who won.

Questionnaire 2 (see Table II) collected student impressions
about the single-player version of TrivialCV, which was available to students of the operating systems course. A total of 33
students accessed this single-player version, but only N = 13
completed the questionnaire. Their responses also indicated a
good perception of the tool. The results are presented in Table
VI, and the distribution of responses is presented in Fig. 5.

We also analyzed whether the time spent using the single-player version of TrivialCV (Q2.20) was related to the answers
given to Questionnaire 2. To do this, we again used Spearman
correlation coefficients. The results are shown in Table VII. As
can be observed, the only statistically significant relationship
was that the more time students spent playing TrivialCV,
the lower the value they gave to question Q2.11 (I would
prefer a more colorful appearance).

B. Activity in the Virtual Campus

Another relevant measure was whether the presence of the
single-player tool would attract students to interact more with
the virtual campus, as a proxy for their engagement with day-to-day coursework. This measure was only considered in the
operating systems course because it was the only course of the
two involved in the study (PF and OS) in which the single-player version was available.

For this, we used the activity logs from the previous year as
an informal control group: a total of 95 students were
enrolled in the course and interacted with the virtual campus
an average of 140.59 times (SD = 115.6).

In turn, for the year in which the TrivialCV tools were
available, a total of 65 students were enrolled in the course
and interacted with the virtual campus an average of 176.85
times (SD = 128.37). While there was a 25.8% increase,
an independent-sample t-test conducted to compare the
values showed only borderline significance (p = .06).

We also measured the number of times each student
interacted with the TrivialCV tools. A total of 46 students
registered 515 play sessions during the weeks in which the
tool was available, with an average of 11.20 play sessions per
student.

C. Student Grades

We also analyzed the relationship between the results obtained in TrivialCV and the grades obtained by the students in
TABLE III
RESULTS OF QUESTIONNAIRE 1 IN THE PROGRAMMING FUNDAMENTALS (PF) COURSE: NUMBER OF RESPONSES, MEAN, STANDARD DEVIATION, MEDIAN AND QUARTILES FOR EACH QUESTION

Question   N    Mean   Std. Dev.   Median   Quartile Range   Lower Quartile   Upper Quartile
Q1.1       35   4.69   0.63        5.00     0.00             5.00             5.00
Q1.2       35   4.57   0.81        5.00     1.00             4.00             5.00
Q1.3       35   4.66   0.59        5.00     1.00             4.00             5.00
Q1.4       35   2.60   1.63        2.00     3.00             1.00             4.00
Q1.5       35   1.80   1.21        1.00     1.00             1.00             2.00
Q1.6       35   4.46   0.74        5.00     1.00             4.00             5.00
Q1.7       35   4.26   0.82        4.00     1.00             4.00             5.00
Q1.8       35   3.26   1.42        3.00     3.00             2.00             5.00
Q1.9       35   1.69   1.08        1.00     1.00             1.00             2.00
Q1.10      35   3.49   1.25        3.00     2.00             3.00             5.00
Q1.11      35   4.89   0.40        5.00     0.00             5.00             5.00
Q1.12      35   4.51   0.78        5.00     1.00             4.00             5.00
Q1.13      35   4.77   0.49        5.00     0.00             5.00             5.00
Q1.14      35   4.31   0.99        5.00     1.00             4.00             5.00
TABLE IV
RESULTS OF QUESTIONNAIRE 1 IN THE OPERATING SYSTEMS (OS) COURSE: NUMBER OF RESPONSES, MEAN, STANDARD DEVIATION, MEDIAN AND QUARTILES FOR EACH QUESTION

Question   N    Mean   Std. Dev.   Median   Quartile Range   Lower Quartile   Upper Quartile
Q1.1       18   4.72   0.57        5.00     0.00             5.00             5.00
Q1.2       18   4.72   0.57        5.00     0.00             5.00             5.00
Q1.3       18   4.39   0.61        4.00     1.00             4.00             5.00
Q1.4       18   2.06   1.21        2.00     2.00             1.00             3.00
Q1.5       18   2.00   1.24        1.50     2.00             1.00             3.00
Q1.6       18   4.00   0.84        4.00     1.00             4.00             5.00
Q1.7       18   4.00   0.97        4.00     1.00             4.00             5.00
Q1.8       18   2.50   1.04        2.50     1.00             2.00             3.00
Q1.9       18   1.78   0.81        2.00     1.00             1.00             2.00
Q1.10      18   3.53   1.07        4.00     1.00             3.00             4.00
Q1.11      18   4.72   0.46        5.00     1.00             4.00             5.00
Q1.12      18   4.56   0.70        5.00     1.00             4.00             5.00
Q1.13      18   4.89   0.32        5.00     0.00             5.00             5.00
Q1.14      18   4.44   0.78        5.00     1.00             4.00             5.00

TABLE V
RESULTS OF THE WILCOXON TWO-SAMPLE TEST FOR EACH QUESTION, CLASSIFIED ACCORDING TO WHETHER OR NOT THE STUDENT WON THE COMPETITION IN EACH OF THE STUDIED COURSES: PROGRAMMING FUNDAMENTALS (PF) AND OPERATING SYSTEMS (OS)

Question   PF        OS
Q1.1       .0699     .7588
Q1.2       .086      .7588
Q1.3       .1682     .9599
Q1.4       .3344     .5727
Q1.5       .4475     .2218
Q1.6       .1476     .3894
Q1.7       .8811     .2249
Q1.8       .572      .1135
Q1.9       .056      .1764
Q1.10      .2822     .3316
Q1.11      .9717     .3088
Q1.12      .2892     .5576
Q1.13      .2111     .8062
Q1.14      .0096 †   .8391

Statistically significant results are marked with †.
C. Student Grades

We also analyzed the relationship between the results obtained
in TrivialCV and the grades obtained by the students in
the operating systems course, since this course does include an
explicit theoretical evaluation. In fact, the operating systems
examination includes two blocks: a theoretical block (accounting for 40% of the examination grade) and a practical block
(accounting for the other 60%). Conversely, the programming
fundamentals examination is 100% practical.
Additionally, the grading model for the operating systems
course also contemplates additional points for “class participation,” in which different day-to-day assignments and activities
are evaluated.
1) Grade correlations: We analyzed the relationship between the results obtained in TrivialCV and the grades obtained in 1) the final examination (distinguishing between the
theoretical and the practical part), 2) class participation, and
3) the final grade for the course. For this analysis, we used
Spearman correlation coefficients, and the results are shown
in Table VIII.
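The correlation analysis described above can be sketched as follows. The per-student scores and grades below are hypothetical, and the use of `scipy.stats.spearmanr` is an assumption about tooling (the paper does not state which software was used); only the method, Spearman's rank correlation as reported in Table VIII, follows the paper:

```python
# Sketch of the Spearman correlation analysis between TrivialCV
# results and course grades. The score and grade values are
# hypothetical; only the statistical method follows the paper.
from scipy.stats import spearmanr

# Hypothetical per-student data for one group.
trivialcv_scores = [12, 45, 30, 51, 22, 38, 47, 15, 41, 28]
final_grades = [4.0, 8.5, 6.0, 9.0, 5.5, 7.0, 8.0, 4.5, 7.5, 6.5]

# spearmanr ranks both variables and correlates the ranks, so it
# captures monotonic (not only linear) relationships.
rho, p_value = spearmanr(trivialcv_scores, final_grades)
print(f"Spearman rho = {rho:.4f}, p = {p_value:.4f}")
```

The same call would be repeated per group and per grade component (theoretical part, examination, class participation, final grade) to fill a table like Table VIII.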
Students took their examinations on two different dates,
with similar but not identical examinations. Therefore, we
separated the results depending on which examination they
took.

Fig. 3. Distribution of responses to Questionnaire 1 in the programming fundamentals course.

Fig. 4. Distribution of responses to Questionnaire 1 in the operating systems course.
TABLE VI
RESULTS OF QUESTIONNAIRE 2 IN THE OPERATING SYSTEMS COURSE: NUMBER OF RESPONSES, MEAN, STANDARD DEVIATION, MEDIAN AND
QUARTILES FOR EACH QUESTION

Question   N    Mean   Std. Dev.   Median   Quartile Range   Lower Quartile   Upper Quartile
Q2.1       13   4.69   0.63        5.00     0.00             5.00             5.00
Q2.2       13   4.85   0.55        5.00     0.00             5.00             5.00
Q2.3       13   4.69   0.63        5.00     0.00             5.00             5.00
Q2.4       13   3.00   1.29        3.00     1.00             3.00             4.00
Q2.5       13   4.77   0.44        5.00     0.00             5.00             5.00
Q2.6       13   4.69   0.63        5.00     0.00             5.00             5.00
Q2.7       13   3.00   0.91        3.00     0.00             3.00             3.00
Q2.8       13   1.69   1.25        1.00     1.00             1.00             2.00
Q2.9       13   4.54   0.52        5.00     1.00             4.00             5.00
Q2.10      13   3.85   0.99        4.00     2.00             3.00             5.00
Q2.11      13   2.62   1.45        3.00     2.00             1.00             3.00
Q2.12      13   1.77   0.93        1.00     2.00             1.00             3.00
Q2.13      13   1.46   0.78        1.00     1.00             1.00             2.00
Q2.14      13   2.46   1.51        3.00     3.00             1.00             4.00
Q2.15      13   4.31   1.03        5.00     1.00             4.00             5.00
Q2.16      13   4.92   0.28        5.00     0.00             5.00             5.00
Q2.17      13   4.85   0.38        5.00     0.00             5.00             5.00
Q2.18      13   3.92   1.38        4.00     1.00             4.00             5.00
Q2.19      13   4.69   0.48        5.00     1.00             4.00             5.00
Q2.21      13   4.77   0.44        5.00     0.00             5.00             5.00
Q2.22      13   4.46   0.88        5.00     1.00             4.00             5.00
Q2.23      13   3.77   1.30        4.00     2.00             3.00             5.00
Fig. 5. Distribution of the responses to Questionnaire 2 in the operating systems course.
Most results, especially those in Group 1, were found to be
statistically significant (p < .05). Students in Group 2
displayed lower correlations, but they were nevertheless
present.
2) Grade improvement: We also checked whether students
who had been exposed to the different versions of the game
performed better than those who had not, comparing their
grades with those of students from the previous academic year.
IV. D ISCUSSION
The experiments performed with these students were tailored to suit their learning experience but still provided useful
insights and lessons learned for our specific research questions.
While the relative power of the experiment was limited due
to the population size, the reaction from the students was
overwhelmingly positive.
TABLE VII
RESULTS OF THE SPEARMAN CORRELATION COEFFICIENTS FOR EACH QUESTION CLASSIFIED BY THE TIME SPENT ON THE SINGLE-PLAYER
VERSION OF TRIVIALCV

Question   Spearman Correlation Coefficient   p
Q2.1       -.08607                            .7903
Q2.2       -.35309                            .2602
Q2.3       -.13725                            .6706
Q2.4       -.16467                            .6091
Q2.5        .25355                            .4265
Q2.6       -.06547                            .8398
Q2.7       -.06958                            .8299
Q2.8       -.32102                            .3090
Q2.9       -.36596                            .2420
Q2.10       .15691                            .6262
Q2.11      -.59585 †                          .0409
Q2.12      -.15816                            .6235
Q2.13      -.04648                            .8859
Q2.14      -.22324                            .4855
Q2.15      -.00792                            .9805
Q2.16      -.22068                            .4907
Q2.17      -.13093                            .6850
Q2.18       .09543                            .7680
Q2.19       .02588                            .9364
Q2.21      -.14086                            .6624
Q2.22      -.28002                            .3780
Q2.23       .03349                            .9177

Statistically significant results are marked with †.
This perception is aligned with the results of the different
analyses described in Section III and provides useful insights
into our initial research questions, supporting each of them
to a different degree.
A. RQ1—Are These Tools Effective in Motivating Students?
As shown in Section III-A, the responses to Questionnaires
1 and 2 indicate that students, in general, are quite satisfied
with the TrivialCV tool in its two versions (single player and
multiplayer). In particular, most of the students were satisfied
with the experience (e.g., Q1.1 and Q2.1), and praised the
activity when compared with other teaching activities (e.g.,
Q1.2, Q1.3, Q2.2, Q2.3, and Q2.6).
Regarding the overall game design and usability, questions
marked with asterisks in Tables I and II were also valued
very highly by students, even though the game design and
appearance were relatively simple. These trends were also
visible in the questions that were proposed with an inverted
scale (e.g., Q1.5, Q1.9, Q2.4, Q2.8, and Q2.12).
Therefore, it seems that the use of these tools increased
the students’ motivation and was perceived as a positive
experience in general. This is consistent with most serious
games studies, where students tend to embrace the approach
[39] even though this is often connected to the novelty effect
of the intervention [40].
B. RQ2—Does the Application of These Tools Result in Increased Participation and Engagement?
While motivation and student acceptance are relatively
straightforward terms, engagement and participation are far
more elusive [41]. Different studies have targeted the evaluation of participation in terms of interactions with the game
itself [42], [43], while others focus on whether the interventions result in behavioral changes [44], [45]. Our goal was
two-fold, seeking an active engagement of the students with
the games, but also with the courses and day-to-day classroom
activities in general.
Increased engagement and participation were measured in
different ways. Regarding responses, students were willing to
repeat the activities in this and other courses (Q1.11 to Q1.13,
Q2.16, and Q2.17) and considered the activity more relevant
than other forms of study (Q1.10 and Q2.18). Perhaps most
importantly, all students reported in Questionnaire 2 that with
the tool, they spent more time studying than they would have
without it (Q2.19).
In turn, it is also remarkable that once the single player
game was made readily available, 46 out of 65 students
accessed the game at least once, and a total of 515 sessions
were completed, for an average of 11.20 play sessions per
student.
Finally, when measuring the difference in overall virtual
campus activities, the average access rate per student to
the virtual campus increased more than 25%, although the
difference was barely statistically significant at p = .06.
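The paper does not name the test behind the p = .06 result for access rates; as one plausible sketch, the comparison could be run as a one-sided Mann-Whitney U test on per-student access counts. The counts below are hypothetical, and the test choice is an assumption:

```python
# Sketch of testing whether virtual campus access increased.
# The per-student access counts are hypothetical, and the choice
# of a Mann-Whitney U test is an assumption: the paper reports
# p = .06 for the ~25% increase but does not name its test.
from scipy.stats import mannwhitneyu

accesses_previous_year = [20, 35, 18, 42, 25, 30, 22, 28, 33, 19]
accesses_with_trivialcv = [30, 44, 25, 50, 31, 38, 29, 36, 41, 27]

# One-sided test: did access counts shift upward with TrivialCV?
u_stat, p_value = mannwhitneyu(accesses_with_trivialcv,
                               accesses_previous_year,
                               alternative="greater")
print(f"U = {u_stat}, p = {p_value:.4f}")
```

A rank-based test is a reasonable default here because access counts are typically skewed rather than normally distributed.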
Regardless, we consider that the results indicate a positive
attitude toward the game. Many students used it frequently,
and the overall assessment of the tool was that it was an
interesting addition to their set of study resources. The game
also drove them to pay more attention to the course during
day-to-day activities, as opposed to the traditional approach of only
assigning importance to the course as the final examinations
draw near.
C. RQ3—Is There an Increase in Student Performance After
Using These Tools?
The transfer of knowledge proved to be the most elusive
measure in all our experiments. The perception of students was
positive in general. All students considered these activities to
be effective as review exercises (Q1.14 and Q2.21) and even
more effective than traditional classes (Q1.10 and Q2.22). As
their instructors, we found it surprising (and humbling) that
our students considered a one-hour gameplay session more
useful than a one-hour lecture.
It is also worth mentioning the inverse correlation between
the answers to Q1.14 (The activity was effective as a review
exercise) and performance during team competitions. Students
who did not win the competitions found the exercise more
effective. We believe that students on the losing side realized
that they still needed further effort to be on par with their
classmates, and therefore, it had a deeper impact on them
than on the winning students, who probably were already
performing well in the classroom. This is very relevant for
our research since it may have more deeply impacted students
needing further support, which is a very positive outcome.
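The comparison behind this observation can be sketched as follows. The Likert responses below are hypothetical, and `scipy.stats.ranksums` is an assumed implementation of the Wilcoxon two-sample (rank-sum) test reported in Table V:

```python
# Sketch of the Wilcoxon two-sample (rank-sum) test behind Table V,
# comparing Q1.14 responses ("the activity was effective as a review
# exercise") between students who won and lost the competition.
# The 1-5 Likert responses are hypothetical; only the test choice
# follows the paper.
from scipy.stats import ranksums

q1_14_winners = [3, 4, 4, 3, 5, 4, 3, 4]
q1_14_losers = [5, 5, 4, 5, 5, 4, 5, 5, 4, 5]

# A positive statistic means the first sample (losers) tends to rank
# higher, matching the reported inverse relationship with winning.
stat, p_value = ranksums(q1_14_losers, q1_14_winners)
print(f"rank-sum statistic = {stat:.3f}, p = {p_value:.4f}")
```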
Most remarkably, the students generally perceived the tools
as helpful in their examination preparations, especially the
single player version which could be used from home (Q2.24
and Q2.25).
TABLE VIII
RESULTS OF THE SPEARMAN CORRELATION COEFFICIENTS (SCC) FOR EACH GRADE ACCORDING TO THE PREVIOUS RESULTS IN THE SINGLE-PLAYER
VERSION AND THE MULTIPLAYER TEAM COMPETITION PERFORMED DURING THE COURSE

                           Theoretical Part       Examination            Class Participation    Final Grade
                           SCC        p           SCC        p           SCC        p           SCC        p
Group 1   Single-player    .62128 †   .0005       .70327 †   < .0001     .78574 †   < .0001     .75410 †   < .0001
Group 1   Multiplayer      .48055 †   .0112       .42511 †   .0271       .54916 †   .0030       .45701 †   .0166
Group 2   Single-player    .32083     .1678       .40578     .0759       .47913     .0207       .43503     .0552
Group 2   Multiplayer      .29453     .2075       .49386 †   .0269       .63147 †   .0012       .55591 †   .0109

Statistically significant results are marked with †.
In addition to this, we did observe in Section III-C significant correlations between the time spent playing the game
and the grades obtained on the examination, as well as other
graded activities such as assignments. These are, however,
correlations. It could be argued that good students simply
performed better in the game and then naturally performed
better on the examination.
Finally, we tried to compare their average grades with those
of students from the previous academic year. However, we did
not find statistically significant differences in the performance
of either group. This is consistent with other existing analyses,
where the learning outcomes of using serious games tend to
be less significant in higher education [46].
This does not mean that the tools were not useful since
there is no way to measure the extra examination preparation
work which students in different years and groups required.
Students might have needed less traditional study effort having
the TrivialCV tools available but still achieved the same result.
However, since we could not measure this, the argument
about learning outcomes relies exclusively on the self-perception
of students when they reported how useful the tools
had been during their study efforts.
V. C ONCLUSION
As stated above, in this work, we have presented a summary
of different game-based interventions in two different computer science courses taught at a university level: programming
fundamentals and operating systems. The addition of these
game activities was useful in providing additional motivation
for students and improving their engagement with the courses.
These interventions were very well received by the students,
who considered them fun, engaging, and useful.
In this work, we measured the impact according to three
specific research questions, as stated in the introduction. Each
research question was addressed through different research
activities, including questionnaires, analysis of interaction logs
provided by the virtual campus and statistical analysis of the
impact on student grades. The research questions, research
activities and main findings are summarized in Fig. 6.
We first explored whether these tools were effective in
motivating students. We found very positive attitudes toward
these game activities and a desire to increase their frequency
and availability in other courses. This is aligned with first-person observations and feedback during the interventions,
which were undeniably highlights in their respective courses.
We then explored whether this motivation translated into
an increase in student participation in day-to-day activities, a
cornerstone of the instructional design of these two courses.
Higher motivation should naturally result in higher engagement, and we observed clear indications of increased engagement through our questionnaires and through our analysis of
the frequency of interactions with the virtual campus. Remarkably, the balance between competition and collaboration
was also positive, given the concern that some students may
naturally prefer either competitive or collaborative approaches
[47].
Finally, this additional participation should have resulted
in better student performance. The students reported that this
had impacted their study and performance very positively,
although we did not observe significant variations in their
final examination grades. As discussed above, this does not
mean that the tools failed to produce any significant transfer
of knowledge, but we cannot consider that it demonstrated that
the use of the TrivialCV tools leads to an increase in student
performance.
In addition to this, we must acknowledge the relatively small
sample size and further intervening factors, such as the impact
of the specific instructors or other effects local to specific
teaching groups (e.g., whether the impact changes in a first-year course).
In fact, these two factors represent our main future line of
work; more studies and interventions are needed to support the
idea that this type of game-based activity increases motivation,
therefore increasing participation and student performance.
Thus far, we can confidently assert that we have demonstrated the first two steps for our experimental groups. However, we consider that this does not diminish the value of the
interventions. If we managed to increase the motivation of
the students and their participation in class while achieving
similar academic performance, then we can be satisfied with
the impact of these kinds of activities.
ACKNOWLEDGEMENT
The authors greatly appreciate the helpful support of Ricardo García Mata, from the Universidad Complutense, who
assisted with data analysis for this paper.
Fig. 6. Summary of research questions, research activities and main findings.
R EFERENCES
[1] E. Tovar, “Analyzing the problems of the implementation of the European
credit transfer system in a technical university,” in Proc. 34th ASEE/IEEE
Annu. Frontiers in Education Conf. (FIE’04), Savannah, GA, USA, Oct.
20–23, 2004, pp. T3D/11–T3D/16, doi: 10.1109/FIE.2004.1408528.
[2] M. R. Lepper and D. I. Cordova, “A desire to be taught: Instructional
consequences of intrinsic motivation,” Motivation and Emotion, vol. 16,
no. 3, pp. 187–208, Sep. 1992, doi: 10.1007/BF00991651.
[3] T. Malone, M. R. Lepper, R. E. Snow, and M. J. Farr, “Making learning
fun: A taxonomy of intrinsic motivations for learning,” Aptitude, Learn.,
Instruction, vol. 3, pp. 223–253, Jan. 1987.
[4] T. Malone, “Toward a theory of intrinsically motivating
instruction,” Cogn. Sci., vol. 5, pp. 333–369, Oct. 1981, doi:
10.1207/s15516709cog0504_2.
[5] G. Rosemary, A. Robert, and E. D. James, “Games, motivation and
learning: A research and practice model,” Simul. & Gaming, vol. 33,
no. 4, pp. 441–467, Dec. 2002, doi: 10.1177/1046878102238607.
[6] M. Papastergiou, “Digital game-based learning in high school Computer Science education: Impact on educational effectiveness and student
motivation,” Comput. & Educ., vol. 52, pp. 1–12, Jan. 2009, doi:
10.1016/j.compedu.2008.06.004.
[7] H. Tuzun, S. Yilmaz, Y. Karakus, Y. Inal, and G. Cumaoğlu, “The
effects of computer games on primary school students’ achievement and
motivation in geography learning,” Comput. & Educ., vol. 52, pp. 68–77,
Jan. 2009, doi: 10.1016/j.compedu.2008.06.008.
[8] P. Sancho and B. Fernández-Manjón, “Experiences in using a MUVE for
enhancing motivation in engineering education,” in Proc. 1st IEEE Global
Engineering Education Conf. (EDUCON’10), Madrid, Spain, Apr. 14–16,
2010, pp. 775–782, doi:10.1109/EDUCON.2010.5492498.
[9] W. Huang, “Evaluating learners’ motivational and cognitive processing
in an online game-based learning environment,” Comput. Human Behav.,
vol. 27, no. 2, pp. 694–704, March 2011, doi: 10.1016/j.chb.2010.07.021.
[10] A. Janssen, T. Shaw, and P. Goodyear, “Using video games to enhance
motivation states in online education: protocol for a team-based digital
game,” JMIR Res. Protocols, vol. 4, no. 3, p. e114, Sep. 2015.
[11] M. E. W. Dankbaar, J. Alsma, E. E. H. Jansen, J. J. G. van Merrienboer,
J. L. C. M. van Saase, and S. C. E. Schuit, “An experimental study on
the effects of a simulation game on students’ clinical cognitive skills
and motivation,” Advances in Health Sciences Educ., vol. 21, no. 3, pp.
505–521, Aug. 2016, doi: 10.1007/s10459-015-9641-x.
[12] G. Hookham, K. Nesbitt, and F. Kay-Lambkin, “Comparing usability
and engagement between a serious game and a traditional online program,” in Proc. Australasian Computer Science Week Multiconference
(ACSW ’16), Canberra, Australia, Feb. 1–5, 2016, Paper No. 54,
doi:10.1145/2843043.2843365.
[13] W. Westera, “Why and how serious games can become far more effective: accommodating productive learning experiences, learner motivation
and the monitoring of learning gains,” Educ. Technol. & Soc., vol. 22,
no. 1, Jan. 2019.
[14] F. Almeida and J. Simoes, “The role of serious games, gamification
and industry 4.0 tools in the education 4.0 paradigm,” Contemporary
Educational Technol., vol. 10, no. 2, Apr. 2019, doi: 10.30935/cet.554469.
[15] J. Kirriemuir and A. Mcfarlane, “Literature review in games and
learning,” Futurelab, Bristol, U.K., Rep. No. 8, 2004. [Online]. Available:
https://web.archive.org/web/20061013014941/http://www.futurelab.org.uk
/download/pdfs/research/lit reviews/Games Review1.pdf.
[16] M. Becker, “Pedagogy in commercial video games,” in Games and Simulations in Online Learning: Research and Development Frameworks, D.
Gibson, C. Aldrich, and M. Prensky, Eds., Hershey, PA, USA: Information
Science Publications, 2007, pp. 21–48, doi: 10.4018/978-1-59904-304-3.ch002.
[17] R. T. Hays, “The effectiveness of instructional games: a literature review and discussion,” Naval Air Warfare Center Training Systems Division, Orlando, FL, USA, Tech. Rep. 2005-004, Nov. 2005.
[Online]. Available: https://apps.dtic.mil/sti/pdfs/ADA441935.pdf, doi:
10.21236/ADA441935.
[18] L. Annetta, J. Minogue, S. Holmes, and M. Cheng, “Investigating the
impact of video games on high school students’ engagement and learning
about genetics,” Comput. & Educ., vol. 53, pp. 74–85, Aug. 2009, doi:
10.1016/j.compedu.2008.12.020.
[19] D. B. Porter, “Computer games: Paradigms of opportunity,” Behav. Res.
Methods, Instruments, & Comput., vol. 27, no. 2, pp. 229–234, Jun. 1995,
doi: 10.3758/BF03204737.
[20] D. Johnson and R. Johnson, Learning Together and Alone: Cooperative,
Competitive, and Individualistic Learning. Allyn & Bacon, Boston, MA,
USA, 1999.
[21] L. Harasim, “Online education. an environment for collaboration and
intellectual amplification,” Online Educ. Perspectives on a New Environ.,
pp. 39–64, 1990.
[22] M. Alavi, “Computer-mediated collaborative learning: An empirical
evaluation,” MIS Quart., vol. 18, no. 2, pp. 159–174, Jun. 1994, doi:
10.2307/249763.
[23] D. Buchinger and M. da Silva Hounsell, “Guidelines for designing and
using collaborative-competitive serious games,” Comput. & Educ., vol.
118, pp. 133–149, Mar. 2018, doi: 10.1016/j.compedu.2017.11.007.
[24] J.-Y. Wang, W. Lin, and H.-P. Yueh, “Collaborate or compete? How
will multiplayers’ interaction affect their learning performance in serious
games,” in Cross-Cultural Design. Culture and Society, P. Rau, Ed., New
York, NY, USA, 2019, vol. 11577, pp. 482–491, doi: 10.1007/978-3-030-22580-3_36.
[25] T. H. Laine and R. S. Lindberg, “Designing engaging games for
education: A systematic literature review on game motivators and design
principles,” IEEE Trans. Learn. Technol., pp. 804–821, Aug. 2020, doi:
10.1109/TLT.2020.3018503.
[26] S. Y. Chen and Y.-M. Chang, “The impacts of real competition and
virtual competition in digital game-based learning,” Comput. Human
Behav., vol. 104, Mar. 2020, doi: 10.1016/j.chb.2019.106171.
[27] K. Graham et al., “Cyberspace Odyssey: A competitive team-oriented
serious game in computer networking,” IEEE Trans. Learn. Technol.,
vol. 13, no. 3, pp. 502–515, Jul. 2020, doi: 10.1109/TLT.2020.3008607.
[28] A. Diniz dos Santos, F. Strada, and A. Bottino, “Approaching sustainability learning via digital serious games,” IEEE Trans. Learn. Technologies,
vol. 12, no. 3, pp. 303–320, Jul. 2019, doi: 10.1109/TLT.2018.2858770.
[29] J. Moon and F. Ke, “Exploring the relationships among middle school
students’ peer interactions, task efficiency, and learning engagement in
game-based learning,” Simul. & Gaming, vol. 51, no. 3, pp. 310–335,
Jun. 2020, doi: 10.1177/1046878120907940.
[30] S. Greipl, K. Moeller, and M. Ninaus, “Potential and limits of game-based learning,” Int. J. Technol. Enhanced Learn., vol. 12, no. 4, pp.
363–389, Oct. 2020, doi: 10.1504/IJTEL.2020.110047.
[31] D. Garrison and H. Kanuka, “Blended learning: Uncovering its transformative potential in higher education,” Internet & Higher Educ., vol. 7,
no. 2, pp. 95–105, 2nd Quarter 2004, doi: 10.1016/j.iheduc.2004.02.001.
[32] P. Sancho, J. Torrente, E. J. Marchiori, and B. Fernández-Manjón,
“Enhancing moodle to support problem based learning. The nucleo
experience,” in IEEE Global Engineering Education Conf. (EDUCON),
Apr. 4–6, 2011, pp. 1177–1182, doi: 10.1109/EDUCON.2011.5773296.
[33] P. Moreno-Ger, D. Burgos, and J. Torrente, “Digital games in eLearning
environments: Current uses and emerging trends,” Simul. & Gaming,
vol. 40, no. 5, pp. 669–687, Jul. 2009, doi: 10.1177/1046878109340294.
[34] T. Lim et al., “Strategies for effective digital games development and
implementation,” in Cases on Digital Game-Based Learning: Methods,
Models, and Strategies, Y. Baek and N. Whitton, Eds., Hershey, PA, USA:
IGI Global, 2013, pp. 168–198, doi: 10.4018/978-1-4666-2848-9.ch010.
[35] C. Johnson et al., “Game development for computer science education,”
in Proc. 2016 ITiCSE Working Group Reports (ITiCSE ’16), Arequipa,
Peru, Jul. 9, 2016, pp. 23–44, doi:10.1145/3024906.3024908.
[36] J. Baalsrud Hauge et al., “Study design and data gathering guide
for serious games’ evaluation,” Psychology, Pedagogy, & Assessment in
Serious Games, T. M. Connolly, T. Hainey, E. Boyle, G. Baxter, and
P. Moreno-Ger, Eds. IGI Global, 2014, pp. 394–419, doi: 10.4018/978-1-4666-4773-2.ch018.
[37] R. Yanez-Gomez, D. Cascado-Caballero, and J.-L. Sevillano, “Academic
methods for usability evaluation of serious games: A systematic review,”
Multimedia Tools & Appl., vol. 76, no. 4, pp. 5755–5784, 2017.
[38] F. Bellotti, B. Kapralos, K. Lee, P. Moreno-Ger, and R. Berta, “Assessment in and of serious games: An overview,” Advances in Human-Comp,
Interact., vol. 2013, 2013, Art. no. 136864, doi: 10.1155/2013/136864.
[39] W. Westera, “Comparing bayesian statistics and frequentist statistics in
serious games research,” Int. J. Serious Games, vol. 8, no. 1, pp. 27–44,
Mar. 2021, doi: 10.17083/ijsg.v8i1.403.
[40] C. Girard, J. Ecalle, and A. Magnan, “Serious games as new educational tools: How effective are they? A meta-analysis of recent studies,”
J. Comput. Assisted Learn., vol. 29, no. 3, pp. 207–219, 2013, doi:
10.1111/j.1365-2729.2012.00489.x.
[41] G. Hookham and K. Nesbitt, “A systematic review of the definition and
measurement of engagement in serious games,” in Proc. Australasian
Computer Science Week Multiconference (ACSW 2019), Sydney, Australia, Jan. 29–31, 2019, doi: 10.1145/3290688.3290747.
[42] J. Hauge, H. Duin, and K.-D. Thoben, “Increasing the resiliency of
global supply network by using games,” in Proc. 13th International
Symposium Logistics (ISL 2008), Bangkok, Thailand, Jul. 6–8, 2008, pp.
125–132.
[43] F. Bellotti et al., “A gamified short course for promoting entrepreneurship among ict engineering students,” in Proc. 13th IEEE International
Conference Advanced Learning Technologies (ICALT 2013), Beijing,
China, Jul. 15–18, 2013, pp. 31–32, doi: 10.1109/ICALT.2013.1
[44] B. Bonnechère, “Serious games in rehabilitation,” in Serious Games in
Physical Rehabilitation, Springer-Verlag, New York, NY, USA, 2018, pp.
41-109, doi: 10.1007/978-3-319-66122-3.
[45] A. M. Ahmed, Q. H. Mehdi, R. Moreton, and A. Elmaghraby, “Serious
games providing opportunities to empower citizen engagement and participation in e-government services,” in Proc. 2015 Computer Games:
AI, Animation, Mobile, Multimedia, Educational and Serious Games
(CGAMES), Louisville, KY, USA, Jul. 25–28, 2015, pp. 138–142, doi:
10.1109/CGames.2015.7272971.
[46] R. L. Lamb, L. Annetta, J. Firestone, and E. Etopio, “A meta-analysis
with examination of moderators of student cognition, affect, and learning
outcomes while using serious educational games, serious games, and
simulations,” Comp. Human Behav., vol. 80, pp. 158–167, 2018, doi:
10.1016/j.chb.2017.10.040.
[47] E. Lavoue, B. Monterrat, M. Desmarais, and S. George, “Adaptive gamification for learning environments,” IEEE Trans. Learn. Technologies,
vol. 12, no. 1, pp. 16–28, Jan. 2019, doi: 10.1109/TLT.2018.2823710.
Pablo Moreno Ger was born in Madrid in 1981.
He obtained the Ph.D. degree in computer engineering from the Complutense University of Madrid,
Madrid, Spain, in 2007 and thereafter was an associate professor in the Department of Software Engineering and Artificial Intelligence at that university.
He is now with the International University of La
Rioja (UNIR), where he currently serves as the vice-rector for research. Formerly, he was the director of
the School of Engineering and Technology at UNIR,
as well as vice-dean for innovation in the School of
Computer Engineering at UCM. His main research interests are in technology-assisted teaching, artificial intelligence, learning analytics, and serious games.
He has published more than 150 academic works in these fields.
Virginia Francisco was born in Segovia, Spain, in
1980. She obtained the Ph.D. degree in computer
engineering from the Complutense University of
Madrid (UCM), Madrid, Spain, in 2008. She is
currently an associate professor in the Department
of Software Engineering and Artificial Intelligence
at UCM, where she teaches and conducts research
in human-computer interaction through natural language user interfaces. She also carries out various
initiatives related to instructional innovation.
Raquel Hervás was born in Cuenca, Spain, in 1980.
She obtained the Ph.D. degree in computer engineering from the Complutense University of Madrid,
Madrid, Spain, in 2009. She is currently an associate
professor in the Department of Software Engineering
and Artificial Intelligence at UCM, where she serves
as vice-dean for studies and quality assurance. Her
research interests are focused on human–computer
interaction through natural-language user interfaces.
She is also responsible for various initiatives related
to instructional innovation.