Article

A Gamified Method for Teaching Version Control Concepts in Programming Courses Using the Git Education Game

by Hsi-Min Chen 1, Bao-An Nguyen 2,*, You-Wei Chang 1 and Chyi-Ren Dow 1
1 Department of Information Engineering and Computer Science, Feng Chia University, Taichung City 407102, Taiwan
2 Department of Information Technology, Tra Vinh University, Tra Vinh City 87000, Vietnam
* Author to whom correspondence should be addressed.
Submission received: 11 November 2024 / Revised: 11 December 2024 / Accepted: 13 December 2024 / Published: 16 December 2024
(This article belongs to the Special Issue Advances in Software Engineering and Programming Languages)

Abstract

Using version control tools is an indispensable skill for engineers in the software industry. This study introduces a gamification approach, together with a serious game called the Git Education Game (GEG), to teach Git concepts and usage, aiming to improve students’ motivation and learning performance compared with traditional lectures. An experiment was designed with two classes of the same course to compare the effect of GEG, and a post-test verified whether the game helped students achieve better learning outcomes and higher motivation. The results show that our approach had a positive effect on students’ motivation, and the experimental group achieved a higher pass rate than the control group on most items of the post-test. Based on these results, we emphasize the impact of interactive learning environments in software engineering education.

1. Introduction

The ability to use version control tools is an essential skill for software engineers [1], yet it is often not taught at all, or only very briefly, in computer science courses. As a result, students are confused about the concepts and use of version control tools, a problem exacerbated by traditional teaching methods that make it difficult for students to stay motivated to learn and understand the content.
Among non-traditional educational approaches, gamification is considered a promising teaching method, with increased learner motivation regarded as its key feature [2]. Since game-based learning can strengthen students’ intrinsic motivation [3], it boosts their willingness to learn actively. The virtual environment in games can simulate situations that are difficult to reproduce in the classroom, making abstract concepts easier to teach through interaction. Games can also give students the opportunity to keep trying without worrying about the risk of failure, and immediate feedback increases the efficiency of learning. Thanks to the intrinsic characteristics of games, such as competition, challenge, and interaction, the learning process can be transformed into a fun experience that achieves deep learning within an acceptable teaching time and teacher burden [3].
In this article, we introduce our gamification approach to teaching Git [4], a version control tool widely adopted by the software industry, via a game called the Git Education Game (GEG). GEG is a web-based serious game built with Unity [5] and a Java Spring Boot backend, which helps users learn version control concepts and Git commands. GEG aims to improve students’ motivation and learning efficiency with Git and to complement the parts of the school curriculum where coverage is insufficient. The game divides Git concepts and commands into several levels and introduces gamification mechanisms, such as points, medals, and leaderboards, to motivate students to participate in learning and gain a sense of accomplishment. Concurrently, logs of students’ learning behaviors are sent to the backend database via APIs so that teachers can monitor students’ learning status in real time.
To examine the effectiveness of the proposed approach, we conducted an experiment on two classes of students studying the same course, one for control and one for treatment (experimental group). While the experimental group studied Git via GEG, the control group learned Git in traditional mode. The learning effects were evaluated via the experiment to answer the following research questions (RQs):
  • RQ1: Does adding GEG as a teaching aid have higher learning outcomes than traditional teaching methods?
  • RQ2: Does GEG as a teaching aid have a positive impact on students’ attitudes and behavior towards Git?
  • RQ3: What is the percentage of students who are motivated to learn with GEG? Do students who are active learners have higher learning outcomes?
  • RQ4: What do students think are the advantages and disadvantages of GEG?
To answer the RQs, this study applied and extended the UTAUT2 model [3], analyzed with PLS-SEM [6], to examine whether the inclusion of GEG positively affected students’ attitudes toward Git and how gamification elements influenced factors such as self-efficacy, performance expectancy, and hedonic motivation. The model employs performance expectancy, effort expectancy, hedonic motivation [7], self-efficacy, attitude, behavioral intention, and use behavior [8], and introduces two gamification effectiveness constructs, gamification usefulness and gamification motivation. The results of this research contribute to a better understanding of the effectiveness of gamification in educational contexts and shed light on the potential benefits of incorporating GEG into teaching and learning practices.

2. Related Work

2.1. Serious Game and Gamification

In recent decades, both “serious games” and “gamification” have been employed for “serious” purposes. Their definitions differ: serious games are complete games in which entertainment is secondary and education is central [9,10], whereas gamification is the addition of game elements to non-game environments [11]. Both, however, use games or game elements to educate, change behavior patterns [12], and encourage and persuade users in educational, health, and other environments [13,14,15]. The main purpose of this study is to educate students and change their knowledge of version control software through a serious game, introducing gamification elements and establishing game mechanisms so that the serious game can achieve beneficial results different from those of ordinary games [16].

2.2. Educational Games in Software Engineering

In the world of digital educational games for engineering, software engineering (SE) stands out as the most attractive field, offering a plethora of games that focus on various aspects of the SE process, including coding, looping, object-oriented analysis, design, SE project management, code review, etc. [17]. Recognizing the limitations of passive learning in traditional teaching methods, many educators have embraced the gamification approach to enhance SE courses and provide students with practical knowledge about the SE process. One of the pioneering educational games in the field of SE was introduced by Baker et al. [18], who developed a card game to simulate the SE process. By incorporating game mechanics, they fully engaged students and facilitated their understanding of SE concepts. The results of this study showed that students agreed the game significantly eased the comprehension of SE concepts, making the learning experience more enjoyable and effective. Building upon the success of such approaches, Von Wangenheim et al. [3] further explored the use of intrinsic properties of games to enhance the teaching of Scrum development methods. They found that games not only achieved deep learning within acceptable teaching time and teacher burden but also transformed the learning process into a fun and engaging experience. The competitive nature of the game stimulated students’ motivation and active participation in learning, leading to positive effects on their overall learning experience.
The adoption of gamification in SE education has demonstrated its potential to bridge the gap between theory and practice, fostering a more hands-on and immersive learning environment. By infusing elements of play, competition, and interactivity into SE courses, educators can create a dynamic and enjoyable educational experience that resonates well with students, promoting better learning motivation and effective knowledge acquisition [19]. These gamified approaches not only facilitate knowledge retention but also encourage critical thinking, problem-solving, and teamwork, all of which are essential skills in the field of software engineering [20]. As the demand for skilled software engineers continues to grow, the integration of gamification in SE education holds promise in preparing students for real-world challenges and better equipping them for successful careers in the industry.

2.3. Industry Expectations and Computing Education

In a study investigating graduates’ readiness for the job market, team skills, collaboration, and the tools associated with these skills were identified as areas in need of improvement; in particular, students with limited experience in industry-standard tools faced challenges in participating in internships [21]. A study by Radermacher and Walia [22] highlighted the discrepancies between software industry expectations and the competencies of graduates, particularly in how they manage software development as a team. To address this gap, Git has emerged as a critical industrial tool that is increasingly taught in educational institutions. Kelleher (2014) advocated for Git as a well-established and respected version control system capable of facilitating collaborative coding environments; Git was even incorporated as a mechanism for publishing course exercises to enhance students’ practical experience. Lawrance and Jung [22] shared their experience of using Git as a platform for computer science courses, suggesting that it should be treated as a foundational tool for programming rather than limited to SE. Furthermore, Eraslan et al. [23] integrated GitLab metrics into coursework consultation sessions for a software engineering course, emphasizing the importance of using Git in practical teaching settings.
Building upon this ongoing trend, the present study proposes a Git-themed gamification education experiment to further enhance students’ mastery of Git while also fostering motivation and a positive attitude toward learning. By infusing elements of gamification into the teaching of Git, this study aims to create an engaging and interactive learning experience, thereby helping students become proficient in using this crucial tool effectively.

2.4. The UTAUT2 Model

In the educational technology context, the Technology Acceptance Model (TAM) and its descendants are widely adopted to validate the efficacy and effectiveness of proposed technology models [24]. The community has also leveraged the Unified Theory of Acceptance and Use of Technology (UTAUT), a unified model consisting of six main structures integrated from eight major theories of technology acceptance [8], and its extension, UTAUT2 [25], for the same purposes.
The UTAUT2 model was leveraged to identify that computer self-efficacy and computer gameplay had significant effects on the perceived ease of use of e-learning systems [26]. In their gamification research, Baptista and Oliveira [27] studied the effect of gamification on the use of mobile banking services; their UTAUT2 models showed a direct and strong relationship between gamification and the intention to use mobile banking services. Consequently, UTAUT2 is considered a suitable model for developing and validating the theory in our study.

3. The Git Education Game

GEG is designed to simulate real-world development scenarios using a Git Command Line (Git CLI) environment, where players must operate a virtual Git CLI to perform various tasks. The game comprises ten levels of increasing complexity, each presenting different challenges that students must overcome by using in-game prompts to learn and practice Git operations.
The initial levels are relatively straightforward, guiding players through the basic instructions and commands. As the game progresses, the difficulty level increases, requiring players to apply the concepts they have learned in previous levels to solve more intricate problems successfully, ultimately seeking to conquer all ten levels.
To incentivize and recognize students’ achievements, the game incorporates a scoring system. Players earn points for successfully passing each level and can earn additional rewards in the form of badges for specific accomplishments. The game keeps track of all players’ scores and achievements on a leaderboard, fostering healthy competition and motivating students to excel.
By presenting a gamified learning experience that combines practical challenges with a sense of achievement and healthy competition, GEG aims to enhance students’ engagement, understanding, and mastery of Git. Through interactive and progressively challenging gameplay, the game strives to provide a fun and effective way for students to learn and become proficient in using Git in real-world software development scenarios.

3.1. Game Design Principle

This study adopts the design principles proposed by [28] to combine flow experience with instructional design to create a highly immersive gaming experience, which in turn makes the learning process more effective, so the system proposed in this study is designed in the following steps (see Appendix A for detailed descriptions of design principles):
  • Analyze learners.
  • Set clear teaching objectives and select appropriate gaming material.
  • Design teaching instructions according to teaching objectives and game content.
  • Consider teaching as the primary goal and use games as supplementary tools.
  • Make good use of the characteristics of computer games.
  • Place students at the center of the process and help them enjoy studying.
  • Periodically assess students’ learning and constantly improve teaching.
Students’ activities are recorded and sent to the backend of the server. From the number of levels passed, the number of instructions spent by students, the time spent by students, etc., we can determine the learning status of students and immediately adjust levels that may be too difficult or add more levels to help students clarify confusing concepts. In addition, tests and surveys are also conducted to understand students’ learning status.
This study classifies the Git competencies acquired by students in terms of four levels according to Bloom’s taxonomy [29] as follows:
Remember: Students can remember the conceptual terms and some of the commands associated with Git, such as repository, commit, branch, merge, etc.
Understand: Students can explain the difference between local and remote repositories, distinguish between Git and Git servers, and interpret the roles of distributed version control systems, branches, merges, and conflict resolution in a visualized Git workflow.
Apply: Students use commands to solve Git-related problems, such as cloning, committing, pushing, pulling, creating branches, merging, and resolving conflicts through repeated practice.
Analyze: Students diagnose synchronization issues, compare merge strategies, and analyze conflict resolution methods to understand the overall Git workflow and command relationships.
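As a concrete illustration of the “Apply” level, the command sequence below initializes a repository, commits a file, and creates and merges a branch. This is a minimal sketch run in a scratch directory; the paths, identity values, and file contents are placeholders, not material from GEG itself.

```shell
set -e
rm -rf /tmp/git-apply-demo
git init -q /tmp/git-apply-demo
cd /tmp/git-apply-demo
git config user.name "Student"              # placeholder identity
git config user.email "student@example.com"

# Stage and commit an initial file (git add / git commit)
echo "line 1" > notes.txt
git add notes.txt
git commit -qm "initial commit"

# Create a branch, commit on it, and merge it back (git branch / git merge)
git checkout -qb feature
echo "line 2" >> notes.txt
git commit -qam "add line 2 on feature"
git checkout -q -                            # return to the previous branch
git merge -q feature                         # fast-forward merge
git branch -d feature                        # delete the merged branch
```

Repeated practice of exactly such sequences, with in-game objectives checking each step, is what the “Apply” level targets.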

3.2. Overall Design

The GEG is a web-based system; hence, its interactive environment is based on web browsers. Figure 1 shows the start menu, where students are required to register with a student number to start the game. Figure 2 shows the chapter menu, where Git concepts and commands are included in the levels. Students must pass the corresponding levels to unlock the subsequent levels.
Learning Git through a graphical interface offers a more visual and user-friendly experience for students. However, exclusive reliance on such interfaces may hinder their ability to adapt to the command-line interface (CLI), which remains a standard tool in professional development environments [30]. The reasons for adopting the teaching CLI in this study are as follows:
  • Beginners who use a GUI directly can easily develop a confused understanding of how Git works and may mistake functions designed into the GUI for native Git features.
  • The Git CLI is the same across all environments and machines, whereas a GUI may not work on some operating systems.
  • The CLI is more complete than the GUI: it includes all of Git’s features, while GUIs do not.
  • It is easier to get help when problems arise: GUIs do not always have complete and good documentation, whereas help for the CLI is readily available online.
Figure 3 shows the game interface, which is divided into six blocks corresponding to the following descriptions:
  • The brief description of the level and the instructions for the level open button.
  • The task objectives of the level; completed objectives are shown in green, and incomplete ones in red.
  • The file section, where students must sometimes manipulate files to modify and save them.
  • The Git console interface, where students enter the appropriate Git commands.
  • The Git visualization block contains the remote (top half) and local (bottom half) sections of Git, with a level restart button in the upper right corner for when students are using irreversible operations.
  • Git command prompt cards display the relevant Git commands for each level and enlarge when hovered over with the mouse for better visibility.
When a level is completed, in addition to the basic congratulatory message, the time spent and the number of commands entered in the level are also displayed to the students, as shown in Figure 4. These records are sent to the database server, which can be used for subsequent analysis and also as part of the in-game leaderboard.

3.3. Gamification Elements

To attract students to actively learn Git in this study, the following gamification elements were introduced in GEG: points, leaderboards, and badges.

3.3.1. Points

Points are considered among the most common game elements [31]. Star [32] found that the mere use of points adds a quantitative indicator of task performance. To match the progressive difficulty of the levels, higher levels award more points when passed. Points allow students to view their progress and serve as the basis for the leaderboard ranking, which players can open from the start menu to see their total score.

3.3.2. Leaderboards

A leaderboard is also one of the most adopted game elements. There are two types of leaderboards in GEG: the first is the level leaderboard, which ranks students according to their time spent on the level. The second is the overall ranking, where students are ranked according to the points they have earned throughout the game levels, and their achievements are also listed.
Considering that ranking may be counterproductive for students with lower ranks [33], GEG provides a switch, placed at the bottom right corner of the ranking interface, to control whether all rankings are displayed. By default, only the top 10 are displayed.

3.3.3. Achievement and Badges

Badges are usually used to symbolize the player’s achievements and provide a unique marker for achievements [34]. There are ten achievements in GEG, which can be earned when students complete specific tasks. Students who earn achievements not only collect badges but also earn a certain number of points; this encourages students to complete specific actions to rank higher on the leaderboard. For example, when students pass the first level of the game, they can get an achievement called “Getting Started”. Figure 5 shows the screen that pops up when students obtain the achievement.
In addition, students can also get an achievement when they are the first in a level ranking or when they pass a level in 30 s; GEG has also set up some interesting achievements, such as the achievement called “I do not need hints” when students close the game without reading a page of instructions or the achievement called “King of the card” when students read a card more than 25 times. These achievements increase student engagement, encourage students to try as much as possible during the game, and increase replayability.
Figure 6 shows the achievement viewer, where students can view both their unlocked achievements and those not yet unlocked. The details of locked achievements are hidden in the viewer, but their descriptions provide an idea of what needs to be achieved, encouraging students to try to accomplish these goals.

3.4. Level Topics and Student Activities Monitoring

In GEG, the first level of the game (level zero) introduces why a student needs to use version control software, and the remaining eight levels in total teach Git commands and concepts directly on the following topics:
  • Create a Git repository (git init).
  • Commit operations (git commit).
  • Push operations (git push).
  • Create a branch (git branch).
  • Merge and delete branches (git merge and git branch -D).
  • Synchronize and resolve conflicts (git pull and resolve conflicts).
  • Git merges and conflict solving.
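Outside the game, the synchronization and conflict-resolution topics above can be reproduced with a self-contained sketch: two clones of a shared bare repository edit the same line, the second clone’s pull stops with a merge conflict, and the conflict is resolved by hand. The paths, names, and commit messages are illustrative placeholders, not GEG content.

```shell
set -e
export GIT_AUTHOR_NAME=Demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=Demo GIT_COMMITTER_EMAIL=demo@example.com

rm -rf /tmp/remote.git /tmp/alice /tmp/bob
git init -q --bare /tmp/remote.git           # stands in for a Git server

# The first clone publishes version A of the file
git clone -q /tmp/remote.git /tmp/alice
echo "version A" > /tmp/alice/notes.txt
git -C /tmp/alice add notes.txt
git -C /tmp/alice commit -qm "version A"
BR=$(git -C /tmp/alice symbolic-ref --short HEAD)
git -C /tmp/alice push -q origin "$BR"

# The second clone starts from version A; then both sides edit the same line
git clone -q /tmp/remote.git /tmp/bob
echo "version A2" > /tmp/alice/notes.txt
git -C /tmp/alice commit -qam "version A2"
git -C /tmp/alice push -q origin "$BR"

echo "version B" > /tmp/bob/notes.txt
git -C /tmp/bob commit -qam "version B"

# The pull now stops with a merge conflict in notes.txt
git -C /tmp/bob pull -q --no-rebase origin "$BR" || true

# Resolve by editing the file, staging it, and concluding the merge
echo "version A2+B" > /tmp/bob/notes.txt
git -C /tmp/bob add notes.txt
git -C /tmp/bob commit -qm "merge: resolve conflict in notes.txt"
git -C /tmp/bob push -q origin "$BR"
```

The GEG levels walk students through essentially this scenario, with the remote and the second contributor simulated inside the game.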
Students’ activities in GEG are recorded and sent to the database so that teachers can know the status of students’ learning in real-time. Teachers can also check the number of students who passed each barrier to determine whether students encountered difficulties in certain areas. Recorded students’ activities and information are as follows:
  • Starting the level.
  • Completing an objective in the level.
  • Completing the level (including time spent and the number of commands entered).
  • Turning on the in-game hint.
  • Closing the in-game hint.
  • Entering commands in the CLI.
  • The current number of badges.
  • The number of students who have passed each level.

4. Experiment

To validate our system, we conducted a quasi-experiment in which the GEG was introduced in an object-oriented programming course during the first semester of the 2021 academic year at Feng Chia University, Taiwan. The course was divided into two classes, with one group randomly selected as the experimental group and the other as the control group, consisting of 54 and 59 students, respectively. Most of the students in this course were sophomore computer science majors. Additionally, the learning platform used in this OOP course required students to use Git for submitting their homework, including but not limited to the command-line interface (CLI).

4.1. Procedure

The flow of the experiment is shown in Figure 7. Before the experiment began, both groups received a non-practical Git tutorial and completed a multiple-choice pre-test to establish baseline knowledge. In the week following the pre-test, the control group attended a two-hour Git course covering topics from setting up a Git repository to resolving conflicts. The experimental group participated in one hour of game-based learning followed by one hour of course instruction, covering the same content as the control group. Post-tests were scheduled a few weeks later to give students time to absorb the knowledge.
In the experimental group, students were first taught how to operate the GEG game and shown how to pass the first level, and were then allowed to learn on their own. In the second hour, the same content was taught as in the control group, with the scope of instruction matching the game levels through which students had learned the concepts and use of Git.
After completing the test, students were asked to fill out a questionnaire about their gaming experience and attitude toward Git. The questionnaire consisted of 25 items categorized into nine constructs: Performance Expectancy (PE), Effort Expectancy (EE), Self-Efficacy (SE), Hedonic Motivation (HM), Gamification Motivation (GM), Gamification Usefulness (GU), Behavioral Intention (BI), Use Behavior (UB), and two open-ended questions. The purpose of the questionnaire was to explore students’ experiences with the system, including the perceived usefulness of the game for learning Git and its ability to motivate learning. A five-point Likert scale was used for the questionnaire items, ranging from 1 (strongly disagree) to 5 (strongly agree). The constructs included in the questionnaire and their descriptions are summarized in Table 1. Specific questions for the constructs and their descriptive statistics for the responses are provided in Appendix B. The model structure is illustrated in Figure 8.
The following assumptions are made based on the relationship between the various components of the study model:
Hypothesis 1 (H1):
The mechanism of gamification affects students’ performance expectancy using games to learn Git.
Hypothesis 2 (H2):
The mechanism of gamification affects students’ self-efficacy in using the game to learn Git.
Hypothesis 3 (H3):
The gamification mechanism affects students’ effort expectancy using the game to learn Git.
Hypothesis 4 (H4):
The mechanism of gamification affects students’ hedonic motivation to use the game to learn Git.
Hypothesis 5 (H5):
Performance expectancy of learning Git using games affects students’ attitudes towards Git.
Hypothesis 6 (H6):
Self-efficacy in learning Git using games affects students’ attitudes towards Git.
Hypothesis 7 (H7):
Effort expectancy of using the game to learn Git affects students’ attitudes towards Git.
Hypothesis 8 (H8):
Hedonic motivation to use games to learn Git affects students’ attitudes towards Git.
Hypothesis 9 (H9):
Students’ attitudes towards Git will influence their behavioral intentions towards Git.
Hypothesis 10 (H10):
Students’ behavioral intentions toward Git will affect their use behavior of Git.

4.2. Result

4.2.1. Pre-Test Result

The pre-test, consisting of multiple-choice questions covering version control concepts, staging, local and remote repositories, and basic Git commands, aimed to evaluate the students’ baseline knowledge. As shown in Table 2, an independent sample t-test was conducted to compare the mean scores of the control group (N = 54) and the experimental group (N = 59). The mean scores were 72.593 for the control group and 76.271 for the experimental group. The results indicated no significant difference in variance (homogeneity of variance test: F = 1.1762, p = 1.4487) or mean scores (independent sample t-test: t = −1.1918, p = 0.2359) between the two groups.

4.2.2. Post-Test Result

In the post-test section, echoing the research questions of the study, the requirements of the test questions are divided into three parts:
  • Be able to fork projects and perform clone, commit, and push.
  • Be able to add a new branch and switch branches for commits.
  • Be able to merge and resolve conflicts.
The percentage of students who passed the test questions in three parts of the test in the experimental group was higher than that of students in the control group, as shown in Figure 9.

4.2.3. Questionnaire Result

After the experimental session, we distributed the questionnaire to students in the experimental group. Out of 59 students, we collected 39 valid responses, with approximately 22 students providing input on the open-ended questions regarding their opinions on GEG. The level of agreement among students was high for most questions; however, a significantly lower value was observed for UB1 (I use the version control tool frequently in all my software projects). Detailed questionnaire items and corresponding response statistics can be found in Table A1 in Appendix B. The analysis results based on the UTAUT2 model, conducted using SmartPLS 4 software [35], are presented below.

4.3. Reliability and Validity Test

We used PLS algorithms to assess the reliability, internal consistency, and convergent validity of the items, with the results presented in Table 3 and Figure 10. Figure 10 presents the results of the PLS-SEM analysis, including the R-squared values for each construct (displayed within the constructs), path coefficients (shown between constructs), and outer weights (illustrating the relationships between items and their respective constructs).
Indicators are considered reflective if they are highly correlated and interchangeable, requiring an evaluation of reliability and validity [36]. Specifically, the criteria stipulate that cross-loading (CL), Cronbach’s alpha (α), and composite reliability (CR) values should be greater than 0.7, and average variance extracted (AVE) values should be greater than 0.5 [6]. As shown in Table 3, all cross-loading values exceed 0.764, and Cronbach’s alpha values are above 0.801, indicating good reliability for the measurement model. Furthermore, all composite reliability values are satisfactory, and the AVE is above 0.708, demonstrating a high degree of convergent validity.
The discriminant validity test is an essential step in model validation that ensures constructs in a model are distinct from one another. This is commonly assessed using the Fornell–Larcker criterion, which compares the diagonal values (square roots of Average Variance Extracted, or AVE) with the cross-loading values of other constructs. Discriminant validity is considered acceptable when the diagonal values are higher than the corresponding cross-loading values, indicating that each construct captures phenomena not represented by other constructs in the model [6]. Good discriminant validity signifies that the constructs have sufficient divergence and are accurately measured. As shown in Table 4, the values on the diagonal of the matrix are the highest within their respective rows and columns, demonstrating that the constructs are well discriminated from one another.

4.4. Assessment of Structural Model

We evaluated the relationships between the latent variables in the structural model. The blindfolding technique was used to assess the predictive relevance of the study model: a model is considered to have predictive relevance when the cross-validated redundancy measure “Q2” is greater than zero, with values above 0.02, 0.15, and 0.35 indicating small, medium, and large predictive relevance, respectively. The explained variance “R2” is considered “substantial” when it is greater than 0.67 [6,36]. The results are shown in Table 5.
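The blindfolding statistic referred to above is the Stone–Geisser Q2; in its cross-validated redundancy form it can be written as follows (notation ours, following standard PLS-SEM references; SSE and SSO denote the sums of squared prediction errors and of squared observations accumulated over the blindfolding rounds D):

```latex
Q^2 = 1 - \frac{\sum_{D} \mathrm{SSE}_D}{\sum_{D} \mathrm{SSO}_D}
```

A value above zero thus indicates that the model predicts the systematically omitted data points better than a naive mean-based benchmark.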
P-values were obtained through bootstrapping with two-tailed tests. Bootstrapping was conducted by repeatedly resampling the original dataset with replacement to estimate the statistical significance of the PLS path model [9,36]. The results, presented in Table 6, indicate that several internal path coefficients are statistically significant. The analysis revealed the following:
  • Gamification Usefulness significantly influenced Performance Expectancy (p = 0.000, t = 3.946) and Self-Efficacy (p = 0.000, t = 13.693), supporting H1 and H2.
  • Gamification Motivation significantly influenced Effort Expectancy (p = 0.000, t = 6.648) and Hedonic Motivation (p = 0.000, t = 28.222), supporting H3 and H4.
  • Performance Expectancy (p = 0.012, t = 2.516), Self-Efficacy (p = 0.008, t = 2.663), and Effort Expectancy (p = 0.007, t = 2.701) significantly influenced Attitude, supporting H5, H6, and H7.
  • Attitude significantly influenced Behavioral Intention (p = 0.000, t = 20.191), and Behavioral Intention significantly influenced Use Behavior (p = 0.000, t = 16.097), supporting H9 and H10.
  • Unexpectedly, H8 was not supported, as Hedonic Motivation did not significantly predict Attitude (p = 0.740, t = 0.332).
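The bootstrap procedure behind these t- and p-values can be sketched in a few lines. The sketch below uses an OLS slope on synthetic data as a stand-in for a PLS path coefficient (in the study, each resample would re-run the full PLS estimation in SmartPLS); the variable names and the 2000-resample count are illustrative assumptions:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Synthetic stand-in data for one structural path.
x = rng.normal(size=100)
y = 0.6 * x + rng.normal(scale=0.8, size=100)

def slope(x, y):
    # OLS slope; in PLS-SEM this would be the re-estimated path coefficient.
    return float(np.polyfit(x, y, 1)[0])

n_boot = 2000
boots = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, len(x), len(x))  # resample rows with replacement
    boots[i] = slope(x[idx], y[idx])

# t = original estimate / bootstrap standard error; two-tailed p via the
# normal approximation (a t-distribution with n_boot-1 df is nearly identical).
t_stat = slope(x, y) / boots.std(ddof=1)
p_val = 2 * (1 - 0.5 * (1 + erf(abs(t_stat) / sqrt(2))))
print(t_stat > 1.96, p_val < 0.05)  # True True
```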

5. Discussion

The previous sections detailed the experimental results and the analysis and validation of the structural model. In this section, these findings are synthesized to answer the stated research questions (RQs).

5.1. RQ1: Does Adding GEG as a Teaching Aid Yield Higher Learning Outcomes than Traditional Teaching Methods?

The post-test results showed that the experimental group achieved higher pass rates than the control group on all three requirements, indicating that adding GEG as a teaching aid yields better learning outcomes than the traditional teaching method.
However, the pass rate on the conflict resolution questions was not high. We believe this is because the students had few chances to practice collaborative coding in the course: no team projects were assigned, so they lacked experience in branching, merging, and resolving conflicts and therefore struggled with the most difficult questions.

5.2. RQ2: Does GEG as a Teaching Aid Have a Positive Impact on Students’ Attitude and Behavior Towards Git?

According to the supported hypotheses H1, H2, H3, and H4, the gamification elements of the GEG design positively influenced students' perceptions of learning Git. According to the supported hypotheses H5, H6, and H7, performance expectancy, effort expectancy, and self-efficacy positively influenced students' attitudes toward learning and using Git.
The supported hypotheses H9 and H10 further show that these attitudes had a positive impact on students' willingness to use Git, which in turn influenced their subsequent use behavior.
Therefore, it can be concluded that GEG as a teaching aid can positively influence students' attitudes toward learning and using Git, as well as their subsequent behaviors. Surprisingly, H8 was not supported; we suggest this is because hedonic motivation does not directly affect attitudes toward Git but instead only enhances students' motivation while they engage in learning.

5.3. RQ3: What Is the Percentage of Students Who Are Motivated to Learn with GEG? Do Students Who Are Active Learners Have Higher Learning Outcomes?

Our analysis shows that 55 students participated in Git-related teaching activities. Among these, 34 students (over 60%) maintained activity records after the experiment, indicating active engagement with the GEG system.
Due to the lack of identifiable student numbers in some post-experiment activity records, we selected 27 students with clear activity data for comparison with the entire experimental group. The post-test analysis revealed that these active learners consistently outperformed their peers across all assessed tasks, as illustrated in Figure 11. The results confirm that students who actively engaged with GEG demonstrated higher performance, reinforcing the conclusion that active participation in GEG-based learning significantly enhances learning outcomes.

5.4. RQ4: What Do Students Think Are the Advantages and Disadvantages of GEG?

We consolidated the OEQ1 and OEQ2 responses from the questionnaire, merged similar feedback, and list the strengths mentioned in the feedback:
  • The game is fun, less boring, and more motivating to learn with (42%).
  • It is easy to understand and to recognize mistakes; learning with feedback is helpful (30%).
  • The interface design makes it easy to get started, and the website is convenient and good-looking (19%).
  • The learning difficulty increases gradually, making it easier to learn (19%).
The following are some of the deficiencies identified in the feedback:
  • The game sometimes crashes or gets stuck (30%).
  • The commands are simplified, so students may use them incorrectly in practice (30%).
  • Some hints or instructions are unclear and hard to understand (30%).
In addition, one response mentioned wishing the course had more time, and another noted that some levels were too difficult (presumably levels 8 and 9).
Regarding this negative feedback, we plan to use the backend data to identify when students experience crashes and fix those situations, restore the simplified commands to the original Git commands to avoid confusing students when they use Git in practice, and strengthen the hints by adding demonstrations, icons, and descriptions to the instructions so that the game's level hints are easier to read and understand.

5.5. Future Works

While the proposed method demonstrates the potential of the gamification approach for teaching Git in academic settings, several enhancements can be explored in future studies to maximize its impact and effectiveness.

5.5.1. Incorporating Behavioral Metrics

Behavioral metrics, such as the number of levels completed, instructions followed, time spent on tasks, and other Git-related activities, can be integrated with learning analytics to provide valuable insights into students’ learning progress. This integration allows instructors to better understand individual performance, enabling targeted feedback, support, or intervention. Additionally, Git-based metrics can highlight specific topics where students need improvement, offering actionable insights to refine the learning process. Furthermore, predictive models can be developed to forecast students’ learning performance, aiding in proactive educational planning.
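Rolling raw activity logs up into per-student behavioral metrics is straightforward. The sketch below is a minimal illustration with hypothetical record and field names (not GEG's actual log schema):

```python
from collections import defaultdict

# Hypothetical activity-log records; the field names are illustrative only.
events = [
    {"student": "s01", "event": "level_complete", "seconds": 310},
    {"student": "s01", "event": "command_issued", "seconds": 12},
    {"student": "s02", "event": "level_complete", "seconds": 450},
    {"student": "s01", "event": "level_complete", "seconds": 280},
]

def summarize(events):
    """Aggregate event records into per-student behavioral metrics:
    levels completed, commands issued, and total time spent."""
    out = defaultdict(lambda: {"levels": 0, "commands": 0, "time": 0})
    for e in events:
        m = out[e["student"]]
        if e["event"] == "level_complete":
            m["levels"] += 1
        elif e["event"] == "command_issued":
            m["commands"] += 1
        m["time"] += e["seconds"]
    return dict(out)

metrics = summarize(events)
print(metrics["s01"])  # {'levels': 2, 'commands': 1, 'time': 602}
```

A dashboard or predictive model would consume exactly these per-student aggregates.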

5.5.2. Real-Time Feedback

With the current advancements in Large Language Models (LLMs), feedback and hints can be generated in real-time based on the problem and the current state of the solution. This enables immediate support in various forms, such as text guides, code hints, and sub-task suggestions. By integrating LLMs into the system with appropriate prompts, instructors can customize the type and timing of feedback to best support students. This approach ensures timely assistance and enhances the learning experience.

6. Threats to Validity

Despite the positive feedback from students on the gamified method, this research faces several threats to validity. First, the small sample size may limit the generalizability of the findings. This study collected 39 responses, representing 66% of the experimental group, constrained by the total number of students enrolled in the course that academic year. In future iterations of the course, with potentially larger enrollment, a greater sample size could improve the robustness and reliability of the results. Second, since this study employs PLS-SEM for data analysis, assumptions of linearity and multivariate normality were not tested; PLS-SEM is known to be robust against non-linearity and violations of normality. Nevertheless, the small sample size may still affect the goodness of fit of the PLS-SEM model, even though the method generally handles non-normal data distributions well [37].

7. Conclusions

The ability to use version control software is crucial for software engineers, yet it receives little attention in school education; most related research has been conducted in conjunction with version control application software, while the game-based learning approach holds considerable potential. Therefore, this study developed a serious game for Git education, designed and conducted an educational experiment to evaluate the system's educational effect, and built a research model based on the UTAUT2 model, fitting the survey data through PLS-SEM.
Based on the results of this study, we found that the group that used GEG as an educational aid achieved higher learning outcomes than the group taught by traditional lectures alone. More than half of these students were active learners, and those students achieved higher learning outcomes than the overall experimental group. In addition, all factors in the model except hedonic motivation had a significant direct effect on students' attitudes toward Git, and the gamification-related factors also had an indirect effect on these attitudes, which in turn influenced students' behavioral intentions and subsequent behaviors.
For future research, we plan to change the content of the game so that it is closer to real Git commands and to fix the problems mentioned in the feedback, such as crashes and unclear instructions. In addition, we plan to add more levels, collectively called "challenges," to give students more opportunities to practice. By integrating the parts that students find hard to understand into various contextual challenges, the game will grow from a simple introductory tool into one that serves both introduction and practice. On the gameplay side, we will design more scenarios for the points mechanism and add more achievements and badges.

Author Contributions

H.-M.C.: conceptualization, funding acquisition, methodology, interpretation of analysis results; B.-A.N.: formal analysis, writing—original draft preparation; Y.-W.C.: software, data curation, investigation; C.-R.D.: project administration, writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported in part by the National Science and Technology Council, Taiwan R.O.C., under grant no. NSTC112-2221-E-035-029-MY2.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The research team acknowledges the support of the National Science and Technology Council, Taiwan.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. GEG Design Principles

GEG was designed according to the following principles:
1. Analyze learners: Our learners are mainly second-year college students in computer science with basic programming skills, but most of them cannot use version control tools and do not have long-term experience in maintaining projects or developing large projects. Therefore, they need to be equipped with basic knowledge of version control and the ability to use version control tools.
2. Set clear teaching objectives and select appropriate gaming material: The educational goal of this study was to equip students with the ability to use Git commands, such as creating a Git repository, committing, and pushing, and to understand the reasons for using version control tools and the workflow of Git. Therefore, this study chose a simulated development scenario in which students operate a simulated Git tool in a game to accomplish the task goals we set.
3. Design teaching instructions according to teaching objectives and game content: In this study, Git instructions were designed across several levels, and students were asked to complete the tasks we set in a simulated situation. Students are provided with in-game mechanisms as learning aids, and the competitive game mechanics and rewards we designed can motivate students to participate in the learning process and increase their motivation to learn.
4. Consider teaching as the primary goal and use games as supplementary tools: In this study, GEG is taught as part of the curriculum, and Git is taught in a traditional lecture format in subsequent classes, so students who have difficulty learning can play the game for self-directed study and choose a learning style that suits them.
5. Make good use of the characteristics of computer games: Our game is a simulation game with some features of a competitive game, which provides the opportunity and motivation to try repeatedly, and the hints and instruction cards serve as tools to assist students' understanding.
6. Place students at the center of the process and help them enjoy studying: In the learning process, the game's points and achievement mechanisms encourage students to take the initiative to try various possibilities and increase their motivation to participate actively in learning, so that they can learn more about Git and how to use it.
7. Periodically assess students' learning and constantly improve teaching: Students' activities are recorded and transmitted to the server's backend, where an analytics module processes the data to extract key metrics, such as the number of levels completed, instructions followed, and time spent on tasks. This information is displayed on the instructor interface, enabling teachers to assess students' learning progress, adjust levels that may be too difficult, and add more levels to help students clarify confusing concepts. In addition, tests and surveys are conducted to understand students' learning status.

Appendix B. The Questionnaire

Table A1. Questionnaire items and corresponding response descriptive statistics.

# | Question | Mean | Std
PE1 | I learned faster with GitEducation Game | 4.1026 | 0.8824
PE2 | Using GitEducation Game has improved my learning | 4.0769 | 0.8701
EE1 | GitEducation Game is not too taxing to use | 3.8974 | 1.0462
EE2 | The features and interface of GitEducation Game are very clear and easy to understand | 3.8974 | 0.9678
SE1 | When there is an error with Git, if there is no one around to teach me, I can play the GitEducation Game to understand and solve the problem | 3.8205 | 0.9966
SE2 | When I have trouble learning Git, I can use the GitEducation Game to clarify the concepts or operations of Git if there is no one around to teach me how to do it. | 3.7949 | 1.0047
HM1 | I think learning Git through the GitEducation Game is fun | 4.1282 | 0.7320
HM2 | I think learning Git through the GitEducation Game gives me more motivation | 4.0513 | 0.7930
GM1 | When I learn with GitEducation Game, the in-game score and leaderboard system make me more motivated to participate in learning | 3.8974 | 0.9946
GM2 | The achievement system in the game gives me more motivation to learn with GitEducation Game | 3.9487 | 0.9986
GM3 | The in-game hints and tutorials give me more motivation to understand Git concepts than a pure lecture. | 4.1538 | 0.8441
GU1 | When learning Git with the GitEducation Game, the in-game hints or tutorials help me understand the Git concepts being taught in the levels. | 4.1795 | 0.7564
GU2 | When learning Git with the GitEducation Game, I can understand how to use the Git commands by completing the levels | 4.1538 | 0.7793
GU3 | When learning Git with the GitEducation Game, the visualization and mechanics of the game helped me understand the concept or process of working with Git. | 4.0256 | 0.9315
AT1 | I think that using a version control tool to control source code is a good idea | 4.3077 | 0.7662
AT2 | Studying to use a version control tool is a good idea | 4.4103 | 0.7511
AT3 | I generally favor the use of the version control tool | 4.3590 | 0.7776
AT4 | I am positive about the use of the version control tool | 4.2051 | 0.8639
BI1 | I intend to use version control tools in the future | 4.1538 | 0.9330
BI2 | I intend to use version control tools frequently | 4.0513 | 0.9445
BI3 | I intend to be a heavy user of version control tools | 4.1282 | 0.9509
UB1 | I use the version control tool frequently in all my software projects | 3.4359 | 1.3533
UB2 | I recommend my classmate to use a version control tool like me | 4.0000 | 1.0260
OEQ1 | What do you think makes GitEducation Game appealing to you in terms of teaching or gaming? Or what are the learning benefits for you? | X | X
OEQ2 | What do you think are the shortcomings of the GitEducation Game in terms of teaching or gaming? Or what is missing in teaching? | X | X

References

  1. Haaranen, L.; Lehtinen, T. Teaching Git on the Side—Version Control System as a Course Platform. In Proceedings of the 2015 ACM Conference on Innovation and Technology in Computer Science Education (ITiCSE), New York, NY, USA, 4–8 July 2015; pp. 87–92.
  2. Zichermann, G.; Linder, J. Game-Based Marketing: Inspire Customer Loyalty Through Rewards, Challenges, and Contests; John Wiley & Sons: Hoboken, NJ, USA, 2010.
  3. von Wangenheim, C.G.; Savi, R.; Borgatto, A.F. SCRUMIA—An Educational Game for Teaching SCRUM in Computing Courses. J. Syst. Softw. 2013, 86, 2675–2687.
  4. Git Community. About Git. Available online: https://git-scm.com/about (accessed on 5 January 2024).
  5. Unity Technologies. Unity. Available online: https://unity.com/ (accessed on 5 January 2024).
  6. Wong, K.K.-K. Partial Least Squares Structural Equation Modeling (PLS-SEM) Techniques Using SmartPLS. Mark. Bull. 2013, 24, 1–32.
  7. Huang, F.-H. Adapting UTAUT2 to Assess User Acceptance of an E-Scooter Virtual Reality Service. Virtual Real. 2020, 24, 635–643.
  8. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478.
  9. Uskov, A.; Sekar, B. Serious Games, Gamification and Game Engines to Support Framework Activities in Engineering: Case Studies, Analysis, Classifications and Outcomes. In Proceedings of the IEEE International Conference on Electro/Information Technology, Milwaukee, WI, USA, 5–7 June 2014; pp. 618–623.
  10. Fleming, T.M.; Cheek, C.; Merry, S.N.; Thabrew, H.; Bridgman, H.; Stasiak, K.; Shepherd, M.; Perry, Y.; Hetrick, S. Serious Games for the Treatment or Prevention of Depression: A Systematic Review. Rev. Psicopatol. Y Psicol. Clín. 2014, 19, 227–242.
  11. Deterding, S.; Dixon, D.; Khaled, R.; Nacke, L. From Game Design Elements to Gamefulness: Defining "Gamification". In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, Tampere, Finland, 28–30 September 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 9–15.
  12. Fleming, T.M.; Bavin, L.; Stasiak, K.; Hermansson-Webb, E.; Merry, S.N.; Cheek, C.; Lucassen, M.; Lau, H.M.; Pollmuller, B.; Hetrick, S. Serious Games and Gamification for Mental Health: Current Status and Promising Directions. Front. Psychiatry 2017, 7, 215.
  13. Long, B.; Simson, J.; Buxó-Lugo, A.; Watson, D.G.; Mehr, S.A. How Games Can Make Behavioural Science Better. Nature 2023, 613, 433–436.
  14. Burke, J.W.; McNeill, M.D.J.; Charles, D.K.; Morrow, P.J.; Crosbie, J.H.; McDonough, S.M. Optimising Engagement for Stroke Rehabilitation Using Serious Games. Vis. Comput. 2009, 25, 1085–1099.
  15. Chatham, R.E. Games for Training. Commun. ACM 2007, 50, 36–43.
  16. Mayer, I.S. Playful Organisations & Learning Systems; NHTV Colin: Breda, The Netherlands, 2016.
  17. Udeozor, C.; Toyoda, R.; Abegão, F.R.; Glassey, J. Digital Games in Engineering Education: Systematic Review and Future Trends. Eur. J. Eng. Educ. 2023, 48, 321–339.
  18. Baker, A.; Navarro, E.O.; van der Hoek, A. An Experimental Card Game for Teaching Software Engineering Processes. J. Syst. Softw. 2005, 75, 3–16.
  19. Gordillo, A.; López-Fernández, D.; Tovar, E. Comparing the Effectiveness of Video-Based Learning and Game-Based Learning Using Teacher-Authored Video Games for Online Software Engineering Education. IEEE Trans. Educ. 2022, 65, 524–532.
  20. Gurbuz, S.C.; Celik, M. Serious Games in Future Skills Development: A Systematic Review of the Design Approaches. Comput. Appl. Eng. Educ. 2022, 30, 1591–1612.
  21. Kelleher, J. Employing Git in the Classroom. In Proceedings of the 2014 World Congress on Computer Applications and Information Systems (WCCAIS), Hammamet, Tunisia, 17–19 January 2014; IEEE: New York, NY, USA, 2014; pp. 1–4.
  22. Radermacher, A.; Walia, G. Gaps between Industry Expectations and the Abilities of Graduates. In Proceedings of the 44th ACM Technical Symposium on Computer Science Education, Denver, CO, USA, 6–9 March 2013; Association for Computing Machinery: New York, NY, USA, 2013; pp. 525–530.
  23. Eraslan, S.; Kopec-Harding, K.; Jay, C.; Embury, S.M.; Haines, R.; Ríos, J.C.C.; Crowther, P. Integrating GitLab Metrics into Coursework Consultation Sessions in a Software Engineering Course. J. Syst. Softw. 2020, 167, 110613.
  24. Granić, A.; Marangunić, N. Technology Acceptance Model in Educational Context: A Systematic Literature Review. Br. J. Educ. Technol. 2019, 50, 2572–2593.
  25. Venkatesh, V.; Thong, J.Y.L.; Chan, F.K.Y.; Hu, P.J.-H.; Brown, S.A. Extending the Two-Stage Information Systems Continuance Model: Incorporating UTAUT Predictors and the Role of Context. Inf. Syst. J. 2011, 21, 527–555.
  26. Tamilmani, K.; Rana, N.P.; Dwivedi, Y.K. Consumer Acceptance and Use of Information Technology: A Meta-Analytic Evaluation of UTAUT2. Inf. Syst. Front. 2021, 23, 987–1005.
  27. Baptista, G.; Oliveira, T. Understanding Mobile Banking: The Unified Theory of Acceptance and Use of Technology Combined with Cultural Moderators. Comput. Hum. Behav. 2015, 50, 418–430.
  28. Liu, E.Z.F. Avoiding Internet Addiction When Integrating Digital Games into Teaching. Soc. Behav. Personal. Int. J. 2011, 39, 1325–1335.
  29. Anderson, L.W.; Krathwohl, D.R.; Airasian, P.W.; Cruikshank, K.A.; Mayer, R.E.; Pintrich, P.R.; Raths, J.; Wittrock, M.C. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Abridged Edition; Longman: Harlow, UK, 2000; ISBN 978-0801319037.
  30. Lawrance, J.; Jung, S.; Wiseman, C. Git on the Cloud in the Classroom. In Proceedings of the 44th ACM Technical Symposium on Computer Science Education—SIGCSE'13, Denver, CO, USA, 6–9 March 2013; ACM Press: New York, NY, USA, 2013; p. 639.
  31. Seaborn, K.; Fels, D.I. Gamification in Theory and Action: A Survey. Int. J. Hum. Comput. Stud. 2015, 74, 14–31.
  32. Star, K. Gamification, Interdependence, and the Moderating Effect of Personality on Performance; Coventry University: Coventry, UK, 2015.
  33. Hanus, M.D.; Fox, J. Assessing the Effects of Gamification in the Classroom: A Longitudinal Study on Intrinsic Motivation, Social Comparison, Satisfaction, Effort, and Academic Performance. Comput. Educ. 2015, 80, 152–161.
  34. Hamari, J.; Koivisto, J.; Sarsa, H. Does Gamification Work?—A Literature Review of Empirical Studies on Gamification. In Proceedings of the 47th Annual Hawaii International Conference on System Sciences, Waikoloa, HI, USA, 6–9 January 2014; pp. 3025–3034.
  35. Ringle, C.M.; Wende, S.; Becker, J.-M. SmartPLS 4; SmartPLS GmbH: Bönningstedt, Germany, 2024.
  36. Hair, J.F., Jr.; Ringle, C.M.; Sarstedt, M. Partial Least Squares Structural Equation Modeling: Rigorous Applications, Better Results and Higher Acceptance. Long Range Plan. Int. J. Strateg. Manag. 2013, 46, 1–12.
  37. Ainur, A.K.; Sayang, M.D.; Jannoo, Z.; Yap, B.W. Sample Size and Non-Normality Effects on Goodness of Fit Measures in Structural Equation Models. Pertanika J. Sci. Technol. 2017, 25, 575–586.
Figure 1. Start menu: Students must register and log in before playing the game.
Figure 2. Chapter menu: A chapter represents a Git concept and instruction.
Figure 3. The six blocks of the game.
Figure 4. Level Passed.
Figure 5. Getting achievement.
Figure 6. Achievement viewer.
Figure 7. The experiment procedure.
Figure 8. The research model.
Figure 9. Post-test passing rate.
Figure 10. PLS-SEM analysis results of the research model calculated using SmartPLS.
Figure 11. Post-test passing rate (compared between active learners and the overall experimental group).
Table 1. Constructs used in the research framework.

ID | Construct | Description
1 | PE (Performance Expectancy) | The degree to which students believe the system will enhance their performance in learning Git.
2 | EE (Effort Expectancy) | The perceived ease of use of the system in the context of learning Git.
3 | SE (Self-Efficacy) | Students' belief in their ability to successfully use the system to learn and apply Git.
4 | HM (Hedonic Motivation) | The enjoyment or pleasure derived from using the system.
5 | GM (Gamification Motivation) | The motivation generated by gamification features, such as rewards or challenges, to engage with the system.
6 | GU (Gamification Usefulness) | The perceived usefulness of the gamification elements in supporting the learning of Git.
7 | BI (Behavioral Intention) | Students' intention to continue using the system in the future.
8 | UB (Use Behavior) | The actual usage of the system, including frequency and duration.
Table 2. Pre-test results for experiment and control groups.

Group | Count | Mean | Median | Minimum | Maximum | Std. Dev.
Control group | 54 | 72.5926 | 70 | 40 | 100 | 15.6838
Experimental group | 59 | 76.2712 | 80 | 30 | 100 | 17.0090
Table 3. Internal consistency, item reliability, and convergent validity.

Construct | Items (CL > 0.7) | α (>0.7) | CR (>0.7) | AVE (>0.5)
Gamification Usefulness | GU1 0.894, GU2 0.988, GU3 0.904 | 0.920 | 0.950 | 0.864
Gamification Motivation | GM1 0.875, GM2 0.898, GM3 0.764 | 0.801 | 0.879 | 0.708
Performance Expectancy | PE1 0.996, PE2 0.996 | 0.992 | 0.996 | 0.992
Effort Expectancy | EE1 0.927, EE2 0.964 | 0.886 | 0.944 | 0.895
Self-Efficacy | SE1 0.983, SE2 0.984 | 0.966 | 0.983 | 0.967
Hedonic Motivation | HM1 0.954, HM2 0.954 | 0.901 | 0.953 | 0.910
Attitude | AT1 0.893, AT2 0.920, AT3 0.943, AT4 0.929 | 0.941 | 0.957 | 0.849
Behavioral Intention | BI1 0.973, BI2 0.973, BI3 0.977 | 0.973 | 0.983 | 0.950
Use Behavior | UB1 0.938, UB2 0.937 | 0.862 | 0.936 | 0.879
Notes: CL—cross-loadings; α—Cronbach's alpha; CR—composite reliability; AVE—average variance extracted.
Table 4. Results of the Fornell–Larcker criterion analysis (discriminant validity test).

 | UB | AT | BI | EE | GM | GU | HM | PE | SE
UB | 0.938
AT | 0.793 | 0.921
BI | 0.827 | 0.888 | 0.974
EE | 0.371 | 0.345 | 0.411 | 0.946
GM | 0.525 | 0.483 | 0.500 | 0.632 | 0.841
GU | 0.442 | 0.393 | 0.413 | 0.605 | 0.765 | 0.930
HM | 0.421 | 0.461 | 0.334 | 0.584 | 0.853 | 0.767 | 0.954
PE | 0.425 | 0.492 | 0.447 | 0.848 | 0.617 | 0.571 | 0.640 | 0.996
SE | 0.677 | 0.568 | 0.667 | 0.656 | 0.797 | 0.769 | 0.574 | 0.541 | 0.984
Notes: UB—use behavior; AT—attitude; BI—behavior intention; EE—effort expectancy; GM—gamification motivation; GU—gamification usefulness; HM—hedonic motivation; PE—performance expectancy; SE—self-efficacy.
Table 5. Explained variance (R2) and predictive relevance (Q2).

Dependent Variable | R2 (>0.5) | Q2 (>0) | Result
Performance Expectancy | 0.327 | 0.306 | weak
Self-Efficacy | 0.592 | 0.670 | moderate
Effort Expectancy | 0.400 | 0.314 | weak
Hedonic Motivation | 0.727 | 0.583 | moderate
Attitude | 0.480 | 0.378 | weak
Behavioral Intention | 0.789 | 0.741 | substantial
Use Behavior | 0.684 | 0.593 | moderate
Table 6. Results.

H | Relation | Original Sample (O) | Sample Mean (M) | Standard Deviation (SD) | T Statistics (>1.96) | P Values (<0.05)
H1 | GU→PE | 0.571 | 0.567 | 0.145 | 3.946 | 0.000
H2 | GU→SE | 0.769 | 0.776 | 0.056 | 13.693 | 0.000
H3 | GM→EE | 0.632 | 0.638 | 0.095 | 6.648 | 0.000
H4 | GM→HM | 0.853 | 0.869 | 0.030 | 28.222 | 0.000
H5 | PE→AT | 0.705 | 0.676 | 0.280 | 2.516 | 0.012
H6 | SE→AT | 0.594 | 0.617 | 0.223 | 2.663 | 0.008
H7 | EE→AT | −0.681 | −0.670 | 0.252 | 2.701 | 0.007
H8 | HM→AT | 0.066 | 0.070 | 0.200 | 0.332 | 0.740 *
H9 | AT→BI | 0.888 | 0.890 | 0.044 | 20.191 | 0.000
H10 | BI→UB | 0.827 | 0.890 | 0.044 | 16.097 | 0.000
Notes: UB—use behavior; AT—attitude; BI—behavior intention; EE—effort expectancy; GM—gamification motivation; GU—gamification usefulness; HM—hedonic motivation; PE—performance expectancy; SE—self-efficacy; H—hypothesis; *—Non-support.

Share and Cite

Chen, H.-M.; Nguyen, B.-A.; Chang, Y.-W.; Dow, C.-R. A Gamified Method for Teaching Version Control Concepts in Programming Courses Using the Git Education Game. Electronics 2024, 13, 4956. https://doi.org/10.3390/electronics13244956