Abstract
Prior research acknowledges the need to further develop undergraduate mathematics instruction. The aim of this study is to investigate the relationship between instructional design and quality of learning. This quantitative study approaches the learning environment by comparing students’ approaches to learning, self-efficacy, and experiences of the teaching-learning environment in two undergraduate mathematics courses using different pedagogical approaches. The first course functioned within a traditional lecture-based framework with the inclusion of student-centred elements, and the second course was implemented with Extreme Apprenticeship, a novel student-centred teaching method. The analysis is based on the same cohort of students in these two contexts (N = 91). Students were clustered based on their deep and surface approaches to learning and three clusters were identified: students applying a deep approach, students applying a surface approach, and students applying a context-sensitive surface approach. The results show that the more student-centred course design succeeded in supporting more favourable approaches to learning, higher self-efficacy levels, and more positive experiences of the teaching-learning environment. In addition, all three clusters benefited from the more student-centred course design, with students applying a context-sensitive surface approach benefiting the most. Overall, the results suggest that it is possible to promote the quality of university mathematics learning with instructional designs that, besides content, take a holistic approach to the learning environment.
Introduction
Traditionally in Finland, as in many other European countries, the main emphasis of university mathematics instruction has been on content delivery, and instruction relies heavily on lectures built around axioms, definitions, theorems and their proofs. During the last few decades, new student-centred instructional practices have emerged that, in contrast to students passively receiving information from the lecturer, emphasise students’ own activity, responsibility and independence in learning (Baeten et al. 2010). Accordingly, many current researchers support more student-centred mathematics instruction. For example, Freudenthal (1991) suggests that students should have the opportunity to reinvent mathematics for themselves, and Schoenfeld (2007) stands behind mathematics curricula that focus on sense making. Similarly, Hmelo-Silver (2004) promotes problem-based learning (PBL) by stating that “PBL offers the potential to help students become reflective and flexible thinkers who can use knowledge to take action”. In an undergraduate mathematics education context, Rasmussen and Kwon (2007) argue for an inquiry-oriented approach, as it promotes conceptual understanding and positively influences students’ beliefs about mathematics. In addition, Lesseig and Krouss (2017) report encouraging results from using flipped learning in collegiate mathematics. Konstantinou-Katzi et al. (2013) call for an overall approach to university mathematics instruction that is adaptive to individual students’ needs, as such an approach has a positive impact on learning and on attitudes towards mathematics. However, implementing this kind of adaptive instruction is challenging on a mass course scale, as recognised by Konstantinou-Katzi et al. (2013).
In line with global trends, much effort has been put into developing the educational setting at the University of Helsinki’s Department of Mathematics and Statistics. The department’s teaching has undergone major changes during recent years as some of the traditional lecture-based courses have been developed in a more interactive direction (see e.g. Oikkonen 2009). In addition, many of the undergraduate-level mass-lecture courses are now using the Extreme Apprenticeship method (XA; Rämö et al. 2015) as their pedagogical framework.
Baeten et al. (2010) acknowledge multiple ways to implement student-centred instruction in practice. However, all of the different types of student-centred instruction share the aim of fostering deep learning and understanding (Baeten et al. 2010). The aforementioned ways of designing mathematics instruction have a similar aim to move from rote memorisation and manipulative techniques towards holistic mathematical understanding. This kind of understanding can be expressed through a deep approach to learning (Baeten et al. 2010). Therefore, the aim of this paper is to articulate the role of different instructional designs in the quality of learning in undergraduate mathematics from the perspective of the Student Approaches to Learning (SAL) tradition (Entwistle and Tait 1990; Marton and Säljö 1976). According to Crawford et al. (1998), mathematics students’ conceptions of mathematics are linked to their approaches to learning and experiences of the learning environment. Further, Crawford et al. (1998) link fragmented conceptions of mathematics to a surface approach to learning, and a cohesive conception of mathematics to a deep approach to learning, positive experiences of the learning environment and higher achievement in university mathematics studies. As a consequence, they argue that learning and teaching of mathematics are not independent activities; rather, they call for research that, besides the presentation of content, focuses on the learning environment as a whole (see also Crawford et al. 1994). As the need to develop new instructional practices in university mathematics education remains, it is important to identify the characteristics of a learning environment that positively contribute to the quality of students’ mathematical understanding. In essence, the present study is motivated by the need to further identify effective instructional designs and transfer knowledge from research to teaching practices.
Theoretical Framework
The theoretical framework of the study is built on the intertwined relationships between approaches to learning, self-efficacy, and students’ experiences of the teaching-learning environment. The following subsections describe these connections in more detail and elaborate their links to university mathematics education.
Approaches to Learning
Marton and Säljö (1976) first distinguished between two qualitatively different ways students approach learning: deep-level and surface-level processing. They later reformulated the names of these categories as a deep approach and surface approach to learning, in order to emphasise the involvement of both process and intention (Marton and Säljö 1984). A deep approach to learning refers to a student’s intention to maximise understanding by focusing on meaning and relating ideas to existing knowledge. By contrast, students applying a surface approach focus on rote learning and memorising, and they aim at reproduction rather than understanding (Biggs 1991; Coertjens et al. 2016; Marton and Säljö 1984). In a recent paper, Lindblom-Ylänne et al. (2018) state that a surface approach constitutes unreflective studying and experience of a fragmented knowledge base.
Biggs (1987) later added a third approach to learning, an achieving approach, which is more commonly referred to as organised studying. Organised studying refers to students organising and managing the context of studying rather than the learning itself (Biggs 1987; Coertjens et al. 2016; Marton and Säljö 1984). Central elements of organised studying also include effort and time management (see e.g. Hailikari and Parpala 2014). Therefore, the organised approach is sometimes viewed as an approach to studying rather than an approach to learning.
Several studies show that a deep approach to learning has a positive effect on learning outcomes (see e.g. Marton and Säljö 1976; Trigwell and Prosser 1991). As a result, a deep approach is the most valued of the three approaches to learning. In the context of undergraduate mathematics education, Maciejewski and Merchant (2016) argue that proof-based mathematics courses require a deep approach to learning. In addition, Murphy (2017) associates a deep approach to learning with higher mathematical performance. By contrast, Maciejewski and Merchant (2016) find a strong negative correlation between a surface approach and course grades, implying that a surface approach leads to lower achievement. Therefore, deep and surface approaches to learning play a significant role in the quality of learning in university mathematics.
Students have tendencies towards certain approaches to learning when entering a learning environment. However, there is mixed evidence on the stability of these tendencies (Lindblom-Ylänne et al. 2013). For example, Varunki et al. (2017) suggest that multiple factors influence the stability of students’ approaches to learning and that the importance of these factors greatly depends on the individual student. In turn, some studies suggest that students who apply a deep approach to learning are more stable in their approaches to learning compared to students applying a surface approach (Coertjens et al. 2016; Wilson and Fowler 2005). By contrast, Wilson and Fowler (2005) suggest that students who generally apply a surface approach can sometimes move towards a deep approach in a student-centred learning environment. Similarly, Baeten et al. (2010) argue that student-centred learning environments together with possibilities for independent study promote a deep approach to learning. The extent to which instructional practices can support the more favourable approaches to learning is yet to be clarified. However, student satisfaction with the quality of teaching and the overall characteristics of the learning environment is necessary to promote a deep approach to learning (Baeten et al. 2010).
Teaching-Learning Environment
Researchers agree that students’ approaches to learning are related to their experiences of the teaching-learning environment, which consists of the instructional practices, including interactions with teachers and other students. At the same time, researchers acknowledge the challenge of enhancing a deep approach to learning through traditional instruction (Baeten et al. 2013; Marton and Säljö 1984). Thus, it is tempting to hypothesise that more student-centred learning environments support the implementation of a deep approach to learning. However, although the overall findings support this claim, the relationship is not so straightforward, as is well illustrated in a literature review by Baeten et al. (2010). In fact, a later study by Baeten et al. (2013) demonstrated that a student-centred learning environment can sometimes push students towards a surface approach.
In general, students who apply a deep approach to learning perceive the teaching-learning environment more positively than students applying a surface approach (Baeten et al. 2010; Parpala et al. 2010). Moreover, a deep approach is positively correlated, and a surface approach negatively correlated, with experiences of the amount of feedback and support given to the students, course alignment, relevance and peer support (Baeten et al. 2010; Coertjens et al. 2016; Entwistle and Tait 1990; Parpala et al. 2010). In addition, a surface approach to learning is, on one hand, related to experiencing a lack of challenge (Coertjens et al. 2016) and, on the other hand, to the perception of an overly heavy workload (Baeten et al. 2010; Entwistle and Tait 1990).
Self-Efficacy
Self-efficacy is a person’s belief in their ability to perform a specific task in a specific context, and such beliefs determine how people feel, think, motivate themselves and behave (Bandura 1994). These beliefs also play a crucial role in learning mathematics, as self-efficacy is related to academic achievement (Pajares 1996; Peters 2013). In terms of instructional design, Peters (2013) demonstrates that students’ self-efficacy is higher in teacher-centred classrooms compared to learner-centred classrooms. By contrast, Kogan and Laursen (2014) argue that female students tend to obtain affective gain from student-centred courses, as their confidence in their mathematical abilities is higher than that of male students. In light of these findings, Zimmerman’s (1995) suggestion to approach educational development from the perspective of efficacy beliefs seems relevant.
Aims and Research Questions
This paper focuses on the development of university mathematics instruction. The study aims to identify the characteristics of an instructional design that promotes quality of learning. The research is performed by quantitatively comparing the same cohort of students’ approaches to learning, self-efficacy, and experiences of the teaching-learning environment in the context of two mathematics courses at the Department of Mathematics and Statistics, University of Helsinki. Besides content, the courses also differ in their pedagogical approaches.
The paper begins with a comparison of students’ course-level responses and addresses the first research question:
1. How do different instructional designs relate to individual students’ approaches to learning, self-efficacy, and experiences of the teaching-learning environment?
The paper then continues with a student-level analysis, the aim being to identify groups of students with different approaches to learning. These groups are then investigated in more detail within and between the two course contexts. Here, the second and third research questions are addressed:
2. What approaches to learning do students use and how do these approaches vary between the two course contexts?
3. How do students’ different approaches to learning relate to their experiences of the two learning environments and their self-efficacy? And how do these experiences differ between the two course contexts?
As stated in the previous section, a large body of research demonstrates that approaches to learning, experiences of the learning environment and efficacy beliefs are intertwined. Therefore, as Heikkilä and Lonka (2006) argue, it is important to develop instructional practices that support the development of all these aspects of learning. The aim of the study is thus to identify student groups that act differently in the two learning environments and to lay the ground for future research explaining the characteristics of these student groups. On a larger scale, the study aims to further develop effective instructional practices in university mathematics education.
Methods
This study approaches the research questions through a quantitative analysis of students’ approaches to learning, self-efficacy, and experiences of the teaching-learning environment in the context of two different undergraduate mathematics courses. The following subsections describe the context and the procedures of data collection and analysis in greater detail.
Context
The study was conducted in the mathematics department of a research-intensive university in Finland. The data were collected from two different courses usually taken by students during the first semester of their university mathematics studies. In Finland, the first university mathematics courses are already proof-based courses. The two courses run parallel and both are six-week, five-credit (European Credit Transfer and Accumulation System, ECTS) courses with approximately 200 students. In addition to mathematics majors, the courses are taken by many students studying mathematics as their minor subject; these students are usually majoring in physics, computer science, chemistry or education.
The two courses were ideal for quantitative research due to the large number of students taking both courses at the same time. However, the main interest in studying the courses arose from their different pedagogical approaches. In practice, the main differences between the courses centred on the role of lectures, the design of the tasks and the form of support given to the students by the teaching assistants. These are summarised in Table 1.
Course A is an analysis course that is part of students’ basic studies, which means it is compulsory and students take it at the very beginning of their university mathematics studies. It should be noted that the course is an analysis course rather than a calculus course, as exact definitions and proof construction are emphasised. The main content of the course includes the limit of a function, continuity, the derivative, and its applications. To further clarify the expected level of mathematical competence, the most difficult (as defined by the lecturer of the course) course exam question is provided below:
Consider the function f : ℝ → ℝ, defined via \( f(x)=\frac{x^2\sin\left(e^{x^2}\right)}{\left(x^4+1\right)e^{\sin x}} \). Show that there exists a real number a ∈ ℝ, such that f(x) ≤ f(a) holds for all x ∈ ℝ. NB: It is not useful to consider the derivative here!
Course A functions within a traditional lecture-based course setting; however, for over a decade the course has been developed in a more interactive direction in order to respond to students’ challenges at the beginning of their university mathematics studies. The lecturer of the course is acknowledged for their excellent teaching by both students and the university. The teaching of the course focuses on the interplay between the human and the formal sides of mathematics. In practice, the 4 h of lectures every week focus on the main topics of the course and encourage students’ active participation, especially when unravelling the mathematical thinking behind the formal proofs. In addition to lectures, there are two kinds of small group sessions. One is allocated to the problems students have solved prior to the class. Usually, in the beginning of this session, students discuss their solutions in small groups and then take turns in writing and explaining them on the blackboard. The other small group session is allocated to solving problems during the session together with other students and with the help of a teaching assistant. The course uses an online platform for publishing weekly problem sheets and their model solutions. It also serves as a platform for discussion. In addition, an anonymous online chat platform is used actively for discussion during and outside of lectures.
Course XA is a linear algebra course. This proof-based course is part of students’ intermediate studies, which means that the course is compulsory for mathematics majors and students studying mathematics as an intermediate minor subject (60 ECTS). Students take the course at the beginning of their university mathematics studies. The main content of the course includes general vector spaces, subspaces, linear mappings and scalar products. Besides the mathematical content, the course emphasises such skills as reading mathematical texts, oral and written communication and proof construction. To further clarify the expected level of mathematical competence, the most difficult (as defined by the lecturer of the course) course exam question is provided below. Note that the theorem students are asked to prove is new to them, as it has not been covered during the course.
Let V be a vector space with basis \( \left(\overline{v}_1,\dots,\overline{v}_n\right) \). Assume that L : V → W is a linear mapping. Show that if \( \left(L\left(\overline{v}_1\right),\dots,L\left(\overline{v}_n\right)\right) \) is a basis, then L is an isomorphism.
Similarly to course A, the lecturer of course XA is acknowledged for their excellent teaching by both students and the university. The course is implemented using the Extreme Apprenticeship (XA) method. The XA method is a student-centred educational method developed at the Department of Computer Science and the Department of Mathematics and Statistics, University of Helsinki. The method is constructed upon the ancient process of apprenticeship, where a skilled master supervises a novice apprentice, and its theoretical background lies in a situated view of learning and in Cognitive Apprenticeship (Collins et al. 1991; Rämö et al. 2015). The method emphasises learning by doing, instructional scaffolding and continuous bi-directional feedback, and the core idea is to support students in becoming experts in their field by having them participate in activities that resemble those carried out by professionals (Rämö et al. 2015; Vihavainen et al. 2011).
In the XA method, the teaching of the course consists of weekly problems, course material, guidance, and 3 h of lectures per week. The method uses a flipped learning approach, as students start studying a new topic by solving a set of problems. These topics have not yet been discussed during the lectures, so students need to read the course material in order to complete the tasks. However, the tasks are designed to be approachable, and teaching assistants specific to this course help the students in solving the problems. The teaching assistants guide the students in a learning space (see Fig. 1) in the middle of the department on a drop-in basis approximately 6 h a day. Student collaboration is encouraged in the learning space. Every week students return written solutions to the problems, of which a few are selected for inspection and written feedback. The feedback focuses on the logical structure of the solutions, but readability and language are also evaluated, and students have the possibility to improve and resubmit their solutions.
Students come to lectures prepared, as they have completed pre-lecture tasks. Lectures focus on active interaction, as various small group activities are implemented and students’ active participation is encouraged. The aim of the lectures is to form links between the topics and enhance holistic understanding. An anonymous online platform is used for students’ questions and discussions on course content and practices. In addition, the platform is used actively during lectures to prompt questions and to enhance students’ interactions. After the lectures, students face more challenging problems on the topic.
During the course, the teaching assistants undergo training by taking part in weekly meetings in which mathematical and pedagogical topics are discussed. This enhances continuous bi-directional feedback between the students, the teaching assistants and the lecturer on the progress of the students. Consequently, the level of the lectures and the weekly problems can be adjusted based on students’ needs. Regular and close interaction between students, teaching assistants and the lecturer helps students establish relationships within communities of practice, which enhances the students’ integration into those communities (Lave and Wenger 1991). For a more detailed description of the implementation of the XA method, see Rämö et al. (submitted).
Data Collection
The data consist of quantitative data collected from courses A and XA. Students on both courses voluntarily answered an electronic questionnaire at the end of the courses in autumn 2016. The questionnaire included items measuring students’ approaches to learning, self-efficacy, and experiences of the teaching-learning environment. In addition, data collected during the courses (number of completed tasks, participation and course exam results) were merged with the questionnaire data.
The Instrument
The HowULearn instrument (see Appendix 1) was used to collect quantitative five-point Likert scale data (1 = completely disagree, 5 = completely agree). HowULearn is a research-based instrument developed and coordinated by the Centre for University Teaching and Learning, University of Helsinki. The instrument is widely used at the University of Helsinki for feedback, research and instruction-enhancement purposes (Parpala and Lindblom-Ylänne 2012).
The instrument consists of multiple scales, of which three—approaches to learning, experiences of the teaching-learning environment and self-efficacy—are included in this study. Twelve items measure students’ approaches to learning, with four items measuring each approach (deep, surface and organised). The scale for experiences of the teaching-learning environment is comprised of 14 items measuring constructive feedback, alignment, requirements, interest and relevance, and peer support. Constructive feedback refers to receiving feedback that helps students to develop study skills, make connections to existing knowledge, and clarify possible misunderstandings. Alignment refers to the clarity of course objectives and the way these objectives are aligned with the instruction provided. Requirements refer to the clarity of the course work objectives and the way the course work relates to the learning objectives of the course. Interest and relevance refer to enjoyment and relevance of studying the course content, and peer support refers to students interacting and helping each other in learning the course material.
The approaches to learning and students’ experiences of the teaching-learning environment scales were originally derived from the Experiences of Teaching and Learning Questionnaire (ETLQ; Entwistle et al. 2003) and the Learning Process Questionnaire (R-LPQ-2F; Kember et al. 2004), and they have been modified and validated in the Finnish context by Parpala et al. (2013). In addition, five items measure self-efficacy for learning and performance. These items were slightly modified for HowULearn from Pintrich et al. (1991).
Participants
Students in both course A and course XA answered the HowULearn questionnaire on a course level at the end of the courses. The courses ran parallel, but the questionnaire was open at different time intervals for the two courses to increase the validity of the students’ self-reported results. The response rates were 83% for course A and 66% for course XA. After deleting students who declined to give permission to use their data for research purposes or who had missing data entries, the final sample consisted of 91 students who attended both courses.
An equal number of male and female students participated in the study. The majority of the participants were first-year university students (65%) and majoring in mathematics (63%). The participants’ background information is presented in Table 2. Students in the ‘other subjects’ category were predominantly majoring in education. Students with more than a year of university studies were often studying mathematics as their minor subject, implying that they had completed courses in their major subject prior to these mathematics studies.
Data Analysis
Data analysis was conducted using IBM SPSS Statistics 24 and Amos 24. As the instrument has been thoroughly validated and widely used in the Finnish higher education context, the factor structure was analysed with confirmatory factor analysis (maximum likelihood). Factor analyses were computed for both courses individually, but as no major differences emerged, the results reported in Table 3 are based on a factor analysis performed with the data merged into one file (N = 182).
The model validity was inspected with both absolute fit (χ2 and RMSEA) and relative fit (CFI) indices. The chi-squared test (χ2) indicates the difference between observed and expected covariance matrices, with values closer to zero indicating a better fit. However, the chi-squared test is sensitive to sample size, and with larger samples, such as the sample in the present study, it can reject an appropriate fit. In fact, the chi-squared test returned statistically significant results (p < 0.001) for every factor, implying a poor fit for the model. Different fit indices were therefore used for a more holistic view of the model fit. To avoid issues with sample size, the root mean square error of approximation (RMSEA), which analyses the discrepancy between the hypothesised model and the population covariance matrix, was employed. Here, values range from 0 to 1, with smaller values indicating a better fit. In addition, the comparative fit index (CFI) was used to analyse the discrepancy between the data and the hypothesised model. Here, values range from 0 to 1, with higher values indicating a better fit. Research suggests that RMSEA values close to or below 0.06 and CFI values close to or above 0.95 are acceptable (Hu and Bentler 1999). The CFI and RMSEA values for the learning approaches, experiences of the teaching-learning environment and self-efficacy scales were within the acceptable range and therefore support the validity of the HowULearn instrument.
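To make these indices concrete, the sketch below computes RMSEA and CFI from chi-squared statistics using one standard formulation; the numerical inputs are hypothetical and are not the values obtained in this study (the actual analysis was run in Amos).

```python
import math


def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation: sqrt(max(chi2 - df, 0) / (df * (n - 1))).

    n is the sample size; values closer to 0 indicate a better fit.
    """
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))


def cfi(chi2_model: float, df_model: int,
        chi2_baseline: float, df_baseline: int) -> float:
    """Comparative fit index relative to the baseline (independence) model.

    Values closer to 1 indicate a better fit.
    """
    d_model = max(chi2_model - df_model, 0.0)
    d_baseline = max(chi2_baseline - df_baseline, 0.0)
    return 1.0 - d_model / d_baseline if d_baseline > 0 else 1.0


# Hypothetical chi-squared values, for illustration only.
print(f"RMSEA = {rmsea(chi2=95.0, df=51, n=182):.3f}")  # about 0.069
print(f"CFI   = {cfi(95.0, 51, 820.0, 66):.3f}")        # about 0.942
```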
Based on the factor analysis, three items were excluded from different factors due to low communalities, mixed factor loadings, and deviant skewness and kurtoses. Every factor was then checked for internal consistency with Cronbach’s Alpha. The reliability levels can be considered acceptable.
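As a brief illustration of the reliability check, the following sketch computes Cronbach's alpha for one hypothetical four-item factor; the responses are simulated and serve only to demonstrate the formula, not to reproduce the study's data.

```python
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of the summed score)
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


# Simulated Likert responses: 91 students x 4 items of one factor.
rng = np.random.default_rng(2)
tendency = rng.integers(2, 5, size=(91, 1))  # shared latent tendency per student
responses = np.clip(tendency + rng.integers(-1, 2, size=(91, 4)), 1, 5)
print(f"alpha = {cronbach_alpha(responses.astype(float)):.2f}")
```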
As the study follows a repeated measures design, the data were analysed using a two-tailed paired samples t-test. For every factor, the means, standard deviations, mean differences with p-values from the paired samples t-test, and Cohen’s effect size d are presented. The boundaries used were 0.2, 0.5 and 0.8 for small, medium and large effect sizes, respectively (Cohen 1992).
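A minimal sketch of this comparison is given below, assuming hypothetical arrays scores_A and scores_XA that hold one factor score per student in each course. Note that Cohen's d can be defined in more than one way for paired data; the variant here (mean difference divided by the standard deviation of the differences) is only one common convention, and the paper does not state which was used.

```python
import numpy as np
from scipy import stats

# Hypothetical factor scores for the same 91 students in the two courses
# (e.g. the surface approach factor).
rng = np.random.default_rng(0)
scores_A = rng.normal(3.0, 0.6, size=91)
scores_XA = scores_A - 0.8 + rng.normal(0.0, 0.5, size=91)

# Two-tailed paired samples t-test (repeated measures design).
t_stat, p_value = stats.ttest_rel(scores_XA, scores_A)

# One convention for Cohen's d in paired designs:
# mean difference divided by the standard deviation of the differences.
diff = scores_XA - scores_A
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"MD = {diff.mean():.2f}, t = {t_stat:.2f}, "
      f"p = {p_value:.3g}, d = {cohens_d:.2f}")
```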
Students were clustered based on their approaches to learning (deep approach and surface approach). The factor measuring organised studying was excluded from the clustering, as it refers more to studying than to learning (Biggs 1987; Coertjens et al. 2016; Marton and Säljö 1984). This choice is more apt for answering the research questions and captures the motivation behind the XA method. The students were grouped using hierarchical clustering to identify the number of clusters (between-groups linkage, squared Euclidean distance) and K-means clustering to identify cluster membership. In addition to the paired samples t-test, a one-way ANOVA with Wilks’ lambda was used to analyse the difference between the clusters within one course setting.
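The two-stage clustering could look roughly as follows, assuming a hypothetical table approach_scores whose four columns are the clustering variables (deep and surface approach scores in each course). The study itself used SPSS, so this is only an illustrative equivalent with SciPy and scikit-learn.

```python
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage
from sklearn.cluster import KMeans

# Hypothetical factor scores used as clustering variables.
rng = np.random.default_rng(1)
approach_scores = pd.DataFrame(
    rng.uniform(1, 5, size=(91, 4)),
    columns=["deep_A", "surface_A", "deep_XA", "surface_XA"],
)

# Step 1: hierarchical clustering (between-groups/average linkage on squared
# Euclidean distances) to decide on a plausible number of clusters, e.g. by
# inspecting the merge distances in Z or plotting a dendrogram.
Z = linkage(approach_scores.values, method="average", metric="sqeuclidean")

# Step 2: K-means with the chosen number of clusters (three in this study)
# to assign each student to a cluster.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
approach_scores["cluster"] = kmeans.fit_predict(approach_scores.values)

print(approach_scores["cluster"].value_counts())
```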
Results
The data consist of students who attended both courses (N = 91). In the first section, the results are reported on a course level. The second section then describes student-level behaviour by clustering the students according to their deep and surface approaches to learning. Throughout the results section, asterisks are used to denote the p-values (* for p < 0.05, ** for p < 0.01, and *** for p < 0.001 significance levels), and the mean differences are reported as the score in course XA minus the score in course A (XA − A).
Course-Level
In this section, the analysis focuses on students’ course-level behaviour. The means, standard deviations and mean differences are reported for each factor. In addition, the two course contexts are compared with a paired samples t-test for statistical significance and with Cohen’s d for effect size.
Approaches to Learning
Students’ approaches to learning were measured with three factors: deep approach, surface approach and organised studying. The comparison of students’ approaches to learning in the two course contexts is presented in Table 4. The most substantial difference between the two course contexts is in the surface approach factor: the students reported more of a surface approach in course A than in course XA, with the difference being statistically significant (MD = −0.79***). This result implies that in course A students tended to apply memorising techniques and their knowledge was more fragmented compared to course XA. The effect size (Cohen’s d = −0.89) implies that the course context had a large impact on the application of a surface approach. Furthermore, the students more often reported applying a deep approach and organised studying on course XA than on course A, with these differences also being statistically significant (MD = 0.15* and MD = 0.18*, respectively). In practice, this means that students reflected more on new knowledge, made their own connections and conclusions and studied more systematically on course XA. However, the mean differences between the courses and the effect sizes are relatively small, which implies only a small effect of the course context.
Self-Efficacy
Self-efficacy is a one-factor scale. Comparison of students’ self-efficacy in the two course contexts is presented in Table 5. The results show that on course XA students reported statistically significantly higher self-efficacy levels compared to course A (MD = 0.58***). This means that students were more confident in their ability to perform the course tasks on course XA compared to course A. The effect size (Cohen’s d = 0.64) implies a medium role for the course context when measuring self-efficacy.
Experiences of the Teaching-Learning Environment
Students’ experiences of the teaching-learning environment were measured with five factors: constructive feedback, alignment, requirements, interest and relevance, and peer support. Comparison of students’ experiences of the teaching-learning environment in the two course contexts is presented in Table 6. Students reported the largest differences between the two contexts in the alignment (MD = 0.42***, Cohen’s d = 0.53) and requirements (MD = 0.57***, Cohen’s d = 0.68) factors. In practice, students reported that in course XA the learning goals were clearer, and the instruction and the course tasks were more aligned with these goals. In addition, there was a statistically significant difference in the constructive feedback factor, with students reporting that they had received more constructive feedback on course XA compared to course A (MD = 0.33***, Cohen’s d = 0.40). In practice, students felt that on course XA they had received more feedback that enhanced the development of study skills and helped connect new knowledge to existing knowledge compared to course A. There was also a statistically significant difference in the interest and relevance factor (MD = 0.38***, Cohen’s d = 0.46). This means that students found the content on course XA more interesting and relevant and that they enjoyed studying on the course more than on course A. The effect size implies a medium effect of the course context. Moreover, there was no statistically significant difference in the peer support factor. This means that, on average, students had similar experiences of interaction with other students in both course contexts. The effect sizes ranged from small to medium in all factors except for peer support, where the effect size was negligible. This implies that the course context had an effect on all the factors except for peer support.
Student-Level
In this section, the analysis focuses on student-level studying and learning. First, students were clustered based on their deep and surface approach to learning. Then these clusters were compared both within and between the contexts in terms of self-efficacy and students’ experiences of the teaching-learning environment.
Clusters
The students were clustered according to their deep approach and surface approach in both course contexts. Based on these four factors, the students formed three clusters:
1. students applying a deep approach (N = 39)
2. students applying a surface approach (N = 24)
3. students applying a context-sensitive surface approach (N = 28).
The clusters and the level of their deep and surface approach factors in both course contexts are presented in Fig. 2. Students applying a deep approach (cluster 1) acted similarly in the two course contexts: they applied a deep approach and scored low on a surface approach to learning in both contexts. Correspondingly, students applying a surface approach (cluster 2) applied similar approaches to learning regardless of the context. However, their predominant approach to learning was a surface approach. In contrast to the first two clusters, students applying a context-sensitive surface approach (cluster 3) changed their approaches to learning according to the course context: these students applied a surface approach on course A but not on course XA.
Clusters in the Context of One Course
We now move on to consider the clusters in one course context at a time. First, the clusters are compared in terms of students’ major subject, gender, participation, number of completed course tasks and exam results. In the context of course A, there were no statistically significant differences between any of the clusters. This was also the case in the context of course XA; however, as an exception, the students applying a surface approach (cluster 2) scored lower in the course exam compared to students applying a deep approach (cluster 1; MD = 22.28***) and students applying a context-sensitive surface approach (cluster 3; MD = 15.25*) (F(2,88) = 10.750, partial η2 = 0.202; max(exam points) = 100).
Next, the clusters are considered in terms of self-efficacy and experiences of the teaching-learning environment. In the context of course A, students applying a deep approach (cluster 1) differed to a statistically significant degree from students applying a surface approach (cluster 2) and a context-sensitive surface approach (cluster 3) in self-efficacy (MD = 1.17***, MD = 0.80***; F(2,88) = 16.632***, partial η2 = 0.274), alignment (MD = 0.54*, MD = 0.80***; F(2,88) = 8.773***, partial η2 = 0.166), and interest and relevance (MD = 0.71**, MD = 0.71**; F(2,88) = 8.067***, partial η2 = 0.155). However, there were no statistically significant differences in constructive feedback, requirements or peer support factors. In the context of course XA, students applying a surface approach (cluster 2) differed to a statistically significant degree from students applying a deep approach (cluster 1) in self-efficacy (MD = 1.04***; F(2,88) = 15.651***, partial η2 = 0.262) and interest and relevance (MD = 0.50*; F(2,88) = 3.589*, partial η2 = 0.075), and from students applying a context-sensitive surface approach (cluster 3) in self-efficacy (MD = 0.92***; F(2,88) = 15.651***, partial η2 = 0.262). There were no statistically significant differences in constructive feedback, alignment, requirements or peer support.
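To clarify how comparisons of this kind are computed, the sketch below runs a one-way ANOVA on hypothetical self-efficacy scores for the three clusters and derives eta squared (which for a one-way design coincides with partial eta squared) as SS_between divided by SS_total; the cluster sizes match the study, but the scores themselves are simulated.

```python
import numpy as np
from scipy import stats

# Hypothetical self-efficacy scores for the three clusters within one course.
rng = np.random.default_rng(3)
cluster1 = rng.normal(4.2, 0.5, size=39)  # deep approach
cluster2 = rng.normal(3.1, 0.6, size=24)  # surface approach
cluster3 = rng.normal(3.4, 0.6, size=28)  # context-sensitive surface approach
groups = [cluster1, cluster2, cluster3]

# One-way ANOVA across the three clusters.
f_stat, p_value = stats.f_oneway(*groups)

# Eta squared for a one-way design: SS_between / SS_total.
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F(2, {len(all_scores) - 3}) = {f_stat:.2f}, "
      f"p = {p_value:.3g}, eta^2 = {eta_squared:.3f}")
```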
Overall, when considering the clusters in one context at a time, students applying a deep approach (cluster 1) differed to a statistically significant degree from other students (clusters 2 and 3) in many factors, and these differences were advantageous to these cluster-1 students. Further, students applying a surface approach (cluster 2) and students applying a context-sensitive surface approach (cluster 3) differed solely in the self-efficacy factor and course exam results in the context of course XA.
Clusters in the Context of Two Courses
The paper now progresses to an individual analysis of the clusters in the two course contexts. The clusters in these two contexts and their scores for the self-efficacy and teaching-learning environment factors are presented in Table 7. Cluster 1 consisted of students applying a deep approach in both course contexts. Compared to the other two clusters, these students reported the smallest mean differences between the two contexts. Overall, the means were higher in the context of course XA compared to course A. However, the students in this cluster displayed the smallest number of statistically significant differences between the two course contexts, with the mean differences being statistically significant in only self-efficacy (MD = 0.33**, Cohen’s d = 0.37) and requirements (MD = 0.53**, Cohen’s d = 0.65), with medium effect sizes.
Cluster 2 consisted of students applying a surface approach in both course contexts. When moving from one context to another, these students reported greater mean differences compared to cluster 1, but smaller mean differences compared to cluster 3. The mean differences were statistically significant in self-efficacy (MD = 0.47**, Cohen’s d = 0.69), alignment (MD = 0.44*, Cohen’s d = 0.66), requirements (MD = 0.25*, Cohen’s d = 0.31) and interest and relevance (MD = 0.40*, Cohen’s d = 0.45), with effect sizes varying from small to medium.
Cluster 3 consisted of students applying a surface approach in the context of course A but not on course XA. Here, the mean differences were higher than those of the other two clusters. All factors were statistically significantly more positive in the context of course XA: self-efficacy (MD = 1.01***, Cohen’s d = 1.36), constructive feedback (MD = 0.55**, Cohen’s d = 0.70), alignment (MD = 0.68***, Cohen’s d = 0.83), requirements (MD = 0.89***, Cohen’s d = 1.03), interest and relevance (MD = 0.62**, Cohen’s d = 0.79) and peer support (MD = 0.35*, Cohen’s d = 0.52). The effect sizes varied from medium to large, with the largest effect size in the self-efficacy factor.
Discussion
In this study, the same cohort of students (N = 91) was investigated in the context of two different courses, with course A functioning within the traditional lecture-based framework but including student-centred elements and course XA using the Extreme Apprenticeship method. The latter involved a more student-centred course design, with characteristics such as learning by doing, flipped learning, instructional scaffolding and continuous feedback. Overall, the results suggest that students applied more favourable learning approaches, experienced the teaching-learning environment more positively and displayed higher self-efficacy levels in the more student-centred course design (course XA).
As prior research acknowledges the positive impact of a deep approach to learning, it is encouraging to see that students were more likely to apply a deep approach to learning in the more student-centred course design. This result is in line with prior research that suggests student-centred learning environments promote a deep approach to learning (Baeten et al. 2010). Nevertheless, the role of the course context was relatively small in the present study (MD = 0.15, p < 0.05, Cohen’s d = 0.18); however, again this finding is in line with prior research acknowledging the challenge of supporting a deep approach to learning through instruction (Baeten et al. 2013; Marton and Säljö 1984). By contrast, there was a bigger difference in the factor measuring a surface approach to learning: students applied less of a surface approach to learning in the more student-centred course (MD = −0.79, p < 0.001, Cohen’s d = −0.89). Here, the effect size was relatively large, implying a substantial impact of the course context. These results are noteworthy, as a deep approach to learning leads to higher mathematical performance (Murphy 2017), and a surface approach to learning leads to lower mathematical achievement (Maciejewski and Merchant 2016). Therefore, the Extreme Apprenticeship method is a form of mathematics instruction that can support more favourable approaches to learning. In conclusion, it is possible to promote students’ mathematical achievement with more student-centred course designs.
Other studies have demonstrated that a deep approach to learning is related to positive experiences of the teaching-learning environment (see e.g. Baeten et al. 2010; Parpala et al. 2010). This study supports these previous findings, as in the more student-centred course design with more favourable approaches to learning, students reported more positive experiences of the teaching-learning environment. Here, students claimed that they had received more constructive feedback (MD = 0.33, p < 0.001, Cohen’s d = 0.40) and that the teaching was more aligned (MD = 0.42, p < 0.001, Cohen’s d = 0.53), the requirements clearer (MD = 0.57, p < 0.001, Cohen’s d = 0.68) and studying on the course more interesting and relevant (MD = 0.38, p < 0.001, Cohen’s d = 0.46).
Peer support was the only factor measuring students’ experiences of the teaching-learning environment that displayed no statistically significant differences between the two course contexts. The mean scores for peer support were high in both course contexts (4.00 in course A and 4.09 in course XA). This implies that both course designs succeeded in promoting positive peer relationships. However, it could also signify that peer relations, once established, are not sensitive to changes in course context.
In the XA method, special focus is on feedback: students are offered a variety of possibilities to receive feedback for reading, writing and communicating mathematics. However, the means for the factor measuring constructive feedback were rather low in both course contexts: 2.75 on the more traditional (course A) and 3.08 on the more student-centred course (course XA). Despite the fact that, on average, students reported having received more constructive feedback on the more student-centred course, the results suggest that it is challenging to include effective feedback strategies in instructional designs.
In contrast to Peters (2013), students reported higher self-efficacy levels in the more student-centred course design (MD = 0.58, p < 0.001, Cohen’s d = 0.64). Although the course exams were different and the course exam results thus incomparable, it can be hypothesised that students’ mathematical achievement was higher in the more student-centred course design, as self-efficacy enhances academic achievement (Pajares 1996; Peters 2013).
In this study, three groups of students were identified: 1) students who applied a deep approach regardless of the context, 2) students who applied a surface approach regardless of the context, and 3) students who applied a context-sensitive surface approach. The last group consisted of students who applied a surface approach on the more traditional but not on the more student-centred course. These clusters are supported by prior research, as a deep approach tends to be more stable than a surface approach, and student-centred learning environments can prompt students applying a surface approach to adopt a deeper approach (Coertjens et al. 2016; Wilson and Fowler 2005). Students applying a deep approach differed to a statistically significant degree from the other students in many factors, the differences being more positive for students applying a deep approach. The two other clusters, students applying a surface approach and students applying a context-sensitive surface approach, differed only in the context of course XA: in the course exam results and in the factor measuring self-efficacy. As stated earlier, more favourable approaches to learning should positively contribute to mathematical achievement. However, it would be interesting to further analyse the course exams, as the exam in course A did not differentiate between student groups applying a deep or a surface approach to learning.
When comparing the clusters individually in the two contexts, all clusters reported higher self-efficacy levels and more positive experiences of the teaching-learning environment in the more student-centred course context. Students applying a context-sensitive surface approach reported the most positive gains from the more student-centred course design. In the factor measuring self-efficacy, they displayed the largest mean difference between the contexts and the strongest effect size (MD = 1.01, p < 0.001, Cohen’s d = 1.36). In addition, this cluster was the only group of students who showed a statistically significant difference in the peer support factor (MD = 0.35, p < 0.05, Cohen’s d = 0.52). An interesting question is why some, but not all, of the students applying a surface approach in the more traditional course design shifted their approach towards a deep approach to learning in the more student-centred course design. The two clusters, students applying a surface approach and students applying a context-sensitive surface approach, did not differ in a statistically significant way in the more traditional course context, but they did differ in the more student-centred course context in the factor measuring self-efficacy and in the course exam results. It could be assumed that higher mathematical achievement is caused by higher self-efficacy. Therefore, it seems likely that the more student-centred course design positively contributed to the self-efficacy levels of the students applying a context-sensitive surface approach. In addition, the students applying a context-sensitive surface approach reported having received statistically significantly more peer support in the more student-centred course design. Therefore, the results suggest that the context sensitivity of students applying a context-sensitive surface approach can be explained by the factors measuring self-efficacy and peer support. However, the analyses do not provide an answer to why this phenomenon failed to occur for the students applying a surface approach.
Overall, the results suggest that the more student-centred course design did no harm to any of the student groups. On the contrary, the results consistently support the use of a more student-centred course design for all groups of students, especially for students who tend to adopt a surface approach to learning, which can cause difficulties in their later mathematics studies. The results indicate that including student-centred elements in a traditional lecture-based course design is insufficient to fully utilise the students’ learning potential. Instead, a more dramatic structural shift in a student-centred direction is needed. This shift would positively contribute to students’ self-efficacy, experiences of the teaching-learning environment and approaches to learning. To conclude, the results of this study suggest that it is possible to promote the quality of university mathematics learning with instructional designs that, besides content, take a holistic approach to the learning environment.
Limitations of the Study and Future Research
One of the major limitations of this study is that the two courses differed in content. As Marton and Säljö (1976) argue, learning should be described in terms of content, because students understand content in a wide variety of ways. However, the instrument has been validated in the Finnish higher education context across different disciplines to minimise the effect of subject content. In particular, the factors of the instrument measuring the teaching-learning environment are not directly related to the content of the courses. Furthermore, there is a solid body of research arguing that positive experiences of the learning environment are related to more favourable approaches to learning and higher self-efficacy. We can thus assume that both the content to be learned and the learning environment are related to students’ experiences and their learning outcomes. However, this study focuses on the learning environment and cannot separate the effects of the content and the learning environment on students’ learning.
Another limitation is that the courses had different lecturers. Although both lecturers were experienced and regarded as pedagogically competent, approaches to teaching influence students’ approaches to learning, as Baeten et al. (2010) state. This limitation arises from the choice of research design, as it was impossible to obtain the same students in two different pedagogical settings with the same course content and lecturer. However, one of the strengths of the research design is that the students remained the same when moving from one context to the other.
Baeten et al. (2010) observe that many factors related to approaches to learning are dependent on the individual student. These individual factors include gender, personality, previous experiences and learning habits. The results of this study may have been affected by the fact that some of the students were more capable of adapting to new instructional designs. By contrast, the more student-centred method might have provided an opportunity for some students to study in the way that suited them best. As the study is based on students’ self-reported data, it is unable to fully address these student-dependent factors. However, although it is probable that students vary in their ability to reflect on their own learning, the questionnaire used to collect the data had been thoroughly validated to minimise such limitations caused by individual differences.
This study identified three groups of students. However, more research is required to fully understand the motivations and mechanisms behind these groups’ characteristics. Our research will continue with a qualitative analysis of student interviews in order to clarify the roles of content and the learning environment in students’ learning. In addition, the aim is to identify the specific elements of instructional design that support positive behaviour and experiences in students.
References
Baeten, M., Kyndt, E., Struyven, K., & Dochy, F. (2010). Using student-centred learning environments to stimulate deep approaches to learning: factors encouraging or discouraging their effectiveness. Educational Research Review, 5, 243–260 https://doi.org/10.1016/j.edurev.2010.06.001.
Baeten, M., Struyven, K., & Dochy, F. (2013). Student-centred teaching methods: can they optimise students’ approaches to learning in professional higher education? Studies in Educational Evaluation, 39, 14–22 https://doi.org/10.1016/j.stueduc.2012.11.001.
Bandura, A. (1994). Self-efficacy. In V. S. Ramachaudran (Ed.), Encyclopedia of human behavior (Vol. 4, pp. 71–81). New York: Academic Press.
Biggs, J. (1987). Student approaches to learning and studying. Melbourne: Australian Council for Educational Research.
Biggs, J. B. (1991). Approaches to learning in secondary and tertiary students in Hong Kong: some comparative studies. Educational Research Journal, 6, 27–39.
Coertjens, L., Vanthournout, G., Lindblom-Ylänne, S., & Postareff, L. (2016). Understanding individual differences in approaches to learning across courses: a mixed method approach. Learning and Individual Differences, 51, 69–80 https://doi.org/10.1016/j.lindif.2016.07.003.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155–159 https://doi.org/10.1037/0033-2909.112.1.155.
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: making thinking visible. American Educator, 15(3), 6–11.
Crawford, K., Gordon, S., Nicholas, J., & Prosser, M. (1994). Conceptions of mathematics and how it is learned: the perspectives of students entering university. Learning and Instruction, 4, 331–345 https://doi.org/10.1016/0959-4752(94)90005-1.
Crawford, K., Gordon, S., Nicholas, J., & Prosser, M. (1998). University mathematics students’ conceptions of mathematics. Studies in Higher Education, 23(1), 87–94 https://doi.org/10.1080/03075079812331380512.
Entwistle, N., & Tait, H. (1990). Approaches to learning, evaluations of teaching, and preferences for contrasting academic environments. Higher Education, 19, 169–194 https://doi.org/10.1007/BF00137106.
Entwistle, N., McCune, V., & Hounsell, J. (2003). Investigating ways of enhancing university teaching-learning environments: Measuring students’ approaches to studying and perceptions of teaching. In E. De Corte, L. Verschaffel, N. Entwistle, & J. van Merriënboer (Eds.), Powerful learning environments: Unravelling basic components and dimensions (pp. 89–107). Oxford: Pergamon/Elsevier Science.
Freudenthal, H. (1991). Revisiting mathematics education: China lectures. Dordrecht: Kluwer Academic Publishers.
Hailikari, T., & Parpala, A. (2014). What impedes or enhances my studying? The interrelation between approaches to learning, factors influencing study progress and earned credits. Teaching in Higher Education, 19(7), 812–824 https://doi.org/10.1080/13562517.2014.934348.
Heikkilä, A., & Lonka, K. (2006). Studying in higher education: students’ approaches to learning, self-regulation, and cognitive strategies. Studies in Higher Education, 31(1), 99–117 https://doi.org/10.1080/03075070500392433.
Hmelo-Silver, C. E. (2004). Problem-based learning: what and how do students learn? Educational Psychology Review, 16(3), 235–266 https://doi.org/10.1023/B:EDPR.0000034022.16470.f3.
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55 https://doi.org/10.1080/10705519909540118.
Kember, D., Biggs, J., & Leung, D. Y. P. (2004). Examining the multidimensionality of approaches to learning through the development of a revised version of the learning process questionnaire. British Journal of Educational Psychology, 74(2), 261–279 https://doi.org/10.1348/000709904773839879.
Kogan, M., & Laursen, S. L. (2014). Assessing long-term effects of inquiry-based learning: a case study from college mathematics. Innovative Higher Education, 39(3), 183–199 https://doi.org/10.1007/s10755-013-9269-9.
Konstantinou-Katzi, P., Tsolaki, E., Meletiou-Mavrotheris, M., & Koutselini, M. (2013). Differentiation of teaching and learning mathematics: an action research study in tertiary education. International Journal of Mathematical Education in Science and Technology, 44(3), 332–349 https://doi.org/10.1080/0020739X.2012.714491.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.
Lesseig, K., & Krouss, P. (2017). Implementing a flipped instructional model in college algebra: profiles of student activity. International Journal of Mathematical Education in Science and Technology, 48(2), 202–214 https://doi.org/10.1080/0020739X.2016.1233586.
Lindblom-Ylänne, S., Parpala, A., & Postareff, L. (2013). Challenges in analysing change in students’ approaches to learning. In D. Gijbels, V. Donche, J. T. E. Richardson, & J. D. Vermunt (Eds.), Learning patterns in higher education: Dimensions and research perspectives (pp. 232–248). New York: Routledge.
Lindblom-Ylänne, S., Parpala, A., & Postareff, L. (2018). What constitutes the surface approach to learning in the light of new empirical evidence? Studies in Higher Education. Advance online publication. https://doi.org/10.1080/03075079.2018.1482267.
Maciejewski, W., & Merchant, S. (2016). Mathematical tasks, study approaches, and course grades in undergraduate mathematics: a year-by-year analysis. International Journal of Mathematical Education in Science and Technology, 47(3), 373–387 https://doi.org/10.1080/0020739X.2015.1072881.
Marton, F., & Säljö, R. (1976). On qualitative differences in learning: I. Outcome and process. British Journal of Educational Psychology, 46(1), 4–11 https://doi.org/10.1111/j.2044-8279.1976.tb02980.x.
Marton, F., & Säljö, R. (1984). Approaches to learning. In F. Marton, D. J. Hounsell, & N. J. Entwistle (Eds.), The experience of learning (pp. 39–58). Edinburgh: Scottish Academic Press.
Murphy, P. E. L. (2017). Student approaches to learning, conceptions of mathematics, and successful outcomes in learning mathematics. In L. Wood & Y. Breyer (Eds.), Success in higher education (pp. 75–93). Singapore: Springer Singapore. https://doi.org/10.1007/978-981-10-2791-8_5.
Oikkonen, J. (2009). Ideas and results in teaching beginning maths students. International Journal of Mathematical Education in Science and Technology, 40(1), 127–138 https://doi.org/10.1080/00207390802582961.
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66(4), 543–578 https://doi.org/10.3102/00346543066004543.
Parpala, A., & Lindblom-Ylänne, S. (2012). Using a research instrument for developing quality at the university. Quality in Higher Education, 18(3), 313–328 https://doi.org/10.1080/13538322.2012.733493.
Parpala, A., Lindblom-Ylänne, S., Komulainen, E., Litmanen, T., & Hirsto, L. (2010). Students’ approaches to learning and their experiences of the teaching-learning environment in different disciplines. British Journal of Educational Psychology, 80(2), 269–282 https://doi.org/10.1348/000709909X476946.
Parpala, A., Lindblom-Ylänne, S., Komulainen, E., & Entwistle, N. (2013). Assessing students’ experiences of teaching–learning environments and approaches to learning: validation of a questionnaire in different countries and varying contexts. Learning Environments Research, 16(2), 201–215 https://doi.org/10.1007/s10984-013-9128-8.
Peters, M. L. (2013). Examining the relationships among classroom climate, self-efficacy, and achievement in undergraduate mathematics: a multi-level analysis. International Journal of Science and Mathematics Education, 11(2), 459–480 https://doi.org/10.1007/s10763-012-9347-y.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the motivated strategies for learning questionnaire (MSLQ). Ann Arbor: National Center for Research to Improve Postsecondary Teaching and Learning.
Rämö, J., Oinonen, L., & Vikberg, T. (2015). Extreme apprenticeship – Emphasising conceptual understanding in undergraduate mathematics. In K. Krainer & N. Vondrová (Eds.), Proceedings of the ninth congress of the European society for research in mathematics education (pp. 2242–2248). Prague: CERME.
Rämö, J., Lahdenperä, J., & Häsä, J. (n.d.). The extreme apprenticeship method. Manuscript submitted for publication.
Rasmussen, C., & Kwon, O. N. (2007). An inquiry-oriented approach to undergraduate mathematics. Journal of Mathematical Behavior, 26, 189–194 https://doi.org/10.1016/j.jmathb.2007.10.001.
Schoenfeld, A. H. (2007). Problem solving in the United States, 1970–2008: research and theory, practice and politics. ZDM - International Journal on Mathematics Education, 39, 537–551 https://doi.org/10.1007/s11858-007-0038-z.
Trigwell, K., & Prosser, M. (1991). Relating approaches to study and quality of learning outcomes at the course level. British Journal of Educational Psychology, 61(3), 265–275 https://doi.org/10.1111/j.2044-8279.1991.tb00984.x.
Varunki, M., Katajavuori, N., & Postareff, L. (2017). First-year students’ approaches to learning, and factors related to change or stability in their deep approach during a pharmacy course. Studies in Higher Education, 42(2), 331–353 https://doi.org/10.1080/03075079.2015.1049140.
Vihavainen, A., Paksula, M., & Luukkainen, M. (2011). Extreme apprenticeship method in teaching programming for beginners. In Proceedings of the 42nd ACM technical symposium on computer science education (SIGCSE ‘11) (pp. 93–98). New York: ACM. https://doi.org/10.1145/1953163.1953196.
Wilson, K., & Fowler, J. (2005). Assessing the impact of learning environments on students’ approaches to learning: comparing conventional and action learning designs. Assessment & Evaluation in Higher Education, 30(1), 87–101 https://doi.org/10.1080/0260293042003251770.
Zimmerman, B. J. (1995). Self-efficacy and educational development. In A. Bandura (Ed.), Self-efficacy in changing societies (pp. 202–231). Cambridge: Cambridge University Press.
Acknowledgements
The research has been funded by the Finnish Cultural Foundation, grant numbers 00160520 and 00172299.
The authors would like to thank Anna Parpala for her cooperation in the data collection procedure. The authors would also like to express their gratitude to Jani Hannula, Juuso Nieminen, Jenni Räsänen, and the researchers at the Centre for University Teaching and Learning, University of Helsinki, for fruitful discussions and comments on the manuscript.
Ethics declarations
Conflict of Interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Electronic supplementary material
ESM 1 (DOCX 26 kb)
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Lahdenperä, J., Postareff, L. & Rämö, J. Supporting Quality of Learning in University Mathematics: a Comparison of Two Instructional Designs. Int. J. Res. Undergrad. Math. Ed. 5, 75–96 (2019). https://doi.org/10.1007/s40753-018-0080-y