Online Knowledge–building Interactions in Problem–based Learning: The POLARIS Experience
Introduction
In this chapter, we present a learning tool which was built as part of design research that focused on (1) the development of functionality that better supports learning tasks in problem-based learning (PBL) and (2) the construction of an architecture that facilitates the passage from the testing stage to the institution-wide implementation of tools. We describe the rationale for the development of this tool, the tool itself, and the experiences of working with the tool.
For tasks that are difficult to accomplish, new tools often bring a promise that these tasks, which were beyond our capability yesterday, will be within our reach today. One such promising application is small-group collaborative learning. A recent review of research on small-group learning found positive outcomes in almost all aspects of learning (Springer et al., 1999). But pilots and trials often do not live up to expectations in practice. Failure has been attributed, among other things, to the available tools not meeting the requirements of the learning environment in which they were implemented. A common complaint is that the functionality of the tool does not fit the learning activities.
As one of the best candidates for large-scale innovation of education, computer-supported collaborative learning has attracted a great deal of research interest. However, the focus of most of this research is to learn more about the learning processes, not to directly improve the learning experience. And when tools are designed in a research environment, their purpose is to support that research. These tools are for various reasons, of which some will be discussed in our system design section, neither readily usable outside of the laboratory nor ever intended to be used in that way. Commercial development does focus on the user experience and is a booming area. New providers of ever more capable tools are coming up (and going) almost weekly. However, most commercial tools focus on communication paradigms that do not fit learning activities. These tools come with more and more functions, and get sleeker and faster, but they do not suit collaborative learning any better than the previous generation of tools. And for the few tools that might meet educational needs, there is the hurdle of implementation to cross. Now that most universities have a course management system of one kind or another in place, often acquired and maintained at considerable cost and effort, a new tool not only presents an organizational challenge in its implementation but also has to compete with existing tools for budget, support, and user preference.
The tool that we are describing in this chapter was developed at the University of Maastricht, where the curricula of all its schools and faculties are based on the PBL model. The principal reason for the development of POLARIS was the need to intensify communication between learners in a PBL context. In an explorative study on the use of the campus network for facilitating learning activities, both students and tutors made explicit a need for communicating about learning progress between regular face-to-face group meetings (Ronteltap & Eurelings, 1997).
Problem-based Learning at Maastricht

All the curricula in our institution use the same structure. Each curriculum is subdivided into a limited number of course units, or blocks. A course, in general, consists of a set of cases that are analyzed in tutorial groups. The following is an example of a case from a medical course:
A 60-year-old man, visibly worried, visits his doctor in the clinic. He tells the doctor that, while he was riding his bike, a small insect went into his left eye. When he rubbed his eye to try to remove the insect, he noticed he had problems seeing with his right eye.
Students need to be able to analyze the text given and explain the basic events that are presented in this short case description in terms of relevant conceptual representations. In exploring the problem, students use a general problem-solving framework that consists of the following seven steps (Schmidt, 1983), or variations of it with a comparable procedure:
- Clarify concepts that are not readily comprehensible
- Define the problem
- Analyze the problem: brainstorming
- Analyze the problem: compiling the results of brainstorming
- Formulate learning issues
- Collect additional information outside of the group
- Synthesize and test the acquired information
The first five steps are carried out in the first of two meetings, and the last step in the second meeting. Between these two group meetings, students have two days available for independent study. Each regular group meeting takes two hours: one hour for reporting the progress of individual study and another hour for analyzing a new problem assignment. Often, the number of problems that are assigned to start individual learning equals the number of group meetings (e.g., 6 weeks = 12 meetings = about 12-20 problems). Students meet twice a week in tutorial groups of 10-14 people, coached by a tutor. Maastricht students’ learning activities can be represented as shown in the flowchart in Figure 9.1.
In our first set of pilot studies on group learning tools, we employed standard asynchronous communication tools (based on the Lotus Notes discussion forum). In a nutshell, the following difficulties, which hindered the learning interactions we wanted to facilitate, were experienced:
- The asynchronous communication was often poorly organized, and the results were unpredictable or even chaotic. There was a need to structure the communication flow.
- There was no visual contact, so tools that support social interaction online were needed. Students wanted feedback when they shared their work with others.
- Students were sometimes discouraged or even demoralized by the information overload. There was a need to reorganize and restructure information.
- The content of discussion forums needed to be structured so as to overcome the rigidity in the organization of the threads.
- Orientation in an information-rich environment can be simplified by providing information at a higher level of abstraction, for example through the subject headings of postings.
- In a context of active learning such as PBL, we needed to go beyond the limited functions of sending and replying.
The general conclusion from these pilots was that enabling productive asynchronous learning interactions requires examining the nature of the communication anticipated in the design of a learning environment that puts the basic principles of the leading pedagogical concept into practice. Our observation of the existing communication process was complemented by an analysis of the added value of tools in raising its effectiveness. Figure 9.2 shows the relationships between the elements that formed the basis of this phase of the project.
Why do students learn when they work together? In answering this question, we focused on the social rather than the individual aspects of learning and came up with a shortlist of activities typical of asynchronous collaboration: writing, reading, analyzing incoming information, finding new information, and restructuring and reorganizing the information available in the group environment. These activities are part of a pedagogical scenario and can be seen as the driving force, or cause, of learning. The result of this analysis was an outline of the learning mechanisms typical of online PBL. Following Dillenbourg (1999), we supposed that small-group learning involves the following processes, which we set out to analyze: “the situation generates interactions, and these interactions trigger cognitive mechanisms, which in turn generate cognitive effects. However, such a linear causality is a simplification. Most relations are reciprocal.” The mechanisms became apparent from our observations and from their theoretical grounding. They are as follows:
- Conflict: Students discover that peers have different solutions and answers to the same problem or question. In this case, we are talking about inter-individual sociocognitive conflict that triggers further learning and the reframing or reorganization of individual knowledge. Intra-individual conflict (doubt) can also be the consequence of interaction, leading to exploration of new information so as to reorganize and restructure prior knowledge (Gilly, 1989).
- Reflection: Students, during or after group collaboration, put a meaning to information that is exchanged and assembled (Johnson et al., 1990).
- Articulation: Students verbalize their ideas to peers during interaction (Koschmann et al., 1993) or explain meanings by elaboration and joint construction of knowledge (Ploetzner et al., 1999).
- Activity: In asynchronous discussions about learning issues, learners engage in the activities of writing, reading, and jointly constructing knowledge, which together provide a powerful learning experience (Spivey, 1997).
- Appropriation: Learners, in a shared environment, take something that belongs to another and make it their own (Wertsch, 1998; Ronteltap & Eurelings, 2002).
In online learning activities, a user has to carry out actions to perform tasks. A specific action, such as opening a posting, may map onto exactly one command in the tool, such as clicking on the hyperlinked title of the posting, which displays its content. In this case, there is a one-to-one mapping of action and command, and the result is satisfactory. In higher-order learning activities, however, many tasks require actions that do not have such a good fit. A user who wants to do something like reading a whole thread more likely has to map this intention onto a sequence of actions: first selecting, one by one, the postings to include in an overview, and then clicking a button to get that overview, which may consist of a page listing the content of each selected posting. The postings would then be arranged in an inefficient sequence, according to the order in which they were posted; as a result, the thread of the discussion is lost. The same holds for interface design, which shows affordances for actions and orientation. After a series of trials, we came up with a list of problems encountered in using standard discussion tools. Table 9.1 summarizes these problems and how we addressed them at the functional level in POLARIS.
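The loss of thread structure can be made concrete with a short sketch. This is illustrative only, not POLARIS code; the names (`Posting`, `threaded_order`, the sample titles) are ours. When each posting carries a reference to the posting it replies to, a discourse-ordered view can be derived instead of a flat chronological listing:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Posting:
    pid: int
    parent: Optional[int]  # None marks a thread-starting posting
    title: str

def threaded_order(postings):
    """Return (depth, title) pairs in discourse order: each reply
    nested under the posting it answers, not in flat posting order."""
    children = {}
    for p in postings:
        children.setdefault(p.parent, []).append(p)
    order = []
    def walk(parent_id, depth):
        for p in children.get(parent_id, []):
            order.append((depth, p.title))
            walk(p.pid, depth + 1)
    walk(None, 0)
    return order

postings = [
    Posting(1, None, "What is regression analysis?"),
    Posting(2, 1, "A first attempt at a definition"),
    Posting(3, None, "A question about ANOVA"),
    Posting(4, 2, "A refinement by the tutor"),
]

# A flat chronological listing would interleave the two threads;
# discourse order keeps each discussion together.
for depth, title in threaded_order(postings):
    print("  " * depth + title)
```

The point of the sketch is the design choice, not the code itself: a discussion tool that stores reply relations can always reconstruct the discourse, whereas one that only concatenates postings in posting order cannot.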
Table 9.1 Problems presented by the structure of discussion forums and the solutions implemented

|Problem|
|---|
|Long threads made searching for postings difficult|
|Titles were the only means of identifying posts and were often retained through a long string of responses without regard to the content of the posts|
|The thread structure hampered direct response|
|It was difficult to follow the discourse|
|It was difficult to move between threads|
|Archiving and reuse of personal perspectives were not possible|
|Short positive feedback contributed to long threads and to the termination of threads, which could cause better earlier postings to be bypassed|
|There was no information as to who had read the postings|

When we scaled up the implementation after the first series of pilots, we encountered problems with user administration and performance. At the same time, Maastricht University migrated to Blackboard; since this course management system also contains a discussion board, we felt that having two separate systems was not a good idea. Instead of competing with Blackboard, we decided to take advantage of it by rebuilding POLARIS using Blackboard’s Building Block technology. This allowed us to integrate our system into Blackboard, thereby eliminating the user administration and performance issues, or at least moving these responsibilities from POLARIS to Blackboard. It also allowed a gradual implementation: POLARIS was initially offered only to staff who felt that Blackboard’s discussion board did not suit their needs. For the reimplementation, bearing in mind that in a large-scale rollout the system had to be usable with little or no training, we simplified the tool by pruning its least frequently used functions. The result is a tool that can be installed as an add-on in a Blackboard environment and that solves some of the problems normally encountered in knowledge-building activities based on discussion boards (Figure 9.3).
Use of POLARIS in Learning Statistics

In every course conducted in the School of Health Sciences, students are assigned on a regular basis to small collaborative learning groups guided by a tutor. In the academic year 2002-2003, two out of ten of these groups were selected to use the POLARIS learning tool in the study of statistical topics such as regression analysis, analysis of variance (ANOVA), and chi-square tests. A discussion forum was set up for each of these topics, seven forums in total. The two experimental groups had 15 and 12 students, respectively. Group 1 was not actively encouraged by the tutor to use POLARIS, while group 2 was.
The impact of adopting the learning tool was evaluated by means of (1) a questionnaire measuring student satisfaction; (2) the discussions documented within POLARIS, which were analyzed and scored with respect to their relevance and accuracy; and (3) the results in the end-of-term examinations. Additionally, the frequency of student participation in various activities using POLARIS was recorded, such as the number of documents, number of discussion threads, number of times a document was read or approved, and the number of active participants in each forum.
The two experimental groups were compared with each other in terms of their activities in POLARIS, and with the groups not using the tool in terms of their examination results. Table 9.2 compares the experimental groups’ participation in some of the activities in POLARIS. The students’ discussions were analyzed using the ATLAS.ti program for qualitative text analysis. From the table, it can be concluded that, not surprisingly, group 2, which received prompting, was more active than group 1. Group 2 students carried out more and longer discussions, and their discussions lasted throughout the course. Discussions on the more difficult topics, such as ANOVA and factorial design, were longer. The mean examination score of group 2 was higher than that of group 1 (14.57 vs. 12.44), although the difference was not statistically significant. The mean score of the reference group, which did not use POLARIS at all, was 14.85, not significantly different from the score of either experimental group. The maximum possible score was 20. The correlation between the number of student contributions to the POLARIS discussions and the final examination score was slightly negative, at -0.20 (n.s.).

Table 9.2 Experimental groups' rates of participation in POLARIS

|Forum|Documents (Group 1)|Documents (Group 2)|Threads (Group 1)|Threads (Group 2)|Avg. thread length (Group 1)|Avg. thread length (Group 2)|
|---|---|---|---|---|---|---|
|1. Introduction to statistics|11|7|4|2|2.75|3.5|
|2. Statistical testing|13|12|4|4|3.25|3|
|4. Linear regression analysis|1|17|1|3|1|5.7|
|5. Factorial ANOVA design|2|25|1|3|2|8.3|
|6. Cross-tabulation analysis|0|33|0|5|0|6.6|
|7. Miscellaneous Total/Average|1|56|1|11|1|5.1|
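The "average thread length" column in Table 9.2 is simply the number of documents divided by the number of threads, which can be verified directly from the table's own figures (the check below is ours, not part of the original analysis; group 2 data shown):

```python
# (documents, threads, reported average thread length) per forum
# for group 2, as reported in Table 9.2
group2 = {
    "Introduction to statistics": (7, 2, 3.5),
    "Statistical testing": (12, 4, 3.0),
    "Linear regression analysis": (17, 3, 5.7),
    "Factorial ANOVA design": (25, 3, 8.3),
    "Cross-tabulation analysis": (33, 5, 6.6),
}

for forum, (docs, threads, reported) in group2.items():
    # Average thread length = documents per thread, rounded to one decimal
    assert abs(docs / threads - reported) < 0.05, forum
```

Every reported average matches to one decimal place (e.g., 17 documents in 3 threads gives 17/3 ≈ 5.7).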
Feedback from the questionnaire showed that the students who used POLARIS were positive about the tool. They indicated that POLARIS was easy to use, was well suited to group discussions about the learning material, and led to better and deeper understanding. Group 2 requested the continued use of POLARIS in the statistics courses that followed this pilot study. This request was granted, and the same group again appeared to achieve a deeper level of understanding.
The sampled discussion from this study shows that, in group 2, a few students interacted with each other and with the tutor; questions were posed but mostly went unanswered, as Table 9.3 shows. In a thread from the forum on simple linear regression analysis, a student asked for clarification (translated from Dutch), another student responded, the tutor intervened, and another student commented.

Table 9.3 Interactions between students of group 2 and with the tutor

|Forum|No. of questions posed|No. of answers|
|---|---|---|
|1. Introduction to statistics|1 (s)|0|
|2. Statistical testing|3 (1t, 2s)|0|
|3. ANOVA|2 (2s)|1 (s)|
|4. Linear regression analysis|7 (3t, 4s)|1 (s)|
|5. Factorial ANOVA design|5 (3t, 2s), 1 with 18 partial questions| |
|6. Cross-tabulation analysis|7 (6t, 1s), 1 posting with 6, 1 with 11, and 1 with 2 partial questions| |
|7. Miscellaneous|4 (4s)|0|

s = student; t = tutor. Partial questions are multiple questions posed within a single posting. Discussion forums 4 to 6 dealt with statistical concepts that are very difficult for students at this level.
Learning Mechanisms Observed
Earlier in this chapter, we identified the learning mechanisms in collaboration. Some of these mechanisms can be observed in the sample discussion above. The discussion started with one student trying to answer the question “What is regression analysis?” This is clearly an example of reflection: meaning is given to the concept of regression analysis. The student tried to verbalize his or her own ideas, both for himself or herself and for peers, an example of articulation. It was, of course, also a relevant activity, leading to a relevant learning experience. The first student had some knowledge of regression analysis, but it was limited, as the student was well aware. The message ended with a request for clarification, prompting response activity from other students and leading to new reflection and finally appropriation, with the students acquiring knowledge from others through the sharing of reflections on the concept of regression analysis. The tutor was there to guide and stimulate the discussion.
The sample exchanges are an example of an attempt to achieve deep understanding of regression analysis, a difficult topic for an introductory course. There are more interesting discussions in the forums, but we do not have the space to deal with them here.
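For readers less familiar with the topic the students were grappling with, the core of simple linear regression can be shown in a few lines. This is our own illustration, not material from the course or the discussion: ordinary least squares fits a line y = a + bx, with slope b equal to the covariance of x and y divided by the variance of x.

```python
# Minimal simple linear regression by ordinary least squares
# (our illustration; not taken from the course material).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # Intercept a places the line through the point of means
    a = mean_y - b * mean_x
    return a, b

# Points lying exactly on y = 2x + 1 are recovered exactly
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # prints 1.0 2.0
```

With noisy data the same formula gives the line minimizing the sum of squared vertical distances, which is the idea the students in the thread were trying to articulate.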
We have used POLARIS for several years now, and in our experience the tool is easy to employ. Nevertheless, students need to be induced to use it, for example by posing interesting, stimulating questions that help them understand the subject. Students must come to see the advantage of learning through online discussion with their peers; in fact, their perception of learning has to change before they become active learners willing to adopt POLARIS as a tool.
For faculty, too, POLARIS can be an interesting and useful tool. Exchanges in POLARIS indicate where students are having difficulty in their learning, where misconceptions arise, and how these can be corrected by using appropriate content-specific pedagogical techniques. Used in this way, POLARIS can become a powerful tool for building content-specific pedagogical theories, at first with a limited scope specific to the situation for which they were developed. Gradually the scope of such a theory may become more general and may expand to encompass learning in a whole domain. In our experience, POLARIS is a powerful instrument for this kind of theory building.
References

Dillenbourg, P. (1999). What do you mean by collaborative learning? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1-19). Oxford: Elsevier.
Gilly, M. (1989). The psychosocial mechanisms of cognitive constructions: Experimental research and teaching perspectives. International Journal of Educational Research, 13 (6), 607-21.
Johnson, D. W., Johnson, R. T., Stanne, M. B., & Garibaldi, A. (1990). Impact of group processing on achievement in cooperative groups. Journal of Social Psychology, 130, 507-16.
Koschmann, T. D., Myers, A. C., Feltovich, P. J., & Barrows, H. S. (1993). Using technology to assist in realizing effective learning and instruction: A principled approach to the use of computers in collaborative learning. Journal of the Learning Sciences, 3, 227-64.
Ploetzner, R., Dillenbourg, P., Praier, M., & Traum, D. (1999). Learning by explaining to oneself and to others. In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 103-21). Oxford: Elsevier.
Ronteltap, C. F. M., & Eurelings, A. M. C. (1997). POLARIS: The functional design of an electronic learning environment to support problem based learning. Paper presented at the ED-MEDIA ’97 Conference. Calgary.
Ronteltap, C. F. M., & Eurelings, A. M. C. (2002). Activity and interaction of students in an electronic learning environment for problem-based learning. Distance Education, 23 (1), 11-22.
Schmidt, H. G. (1983). Problem based learning: Rationale and description. Medical Education, 17, 11-16.
Spivey, N. N. (1997). The constructivist metaphor: Reading, writing, and the making of meaning. San Diego: Academic Press.
Springer, L., Stanne, M. E., & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69 (1), 21-51.