Developing Design Principles to Scaffold ePBL: A Case Study of eSTEP

Introduction
Round 1: Moving Online with Parallel Play
Round 2: Getting Students Engaged
Round 3: Toward Meshed Interaction
Discussion
Acknowledgments
References

Cindy E. Hmelo-Silver
Sharon J. Derry

Introduction

Problem-based learning (PBL) is an effective approach to collaborative learning in professional education environments (Hmelo, 1998; Hmelo-Silver, 2004; Derry & Hmelo-Silver, 2005). It provides a cognitive apprenticeship in which students learn through solving problems and reflecting on their experiences. Students work in small collaborative groups with a facilitator who scaffolds the learning process. Effective transfer is promoted when students repeatedly bring together conceptual ideas underlying a domain with visions and plans of professional practice as they construct what we call a meshed schema representation (Derry, 2006; Derry & Hmelo-Silver, 2005). In PBL, learners study and discuss concepts in depth and apply them to practical problems, so they become highly trained in recognizing how these ideas and reasoning are used in varied problems across many cases of practice.

The subject of this chapter is the evolutionary design of the eSTEP system, a collection of tools, text, and video materials supporting an innovative online course in learning sciences for teacher education. eSTEP allows us to expose teachers to classroom practice through the use of video cases. Learning activities are designed to enable teachers to discuss and use course concepts while simultaneously perceptually encoding events from videos of actual classroom practice (Derry et al., 2005). These activities help learners build up schemas in which different and varied kinds of knowledge (declarative, procedural, and perceptual) are meshed together in ways that emphasize the deeper conceptual themes taught in the course. This theoretical idea of meshing was not a part of our initial instantiation of the PBL process in the eSTEP system; it evolved during our work, as described next.

The goal of this chapter is to present a narrative of how our design of eSTEP evolved as well as how our theory about learning from video was refined. All the design and testing rounds involved different groups of preservice teachers at both Rutgers University and the University of Wisconsin-Madison, where we work (see Chernobilsky et al., 2005; Derry et al., 2005). Our aim was to help preservice teachers understand how the learning sciences apply to classroom practice. Our data from the initial rounds of work (postings of ideas in the eSTEP environment) were interpreted as indicators of engagement. In later rounds, we collected additional detailed process data, such as frequency of posts and computer logs of students’ use of the system (Chernobilsky et al., 2005; Hmelo-Silver & Chernobilsky, 2004), as well as information about learning outcomes (Derry et al., 2005).

A traditional model of PBL was the initial point of departure for the design. In such a PBL environment, a group of 5-7 students typically works with its own facilitator (Barrows, 2000), who provides instructional guidance by scaffolding the learning process. Much of this scaffolding is in the form of metacognitive questions that help structure the group’s learning and problem-solving processes, assist the group in managing their time, and push them to think deeply (Hmelo-Silver, 2002). In addition to the scaffolding provided by the facilitator, a structured whiteboard helps support the group’s learning and problem solving. Typically, this whiteboard has four columns: facts, ideas (hypotheses about the causes of the problem and possible solutions to it), learning issues (concepts that the students need to learn more about in order to solve the problem), and an action plan (a “tickler” list). The whiteboard provides a focus for students to negotiate and represent their understanding of the problem and possible solutions, and it also guides discussion (Dillenbourg, 2002; Hmelo-Silver, 2003; Suthers & Hundhausen, 2003). Larger classes in which an instructor is not present in each group require additional scaffolding (e.g., a more structured whiteboard or activity structure) to support PBL (Steinkuehler et al., 2002). The first author (CHS) had used the PBL method in her educational psychology class with printed case material in a real classroom setting for two years and had identified some areas of weakness that computer-based scaffolding might address (Hmelo-Silver, 2000). Rather than having an assigned facilitator for each group, she used a wandering facilitator model. This allowed the facilitator only short periods of time with each of the groups and reduced the amount of scaffolding and monitoring that could be provided. In addition, the printed cases did not adequately convey the richness of classroom practice. Thus, prior to creating an online version of PBL activities, we identified several problems that we hoped an integrated online environment could address. However, putting PBL online required careful consideration of how the environment could serve to structure the learning process.
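To make the structure of this whiteboard concrete, the sketch below models its four columns as a simple data structure. It is an illustrative sketch only; the type and function names are our assumptions and do not come from any actual implementation.

```typescript
// A minimal sketch of the four-column PBL whiteboard described above.
// The column names come from the text; all identifiers are illustrative
// assumptions rather than an actual implementation.

interface WhiteboardEntry {
  author: string;   // group member who contributed the entry
  text: string;     // content of the entry
  postedAt: Date;   // when the entry was added
}

interface PBLWhiteboard {
  facts: WhiteboardEntry[];          // facts identified in the problem
  ideas: WhiteboardEntry[];          // hypotheses about causes and possible solutions
  learningIssues: WhiteboardEntry[]; // concepts the group needs to study further
  actionPlan: WhiteboardEntry[];     // the "tickler" list of next steps
}

// Example: a group records a new learning issue during discussion.
function addLearningIssue(board: PBLWhiteboard, author: string, text: string): void {
  board.learningIssues.push({ author, text, postedAt: new Date() });
}
```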

One major adaptation was a move from synchronous face-to-face discussion to asynchronous online discussion. There were two reasons for this adaptation. First, students engaging in asynchronous discussion tend to be more reflective (Andriessen, 2006; Bonk et al., 1998). Second, it is easier for a single instructor to facilitate multiple groups in an asynchronous environment than in a synchronous one. A second major adaptation was based on research suggesting that domain-specific scaffolding geared toward teacher education may be more effective in the online environment (Hmelo & Guzdial, 1996; Hmelo-Silver, 2006; Reiser, 2004) than the general whiteboard, appropriate for all subject matter, used in face-to-face PBL. To accomplish our goal of ePBL, we needed to design a whiteboard that would specifically promote principled instructional design activities. We wanted an online whiteboard (and other online tools) that would help structure the collaborative PBL process and promote productive learning interactions (Dillenbourg, 2002). And we wanted to strike a balance between productively constraining group interaction and allowing the process to remain student-centered.

In the remainder of the chapter, we describe the three design iterations of eSTEP. In the first round, we simply tried to move the traditional model of PBL online. We quickly learned that we needed to provide additional structure in the online environment. The second round added a great deal of structure as we tried to distribute some of the scaffolding onto the system. This structure ended up being overly complex, and as will be explained, issues with tool integration remained. The third round was successful as we developed engaging tasks, established a sufficient activity structure, and integrated the tools. In all three rounds, we learned that we needed to take into consideration the tasks and the activity structure in addition to the online tools.

Round 1: Moving Online with Parallel Play

Our initial goal was to create an online adaptation of PBL to help preservice teachers learn how the learning sciences apply to teaching practice. This activity structure focused on having students use the learning sciences to interpret, evaluate, and redesign actual video cases of classroom instruction. We used video to make the problems more realistic than they were in the printed versions. To support small-group interaction, the initial online environment included a notebook for individuals to record case analyses and reflections (Figure 2.1a), a structured group whiteboard to serve as a focus for negotiation as in a face-to-face environment (Figure 2.1b), and an asynchronous threaded discussion to allow students to engage in less-structured discussion. For instructional resources, in addition to standard textbooks, students had access to the Knowledge Web, an online learning sciences hypertextbook (DelMarcelle et al., 2002; Derry, 2006).

The first implementation of PBL required students to analyze a video case of science instruction in which the teacher was not achieving the learning outcomes that he had hoped for. The problem required students to redesign the video case based on an analysis of the case from a learning sciences perspective. To facilitate students’ initial individual analyses and subsequent group analysis and redesign, we constructed a structured individual notebook designed to scaffold the initial analysis, while argumentation was promoted through the group whiteboard, which provided a space for students to post their ideas and a place to post comments that identified the strengths (pro) and weaknesses (con) of the proposed redesign.
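As an illustration of how the round-1 whiteboard was organized, the sketch below models a redesign idea together with its pro and con comments. It is a hypothetical sketch; the identifiers and example content are assumptions for illustration, not the eSTEP implementation.

```typescript
// A hypothetical sketch of the round-1 group whiteboard: redesign ideas
// posted by students, each collecting comments tagged as strengths (pro)
// or weaknesses (con). Identifiers and content are illustrative only.

interface ProConComment {
  author: string;
  stance: "pro" | "con";  // strength or weakness of the proposed redesign
  text: string;
}

interface RedesignIdea {
  author: string;
  description: string;        // proposed redesign of the video case
  comments: ProConComment[];  // group evaluation of the idea
}

// Example: one student posts an idea and a peer flags a weakness.
const idea: RedesignIdea = {
  author: "Student A",
  description: "Open the lesson with a demonstration that elicits student predictions.",
  comments: [],
};

idea.comments.push({
  author: "Student B",
  stance: "con",
  text: "Predictions alone may not surface the misconceptions the lesson needs to address.",
});
```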

The specific prompts chosen in these scaffolds created representations that we had hoped would guide the discussion in productive ways (Suthers & Hundhausen, 2003). The initial activity structure was simple, with three phases. Students were asked, firstly, to do an individual case analysis; secondly, to conduct a collaborative analysis in the threaded discussion; and, thirdly, to develop a redesign proposal on the whiteboard. The design of this activity was tested in two groups that were experienced in PBL conducted in a face-to-face format. This particular PBL activity was the last of six that students were required to complete for their educational psychology course and it was conducted entirely online.

While both groups had functioned well in a face-to-face format, they were not very effective online. We identified three potential reasons. First, we observed a parallel play phenomenon—that is, the students did not coordinate their postings. They moved through the activity on parallel paths without meaningfully interacting. For example, one student might post a note, and another student might post another note 1-6 days later. As the facilitator (CHS) noted in her journal, “I am still frustrated with the parallel play aspect of the activity. I think that the first few times students do a problem like this they will need a lot of structure in the task, in terms of milestones and the required number of notes in a specified part of the site. As I have said before, a big problem is the disconnect between the web board and the whiteboard.” The students’ proposed solutions tended to be somewhat independent of each other. This is antithetical to the central tenet of PBL, that ideas are collaboratively reviewed, negotiated, and decided upon. Second, the structure of the activity was very broad. Norms of face-to-face interaction did not simply translate to the online environment. Although two weeks were allotted for the group phase of the activity, the students tended to think of that as a deadline and some did not post anything until the final day. Further, unlike in a face-to-face situation, where silence is awkward, in an online environment silence is difficult to break. The facilitator spent a great deal of effort e-mailing students to encourage them to get online and join the discussion. Third, technical issues hampered facilitation. For example, the facilitator could not post to the group whiteboard; if a student posted something to the whiteboard, the facilitator could only post a question in the threaded discussion, where there was no context for the question.

This initial experience identified several important issues, both theoretical and practical, that needed to be addressed before the next implementation round. First, the activity structure should more forcefully encourage interaction and discourage parallel play. Second, we had to adjust our expectations for how norms of interaction would transfer and develop online. Third, from a cognitive apprenticeship perspective, the representations and the activity structure needed to better scaffold students’ learning and problem solving (Collins et al., 1989; Hmelo-Silver, 2006). The activity structure and the multiple work spaces were not integrated in a way that afforded a truly collaborative approach to learning, which is a frequent challenge in developing computer-supported collaborative learning environments (Dillenbourg, 2002). Thus, this initial experience demonstrated the need for distributing some of the facilitation onto the interface and the activity structure (Steinkuehler et al., 2002).

Round 2: Getting Students Engaged

For the next round, we redesigned and restructured the activity to address the concerns mentioned above. Yet, it still needed to be a student-centered learning process that provided more milestones for the students’ activity. We first reconceived the phases of the activity and their milestones as time frames rather than as deadlines to emphasize the continuity of the activity. We divided the activity into 12 discrete steps to help students manage their time and effort as shown in the road map in Figure 2.2a. The titles of the steps more clearly communicated what students might expect in each part of the PBL activity. In addition, the task itself was simplified. Rather than having students redesign a lesson, they engaged in a collaborative conceptual analysis of two minicases, chosen from a complete video case that contained ten minicases. The prompts in the individual notebook were designed to focus students on pertinent aspects of the case and to help them make decisions about which minicases they would analyze and the concepts they would explore in depth (Figure 2.2b). After group analysis, students would design their own individual lessons. The threaded discussion board was the place for students to decide on the minicases for analysis, choose concepts to explore, and make comments for a scribe (chosen by the group) to incorporate into the conceptual analysis in the whiteboard tool (Figure 2.2c). The activity was designed to be conducted completely online with initial individual analysis, followed by joint group analysis of a section of the case (the minicase) and, finally, individual design of a lesson. This design overcame the parallel play problem, as students posted their ideas and responded to each other’s ideas (DelMarcelle & Derry, 2004; Hmelo-Silver & Chernobilsky, 2004). There was also a great deal of interaction, but this occurred entirely in the threaded discussion.
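The division of labor in round 2, in which negotiation happened in the threaded discussion while only the designated scribe could write to the whiteboard, can be made concrete with the sketch below. It is a hypothetical illustration; none of these identifiers come from the actual eSTEP software.

```typescript
// A hypothetical sketch of the round-2 workflow: ideas are negotiated in
// a threaded discussion, and only the group's designated scribe may post
// the agreed conceptual analysis to the whiteboard.

interface DiscussionPost {
  author: string;
  thread: string;  // e.g., "Which minicases should we analyze?"
  text: string;
}

class Round2Whiteboard {
  analysis: string[] = [];

  constructor(private scribe: string) {}

  // Only the scribe may write to the whiteboard, which made it hard to
  // integrate the threaded discussion with the conceptual analysis.
  postAnalysis(author: string, text: string): boolean {
    if (author !== this.scribe) return false;
    this.analysis.push(text);
    return true;
  }
}

// Example: a non-scribe's attempt to post directly is rejected.
const whiteboard = new Round2Whiteboard("Scribe Student");
whiteboard.postAnalysis("Other Student", "Kyle should hand over control of the discussion."); // returns false
```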

Reflecting on this implementation, we identified two major problems with this design. First, the whiteboard itself did little to focus group discussion in productive ways. We had succeeded in getting students engaged in the activity, but not always productively. Often their posts either involved elaborate conceptual discussions with only weak connection to the case or, conversely, were firmly grounded in the case but made only superficial connection to conceptual ideas. For example, Figure 2.2c shows the students describing at length what the teacher, Kyle, should be doing. They make some connection to the specifics of the case (e.g., about Kyle controlling the discussion) but provide no evidence to back up their views. For instance, although the students exhibited a clear preference for a student-centered discussion, it is not clear how well they understood why this should enhance student learning. From a procedural standpoint, the roles that students needed to play in this activity were not optimal for learning. For example, negotiation could only be conducted in the threaded discussion, and only the student designated as the scribe could post entries on the whiteboard for the entire group, which made it difficult to integrate the threaded discussion with the conceptual analysis on the whiteboard. The lack of integration made facilitation difficult, as in round 1. This experience demonstrated that the design had to provide representations that could guide anchored collaboration (Guzdial et al., 1997; Hmelo et al., 1998; Suthers & Hundhausen, 2003). In anchored collaboration, discussion is “anchored” around the artifact being discussed, which means students need to be able to comment directly on the whiteboard; this was not possible in this implementation.

Second, the activity provided structure, but a complex one. The discussion board was the major place where students worked, with this particular group posting 147 notes in 18 threads over six weeks, but it offered no guidance to focus the students. Many posts were devoted to choosing the specific minicases to examine and deciding which concepts to explore (DelMarcelle & Derry, 2004). The next round’s design needed to get students beyond procedural issues and toward deep discussion. In addition, the conceptual analysis was not well connected to the students’ individual lesson designs; thus, the activity structure needed to be more coherent.

Besides addressing the practical problems we identified during round 2, we also developed theory about how students learn in complex knowledge domains. As we argued previously (Derry, 2006; Derry et al., 2005), the transfer of ideas from the classroom to future practice requires that students develop representations that support complex forms of cognitive “meshing” of concepts, skills, and perceptual visions of practice. We conceptualize both pre-professional learners and practitioners as people who experience their environments through cognitive processes that are essentially perceptual in nature. These processes involve perceiving situations in the environment, which activates complex cognitive patterns within individuals, and then responding to those perceived situations with understanding and action in ways that should transfer course knowledge. It is thus important to teach so that the ideas, concepts (e.g., attention, metacognition), and skills (e.g., scaffolding, reciprocal teaching) covered in our courses can later be assembled to support planning and appropriate action in subsequent practice. This evolving theory started as a refinement of early ideas about PBL, cognitive flexibility theory (Spiro et al., 1988), and cognitive apprenticeship (Collins et al., 1989), but it became something quite different as we compared our design intentions with the actual implementations of our first two rounds, and as we compared the face-to-face with the online discussions (Derry, 2006; Hmelo-Silver, 2000; Chernobilsky et al., 2004). Through our whiteboard design and activity structure, we needed to support interactions that meshed the perceptual information derived from the video-based problems with relevant conceptual ideas.

Round 3: Toward Meshed Interaction

We made a number of changes for our third design round. First, we reduced the complexity of the activity structure, as the revised road map in Figure 2.3a shows. Rather than focusing on a conceptual analysis task, we structured the problems to require either redesign of the lesson presented in a video or adaptation of techniques shown in a video. Thus, there was a clearer and more authentic problem for the students to work on. We also used a hybrid activity structure in which some of the activity occurred online but steps that required students to discuss procedural issues occurred face to face (e.g., deciding which concepts to explore). We explicitly embedded the backward design process, developed by Wiggins and McTighe (1998), throughout the online activity, beginning with the pre-analysis recorded in the individual notebook (Figure 2.3b). This helped provide a structure for the activity as a whole and for the whiteboard and notebook tools in particular, as it provided a principled model for instructional planning. In this approach, students began by identifying the major understandings (instructional goals) they wanted their students to attain, then considered what might count as evidence of those understandings, and finally planned instructional activities that would provide that evidence.
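The essential point of backward design is the ordering of those three decisions. The sketch below captures that ordering in a simple structure; the field names and example content are illustrative assumptions, not part of eSTEP or of Wiggins and McTighe’s materials.

```typescript
// A minimal sketch of the backward design sequence (Wiggins & McTighe, 1998)
// that structured the round-3 activity: goals first, then acceptable
// evidence, then activities that produce that evidence.
// All names and example content are illustrative assumptions.

interface BackwardDesignPlan {
  understandings: string[]; // 1. major understandings (instructional goals) students should attain
  evidence: string[];       // 2. what would count as evidence of those understandings
  activities: string[];     // 3. instructional activities that would provide that evidence
}

const examplePlan: BackwardDesignPlan = {
  understandings: ["Charge transfer explains everyday static electricity phenomena"],
  evidence: ["Students predict and then explain the outcomes of simple electrostatics demonstrations"],
  activities: ["A prediction sheet followed by hands-on experiments and a whole-class debriefing"],
};
```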

With meshing as a goal, we reflected on the prior implementations, in which the activity structures and group whiteboard seemed to hinder complex discussion. We redesigned the group whiteboard to more seamlessly connect design or adaptation proposals to discussion spaces. In this round, the whiteboard included an integral discussion space, as shown in Figure 2.3c. In the example shown, students were using contrasting cases of lessons on static electricity. The students would post ideas or proposals for redesign, and the board would automatically attach a comment space for each group member to reflect on, evaluate, and provide feedback on the proposal. This design addressed two critical issues. First, it clearly represented potential solutions to the instructional problem along with the associated discussion of each solution. Second, these two spaces were physically connected, allowing students and facilitators to easily interact with one another. In particular, it allowed the facilitator to help support perceptual-conceptual meshing by posting questions that were anchored to the students’ comments. After a student entered a proposal, each group member had a space to comment on that specific proposal. Not shown in the figure is the facilitator’s comment: “Great discussion folks—what is the psychological rationale for having students work on experiments? See Sally’s comment below as well.” As this example shows, the students were all able to post responses to this proposal as well as to pose questions to the rest of the group (e.g., “What does everyone else think?”). In addition, although the software limited each participant to a single comment on any proposal (although this comment could be added to or edited), this group developed a norm of maintaining all their previous comments in their individual response boxes. This design accomplished our aim of making the whiteboard the focus for negotiation. In this particular problem, which the students worked on for two weeks, there were 50 posts on the whiteboard and only 20 posts in total in the discussion board, mostly for the purpose of sharing research. Students were engaged in productive ways, as demonstrated by the number of posts and the detailed analyses of student discourse (Chernobilsky et al., 2005). Over several semesters of using this design, students in the eSTEP environment demonstrated significant learning gains compared with students in a comparison group (Derry et al., 2005).
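The key structural change was that each proposal carried its own anchored comment space, with a single editable slot per group member (including the facilitator). The sketch below illustrates that constraint; the class and method names are assumptions for illustration, not the actual eSTEP software.

```typescript
// A hypothetical sketch of the round-3 anchored whiteboard: each proposal
// owns a map from group member to a single editable comment. Posting again
// replaces the slot, which is why the group kept prior comments in their
// response boxes, as described above. Identifiers are illustrative assumptions.

interface AnchoredProposal {
  author: string;
  text: string;
  comments: Map<string, string>; // one comment slot per member (and the facilitator)
}

class AnchoredWhiteboard {
  proposals: AnchoredProposal[] = [];

  addProposal(author: string, text: string): AnchoredProposal {
    const proposal: AnchoredProposal = { author, text, comments: new Map() };
    this.proposals.push(proposal);
    return proposal;
  }

  // A comment is anchored to a specific proposal; commenting again replaces
  // (or extends within) the member's single slot rather than adding a new post.
  comment(proposal: AnchoredProposal, member: string, text: string): void {
    proposal.comments.set(member, text);
  }
}

// Example: the facilitator's question is attached directly to a student proposal.
const board = new AnchoredWhiteboard();
const p = board.addProposal("Carrie", "Use a prediction sheet before the experiments.");
board.comment(p, "Facilitator", "What is the psychological rationale for having students work on experiments?");
```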

This design clearly accomplished the goal of creating interaction focused on the students’ proposals for assessment and activities. What evidence then do we have that they were meshing the conceptual and perceptual ideas? In one of the proposals, Maria wrote in somewhat general terms about applying cognitive apprenticeship to the design of a learning activity, specifically using the notion of scaffolding, as well as about the need to apply what is learned to a real-life situation. Carrie proposed the idea of a prediction sheet and then Maria, in her comments, talked a little more about the need to help structure the activity. Carrie jumped in at one point to ask about the connection to the video they watched. The group worked together to clarify their understanding of what they saw in the video as well as to refine and specify an activity that might fit their notion of cognitive apprenticeship, in particular focusing on scaffolding and context. Sally later posted, in response to Carrie’s ideas about experiments, and added the notion of deliberate practice to help the students learn inquiry skills. Elsewhere, Linda proposed that the teacher engage the class in concept learning and initially provided a fairly decontextualized description but concluded with “Blair Johnson [a teacher in the video] should use this idea so that static electricity does not become an isolated concept in students minds that they will not be able to use. Finally, teachers should use concepts in ‘REAL LIFE SITUATIONS’ as this has been shown to ‘increase chances of transfer, link ideas to prior knowledge, and decrease chances of misconceptions.’ ” Here Linda was meshing notions of concept learning with general advice to the teacher about applying concepts to real-life situations.

These are just a few examples of many. We attribute part of the success in creating mesh to providing this anchored collaboration environment, in which students’ (and the facilitator’s) contributions and reflections were connected to authentic planning activities and the rich perceptual experiences the video cases afforded. In addition, in this round, the facilitator became a full participant in the group work. With the allocation of a specific space for comments on the whiteboard, the group members responded to the facilitator’s efforts in scaffolding learning. The group shown in these examples was particularly good at meshing their conceptual and perceptual ideas. They also identified some limitations in the system. As they added to their one allowed “comment post,” students developed a norm of labeling their comments as old and new so they could keep track of their discussion. For groups to engage in lengthy discussions, we need to keep the advantages of the integrated space but at the same time find a way to give the groups room to grow. Although there is still work to be done and also variability in how groups use the whiteboard, we have constructed a whiteboard design that meets our initial practical goal of providing a space for negotiation as well as our later theoretical goal of supporting conceptual-perceptual meshing.

Discussion

This chapter tells the story of the evolution of our system design through the generation of hypotheses about what features would support the kinds of discussion we hoped to promote, followed by implementation of the design, and finally critical analysis of interaction patterns. Our experience in designing for productive interaction led us from a focus on the pragmatic issues of taking an effective instructional model and adapting it for online use to a focus on instantiating a theory of how people effectively learn from cases. We went from an underconstrained environment to a highly scripted version and found an appropriate middle ground that met our instructional and theoretical goals (Dillenbourg, 2002). The eSTEP experience also demonstrated the affordances of different collaboration spaces for different kinds of activity. The first two implementations of eSTEP showed that asynchronous spaces do not work very well for task-related negotiation, which is better conducted synchronously. Asynchronous discussion can work very well for other parts of the activity structure, provided the tools are integrated. In particular, anchoring the discussion around students’ proposed solutions works well in this mode.

As Dillenbourg (2002) notes, designs for computer-supported collaborative learning need to incorporate activity structures that integrate disparate individual and collaborative activity phases as well as face-to-face and computer-mediated communication. Such designs also have to incorporate signposts that help students manage their time, and they must take into account the important role of the facilitator. Problems may emerge in such designs, several of which were experienced in the evolution of eSTEP. In particular, our first two rounds of design disturbed the kinds of natural interaction that needed to occur. In round 1, the interface did not make it clear how the whiteboard and the threaded discussion were related, so these tools failed to shape the collaboration in productive ways. In round 2, the activity structure did not help the students come to consensus on their choices of what concepts and minicases to focus on; in addition, the complexity of the activity structure in this round may have served to increase the students’ cognitive load.

In the tradition of design experiments, our goals were twofold: to develop and refine theories about learning and to “engineer” the means to support that learning (Cobb et al., 2003). Our iterative design process was a reflexive one, in which our instructional theory both informed and evolved from our system design. In the beginning, having never implemented a collaborative activity in an online environment, we expected to make some mistakes. Simply learning the constraints and affordances of the technology, and how students would interact with it, was an initial goal. As we began to see how we could effectively use technology to support PBL, we refined important components of the instructional environment and began to better understand the connection between the activity structure, the group whiteboard, and the students’ collaborative knowledge construction. We developed a theory that connects three critical components of our instructional environment and describes how they contribute to effective learning that promotes transfer to professional practice. The first component, the learning sciences or conceptual component, was the foundation that we began with. In the beginning, the primary goal was to teach students the learning sciences and how they can be used to inform instruction. To this end, we utilized face-to-face PBL activities that required students to analyze and design instruction. The second component was the planning or design component. We wanted students to learn the connection between instructional theory and design, as well as to develop design skills that blended the two seamlessly. As we moved activities online and began to experiment with video cases, we began to understand the need for students to ground their conceptual and design knowledge in knowledge of actual classroom practice, the third component. Unlike printed material, video provides an unparalleled perceptual experience. The latest round of eSTEP design attempted to authentically and meaningfully connect these components of the instructional environment. Our theory suggests that the rich conceptual and perceptual meshing that eSTEP affords will help preservice teachers transfer their learning sciences knowledge to their future teaching practice.

Our experience in designing eSTEP has important practical implications for designing similar systems. First is the consideration of the complexity of the learning environment as a whole. We began by developing tools but, like many other designers, soon learned that we needed to consider the activity structure and the nature of the problem itself (see, e.g., Edelson, 2005). Second is the consideration of how the PBL model may have to be adapted to different domains. Domain-specific scaffolding (such as our structuring of a design activity in terms of specific “backward design” steps) is especially important when a facilitator must work with many groups in different domains, and some of the scaffolding may need to be distributed onto the system. This requires consideration of how the general scaffolds from PBL need to be modified; in our case, we used the backward design structure for instructional design problems. Finally, the system should support learning. While this may seem an obvious statement, it is not always so obvious how the different components of a complex learning environment fit together seamlessly. The tools need to be reliable, the interface needs to be clear and intuitive, and the activity structure needs to integrate all the tools.

Acknowledgments

This research was funded by NSF ROLE grant # 0107032 and the Joyce Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. We especially thank David Woods, Ellina Chernobilsky, and Matt DelMarcelle, our partners in developing the system. We thank all the students who helped us learn as they participated in our eSTEP courses.

References

Andriessen, J. (2006). Collaboration in computer conferencing. In A. M. O’Donnell, C. E. Hmelo-Silver & G. Erkens (Eds.), Collaborative learning, reasoning, and technology. Mahwah, NJ: Erlbaum.

Barrows, H. S. (2000). Problem-based learning applied to medical education. Springfield, IL: Southern Illinois University Press.

Bonk, C. J., Hansen, E. J., Grabner-Hagen, M. M., Lazar, S. A., & Mirabelli, C. (1998). Time to “connect”: Synchronous and asynchronous dialogue among preservice teachers. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 289-314). Mahwah, NJ: Erlbaum.

Chernobilsky, E., DaCosta, M. C., & Hmelo-Silver, C. E. (2004). Learning to talk the educational psychology talk through a problem-based course. Instructional Science, 32, 319-56.

Chernobilsky, E., Nagarajan, A., & Hmelo-Silver, C. E. (2005). Problem-based learning online: Multiple perspectives on collaborative knowledge construction. In T. Koschmann, D. D. Suthers & T.-W. Chan (Eds.), Proceedings of the Computer Supported Collaborative Learning Conference CSCL 2005 (pp. 53-62). Mahwah, NJ: Erlbaum.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32, 9-13.

Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 453-94). Hillsdale, NJ: Erlbaum.

DelMarcelle, M., & Derry, S. J. (2004). A reflective analysis of instructional practice in an online environment. Paper presented at the Sixth International Conference of the Learning Sciences. Santa Monica, CA.

DelMarcelle, M., Derry, S. J., & Hmelo-Silver, C. E. (2002). Identifying antecedents to tutorial interactions in online PBL discourse. Paper presented at the American Educational Research Association Annual Meeting. New Orleans, LA.

Derry, S. (2006). eSTEP as a case of theory-based web course design. In A. M. O’Donnell, C. E. Hmelo-Silver & G. Erkens (Eds.), Collaborative learning, reasoning, and technology. Mahwah, NJ: Erlbaum.

Derry, S. J., & Hmelo-Silver, C. E. (2005). Reconceptualizing teacher education: Supporting case-based instructional problem solving on the World Wide Web. In L. PytlikZillig, M. Bodvarsson & R. Bruning (Eds.), Technology-based education: Bringing researchers and practitioners together. Greenwich, CT: Information Age Publishing.

Derry, S. J., Hmelo-Silver, C. E., Feltovich, J., Nagarajan, A., Chernobilsky, E., & Halfpap, B. (2005). Making a mesh of it: A STELLAR approach to teacher professional development. In T. Koschmann, D. D. Suthers & T.-W. Chan (Eds.), Proceedings of CSCL 2005 (pp. 105-14). Mahwah, NJ: Erlbaum.

Dillenbourg, P. (2002). Over-scripting CSCL: The risks of blending collaborative learning with instructional design. In P. A. Kirschner (Ed.), Three worlds of CSCL (pp. 61-91). Heerlen: Open Universiteit Nederland.

Edelson, D. C. (2005). Engineering pedagogical reform: A case study of technology supported inquiry. Paper presented at the Inquiry Conference on Developing a Consensus Research Agenda. Piscataway, NJ. http://www.ruf.rice.edu/~rgrandy/NSFConSched.html.

Guzdial, M., Hmelo, C., Hübscher, R., Nagel, K., Newstetter, W., Puntambekar, S., Shabo, A., Turns, J., & Kolodner, J. L. (1997). Integrating and guiding collaboration: Lessons learned in computer-supported collaborative learning research at Georgia Tech. In R. Hall, N. Miyake & N. Enyedy (Eds.), Proceedings of the Computer Support for Collaborative Learning Conference CSCL ’97 (pp. 91-100). Toronto: University of Toronto.

Hmelo, C. E. (1998). Problem-based learning: Effects on the early acquisition of cognitive skill in medicine. Journal of the Learning Sciences, 7, 173-208.

Hmelo, C. E., & Guzdial, M. (1996). Of black and glass boxes: Scaffolding for learning and doing. In D. C. Edelson & E. A. Domeshek (Eds.), Proceedings of ICLS ’96 (pp. 128-34). Charlottesville, VA: Association for the Advancement of Computing in Education.

Hmelo, C. E., Guzdial, M., & Turns, J. (1998). Computer-support for collaborative learning: Learning to support student engagement. Journal of Interactive Learning Research, 9, 107-30.

Hmelo-Silver, C. E. (2000). Knowledge recycling: Crisscrossing the landscape of educational psychology in a problem-based learning course for preservice teachers. Journal on Excellence in College Teaching, 11, 41-56.

Hmelo-Silver, C. E. (2002). Collaborative ways of knowing: Issues in facilitation. In G. Stahl (Ed.), Proceedings of Computer Supported Collaborative Learning Conference CSCL 2002 (pp. 199-208). Hillsdale, NJ: Erlbaum.

Hmelo-Silver, C. E. (2003). Analyzing collaborative knowledge construction: Multiple methods for integrated understanding. Computers and Education, 41, 397-420.

Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review, 16, 235-66.

Hmelo-Silver, C. E. (2006). Design principles for scaffolding technology-based inquiry. In A. M. O’Donnell, C. E. Hmelo-Silver & G. Erkens (Eds.), Collaborative learning, reasoning, and technology. Mahwah, NJ: Erlbaum.

Hmelo-Silver, C. E., & Chernobilsky, E. (2004). Understanding collaborative activity systems: The relation of tools and discourse in mediating learning. In Y. B. Kafai, W. A. Sandoval, N. Enyedy, A. S. Nixon & F. Herrera (Eds.), Proceedings of the Sixth International Conference of the Learning Sciences (pp. 254-61). Mahwah, NJ: Erlbaum.

Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. Journal of the Learning Sciences, 13, 273-304.

Spiro, R. J., Coulson, R. L., Feltovich, P. J., & Anderson, D. K. (1988). Cognitive flexibility theory: Advanced knowledge acquisition in ill-structured domains. In Proceedings of the Tenth Annual Conference of the Cognitive Science Society (pp. 375-83). Hillsdale, NJ: Erlbaum.

Steinkuehler, C. A., Derry, S. J., Hmelo-Silver, C. E., & DelMarcelle, M. (2002). Cracking the resource nut with distributed problem-based learning in secondary teacher education. Journal of Distance Education, 23, 23-39.

Suthers, D. D., & Hundhausen, C. D. (2003). An experimental study of the effects of representational guidance on collaborative learning processes. Journal of the Learning Sciences, 12, 183-218.

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
