Shari L. Jackson, Steven J. Stratford
Joseph Krajcik, Elliot Soloway
University of Michigan
Paper presented as part of a Symposium, Finkel, E., (Chair), The ScienceWare Project: Supporting Modeling and Inquiry via Computational Media and Technology, conducted at the annual meeting of the National Association for Research on Science Teaching, April 1995, San Francisco, CA. Another version of this paper was presented at the annual meeting of the American Educational Research Association.
Modeling, for pre-college science students, is an out-of-reach cognitive activity. Professional tools are too hard for novices to use: they are not user-friendly and provide no support for learners. We have designed a new modeling tool, Model-It, to provide scaffolding for learners, enabling them to build and test dynamic models of complex systems easily, using object-oriented and qualitative techniques. Model-It provides scaffolding that grounds the learner in prior knowledge and experience, that bridges the learner from novice to expert understandings and practices, and that couples the learner's mental model with testing actions and model feedback. Our user and classroom testing shows evidence that the scaffolding strategies of Model-It do indeed support learners' active construction of knowledge, and that students can create meaningful models.
It makes you think more about a real-life situation, where there's no real answer, you set it up and everything.
This statement by a 9th-grade science student about our software, Model-It, speaks volumes about why constructing models in an interactive learning environment can be valuable. "It makes you think": if learning is active construction of knowledge, with the emphasis on 'active,' then this student is an active learner. The thinking is directed toward "a real-life situation": an authentic, situated context where the learning has meaning. "There's no real answer": she realizes that any answer she finds will be tentative and that the process of generating an answer will be one of inquiry and investigation. "You set it up and everything": by emphasizing the word 'you,' she indicates that the modeling problem is personally meaningful and valuable, and recognizes that "set[ting] it up and everything" will require her active participation.
Scientists build models to test theories and to improve their understanding about complex systems. In this exploratory, speculative style of modeling,
Simulation is used at a prototheoretical stage, as a vehicle for thought experiments. The purpose of a model lies in the act of its construction and exploration, and in the resultant, improved intuition about the system's behavior, essential aspects and sensitivities. (Kreutzer, 1986)
As our opening quote illustrates, students, too, can benefit from building models in order to develop their own understanding of natural phenomena. We should encourage theory-building and experimentation, since both are important activities of science, in order to give students a realistic view of the science profession (Tinker, 1990). Building models gives them opportunities to use their existing knowledge, to perform thought experiments and to gain insight into the behavior of complex systems.
The problem is that modeling as it is currently practiced is too hard for students to do: it requires too much prior knowledge and mathematical ability. However, by redefining the modeling task, modeling can be made accessible to high school science students. For example, the recent Project 2061 curriculum reforms suggest a high-level, qualitative approach to modeling:
In modeling phenomena, students should encounter a variety of common kinds of relationships depicted in graphs (direct proportions, inverses, accelerating and saturating curves, and maximums and minimums) and therefore develop the habit of entertaining these possibilities when considering how two quantities might be related. None of these terms need be used at first, however. `It is biggest here and less on either side' or `It keeps getting bigger, but not as quickly as before' are perfectly acceptable, especially when phenomena that behave like this can be described (American Association for the Advancement of Science, 1993).
The challenge in making modeling accessible in the pre-college science classroom is to create a modeling environment which requires minimal prior knowledge from other domains, which incorporates advanced interface design, and which not only enables rapid generation of simple models, but facilitates the learner's transition toward more expert-like modeling practices. We have designed a constructivist interactive learning environment, Model-It, that provides qualitative expression and utilizes learner-centered design techniques to meet this challenge. In this paper, we discuss the theories, strategies, and techniques that were applied in its design and implementation, and the results of our first year of classroom testing. We also explore how the scaffolding designed into Model-It makes modeling accessible, and speculate on the cognitive mechanisms involved.
Several researchers have introduced computer-based modeling into high school and middle school classrooms (Mandinach & Thorpe, 1988; Mandinach & Cline, 1992). However, modeling proved extremely difficult for students. First, students have sometimes had to learn to program in order to create working models. Other studies have found that students lack the requisite mathematical knowledge for creating rigorous quantitative models, or for properly interpreting the graphical output of models (Roberts, 1985; Feurzeig, 1992). In these cases modeling was out of reach mainly because the cognitive load was simply too great: students lacked the requisite prior knowledge in the mathematical and programming domains.
Sometimes modeling is inaccessible because the software environment in which the modeling activities occur is not designed to be user-friendly (for example, when models must be constructed with a command-line interface), or because the modeling environment incorporates large numbers of expert-level modeling functions. These kinds of environments are designed for experts, to enable the construction of sophisticated, mathematically precise models; learners, however, are easily overwhelmed or discouraged by confusing interfaces and too many options.
Briefly, here are the tenets upon which our research is built. We have elaborated upon our theory elsewhere (Blumenfeld et al., 1991; Krajcik et al., 1994; Soloway et al., 1995).
The process of learning is one of active construction of knowledge, as opposed to passive reception of delivered information (Piaget, 1954; Resnick & Glaser, 1976; von Glasersfeld, 1989; Vygotsky, 1978). In this constructive process, learners assimilate experiences into existing cognitive structures, with the goal of developing a structure and depth of knowledge that approaches that of an expert in the domain under study (von Glasersfeld, 1989; Resnick & Glaser, 1976). By constructing external representations (models) of scientific phenomena, learners can build and refine their internal, mental models of the phenomena (Papert, 1980; Draper & Swanson, 1990).
Learners need to be exposed to authentic problems and participate in authentic practices. Authentic problems are contextualized, real-world and non-trivial, and encompass worthwhile content (Blumenfeld et al., 1991). Professional practices are usually authentic and meaningful. However, learners have little knowledge of or experience with professional practices. So attempts to enculturate learners into professional practices must be supported, for example, by providing symbol systems that have meaning in the learner's knowledge and experience, but that help move the learner toward the use of symbol systems typical of professional practice (Kozma, 1991).
Learning tasks should be anchored; new knowledge should be situated (Brown, Collins & Duguid, 1989). "Tasks that are anchored in environments that permit sustained exploration by students and teachers ... enable them to understand the kinds of problems and opportunities that experts in various areas encounter and the knowledge that experts use as tools" (CGTV, 1990). In this way students avoid the problem of inert knowledge and are better able to transfer old knowledge to new problems. Tasks, such as modeling, which allow learners to investigate and re-create real-world phenomena, are well matched with techniques of anchored instruction.
Learning occurs within a community, not in isolation. For example, while certain aspects of professional scientific inquiry are competitive, knowledge construction is usually collaborative. Much scientific inquiry occurs within a particular social context, with expertise distributed among collaborators. Learning environments that are intended to encourage collaboration and negotiation of meaning may be particularly instrumental in supporting the development of a community of learners within the classroom (Roth & Bowen, in press; Scardamalia & Bereiter, 1994).
The range of activities possible to learners can be amplified and extended with cognitive tools. We feel that computer-based learning environments in which the learner actively constructs an artifact have strong potential for promoting cognitive growth in the learner. Cognitive capacities are extended while using the technology; learners not only accomplish more with the computer than would be possible without, but also engage in much more mindful learning than without (Salomon, 1990).
We apply these theoretical tenets to the cognitive activity of model construction as a way to understand complex systems. When students build and test models of familiar natural phenomena, using supportive cognitive tools, in authentic and interesting contexts, with the support of their peers in the classroom community, they are able to test and refine their mental representations and understandings of that system. What follows is a description and rationale for the kinds of strategies that Model-It incorporates in order to support the model building and testing processes.
In the Highly Interactive Computing (HI-C) group at the University of Michigan, we have formulated a rationale for learner-centered design (LCD) (Soloway et al., 1994; Jackson et al., 1995). In designing software for education, we are mindfully designing for learners. Learners are also users, so the principles of user-centered design certainly apply (Norman & Draper, 1986). Graphical user interfaces are easier to use, more intuitive, and more engaging than command-line interfaces. Mouse-driven user-interactive features such as pop-up menus, active screen areas, and button-driven commands lighten cognitive overhead and encourage active cognitive engagement with the task (Hutchins, Hollan, & Norman, 1986).
However, user-centered design guidelines are not sufficient to address certain unique needs of learners, such as intellectual growth, diversity of learning styles, and motivational needs. For example, learners need software that represents information in a way that is familiar to them, but that also helps introduce them to more professional or symbolic representations. By designing software to help learners grow from apprenticeship towards mastery, we create an environment that is truly learner-friendly.
The central claim of LCD is that software can incorporate learning supports (scaffolding) that address the learner's needs. Scaffolding is important because it enables students to achieve goals or accomplish processes that would not be possible without its support and that are normally out of their reach (Vygotsky, 1978; Wood, Bruner, & Ross, 1975). Vygotsky said that these goals or processes are in the learner's zone of proximal development. This concept is variously expressed as enabling learners to engage in out-of-reach activities; having a knowledgeable other or more capable peer to bring the learner along; having something or someone share the cognitive load. The choice of goal and process is important. We want to scaffold tasks, such as modeling, that are rich learning experiences. The environment in which the learning takes place, such as that provided by a computer program and/or a classroom, also contributes to the scaffolding and to the value, or lack thereof, of the learning experience.
Scaffolding strategies can be implemented in many ways. Software-realized scaffolding strategies for programming environments include coaching, communicating process, and eliciting articulation (Guzdial, 1993; 1995). In a recent paper, we describe a general framework for software-realized scaffolding (Soloway et al., 1995). In this paper, however, we will focus on those scaffolding strategies that we believe are particularly appropriate for supporting model building and testing.
Grounding in Experience and Prior Knowledge. The learning environment should allow learners to create models with representations that are grounded in current experiences and understandings (CGTV, 1990; Brown, Collins & Duguid, 1989). The modeling environment, therefore, should be personally meaningful and approachable, and should allow opportunities for personalization.
Bridging Representations. New information representations should be connected to the learner's current understandings through examples, analogies, and multiple visual representations (Clement et al., 1989). These multiple synchronized representations can help move the learner to more expert-like model building techniques and understanding.
Coupling Actions, Effects, and Understanding. The interactive learning environment provides a tight coupling between the learner's actions while testing the model, the visual feedback produced by the software as a result of testing actions (Mokros & Tinker, 1987), and the learner's own mental representations of the phenomenon (Draper & Swanson, 1990). Since the learner actually built the model he's interacting with, this tight coupling provides a way for him to run his own mental model of the phenomenon, and to perform thought experiments (Kreutzer, 1986) on an externalized representation of his thinking.
In the later description of the Model-It program, we will elaborate on each of these scaffolding techniques and describe how each is implemented in Model-It. In the following section, however, we turn to some recent developments in the field of ecological modeling.
The scientific literature on modeling and simulation, and especially on ecological modeling, provides some new ways of thinking about modeling that we believe are especially beneficial for learners. In particular, there is a growing interest by the scientific community in the application of object-oriented languages for modeling and simulation, and the application of qualitative techniques for knowledge representation. We think that these object-oriented programming paradigms and qualitative knowledge representation techniques can be utilized to design and create modeling environments that provide grounding, bridging and coupling scaffolding for learners. We will discuss the advantages of these two techniques in the design of modeling environments, and the aspects of those environments that affect student learning.
Object-oriented programming languages are particularly appropriate for the design and implementation of computer-based models:
An object-oriented program is just like the representation of some part of the world in a human's mind, with all its facts, associations and beliefs closely corresponding to our intuitions. Simulation is then akin to thought experiments: an animation of this symbolic structure according to some rules of interaction. (Kreutzer, 1986)
Particularly in ecosystem modeling, object-oriented approaches offer a natural mapping to the phenomena being modeled. Constructing models of natural systems as sets of differential equations or as algorithms in a procedural language like FORTRAN does not lead one to think about the real objects in the system. But object-oriented models of ecosystems provide a clear link to observations of the natural world in which individuals and objects interact and affect the current states (attributes) of one another (Saarenmaa, Stone, Folse, Packard, Grant, Maleka, & Coulson, 1988).
An object-oriented approach to ecosystem modeling also has advantages for a modular implementation of modeling (Silvert, 1993a, 1993b). Following modular design for ecosystem modeling makes sense not only as good programming style, but also because it allows the independent representation of different populations (objects) in the ecosystem. Traditionally, the rate of change of a population is represented by a single differential equation, even when that equation involves variables that may be associated with several different populations. For example, the rate of change of a fish population might be described by the following equation:
d(biomass of fish)/dt = (growth) + (recruitment) - (reproductive outputs) - (natural mortality) - (fishing mortality) + (net migration)
Using object-oriented representations, one can instead implement a distributed derivative (Silvert, 1993b) in which each variable's effect on the rate of change is represented by a separate equation, and the overall effect is built up as a sum of the effects generated by each variable. The advantages of this approach are that each equation becomes simpler to create and to understand, and that the impact of new populations can be easily added to the system without changing existing equations. It is this approach that forms the basis of the implementation of relationships in Model-It (see Appendix A).
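The distributed-derivative idea can be sketched in a few lines of present-day code. The sketch below is an illustrative assumption, not Model-It's actual implementation (which ran on the Macintosh; see Appendix A): each relationship contributes one independent term to a factor's rate of change, and the engine simply sums the terms.

```python
# Illustrative sketch of a "distributed derivative" (Silvert, 1993b):
# each effect on the fish population is its own simple function, and
# the overall derivative is the sum of the effects. All names and
# coefficients here are hypothetical.

def growth(biomass):
    return 0.10 * biomass            # growth proportional to biomass

def natural_mortality(biomass):
    return -0.03 * biomass           # a separate, independent term

def fishing_mortality(biomass, effort):
    return -0.05 * effort * biomass  # depends on another "object"

def d_biomass_dt(biomass, effort):
    # Summing independent effects means a new term (e.g. migration)
    # can be added later without changing any existing equation.
    effects = [growth(biomass),
               natural_mortality(biomass),
               fishing_mortality(biomass, effort)]
    return sum(effects)

# One Euler step of a simulation:
biomass, effort, dt = 1000.0, 1.0, 1.0
biomass += d_biomass_dt(biomass, effort) * dt
```

The payoff is exactly the modularity Silvert describes: each equation stays simple, and adding a new population's impact never requires editing the equations already in place.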
Providing an object-oriented framework for modeling allows students to think about the phenomena that they are modeling in a more natural way, i.e., matching the interacting objects that they can see in the world with what they see in the modeling environment, instead of having to translate those objects into abstract representations. Thus the object-oriented framework enables the grounding scaffolding strategy, and provides the initial entry point for bridging the learner from concrete to more abstract model representations. It also enables a simpler mechanism for model construction, as the student simply specifies pair-wise relationships between variables, and the underlying simulation engine handles the complexity of combining multiple impacts on the same variable. So not only does the object-oriented technique make sense from the programmer's point of view, it also provides the learner with a straightforward mechanism for dealing with what would otherwise be a difficult, if not intractable, mathematical modeling problem.
Models, especially computer-based models, are typically based on mathematical equations, so that in order to build a model it is first necessary to derive the equations that represent its behavior. This mathematical approach to modeling is taught at the collegiate level; for example, a software package for undergraduate and graduate courses in ecology teaches the quantitative expression of ecological processes by asking students to specify the coefficients to nonlinear differential equations (Ewel, 1989).
There is, however, a great deal to be said for qualitative modeling. Scientists often think qualitatively about a model before quantifying the relationships (White & Fredericksen, 1990). The purpose of modeling is insight not numbers, (Hamming, 1962) and when models are used for speculating about a system, the model may never be used to generate any numerical results at all (Kreutzer, 1986). Recently, much modeling and simulation research has explored applications for qualitative modeling (e.g., Cochran & Paul, 1990; Green, 1990; Guerrin, 1991; Salski, 1992).
Specifically in the domain of ecological modeling, it is often very difficult to obtain precise ecological data, and complex ecological systems are often insufficiently investigated to permit formal numerical reasoning (Karplus, 1983). Qualitative approaches to ecological modeling can permit reasoning about a system without requiring precise data. For example, LARKS, a fuzzy knowledge-based system for ecological research, allowed the definition of linguistic rules based on the natural language that ecologists typically use to describe their knowledge about ecosystems (Salski, 1992). For example:
IF vegetation-height is low and
   population of larks is very high and
   vegetation density is minimally smaller than standard
THEN number of territories is high
Providing qualitative representations for causal relationships allows students to deal with these relationships at the conceptual level, instead of on a level that requires a great deal of technical knowledge (e.g., mathematical formulae). This modeling technique, then, provides an entry point to modeling by grounding the students' early modeling efforts in familiar concepts. Qualitative modeling can also provide a bridge to more quantitative representations, scaffolding the student into representations more like those practiced by professionals.
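A linguistic rule of the kind quoted from LARKS can be approximated with very simple fuzzy-set machinery. The membership functions and values below are invented for illustration; Salski's actual system was far richer:

```python
# Minimal fuzzy-rule sketch. The membership functions (ramps on a
# normalized 0-1 scale) and the input values are hypothetical.

def low(x):
    """Degree (0-1) to which a normalized value counts as 'low'."""
    return max(0.0, min(1.0, 1.0 - x))

def high(x):
    """Degree (0-1) to which a normalized value counts as 'high'."""
    return max(0.0, min(1.0, x))

def rule_number_of_territories(veg_height, lark_pop):
    # IF vegetation-height is low AND population of larks is high
    # THEN number of territories is high.
    # Fuzzy AND is taken as the minimum of the condition degrees,
    # so the rule fires only as strongly as its weakest condition.
    return min(low(veg_height), high(lark_pop))

# Short grass and many larks: the rule fires strongly.
degree = rule_number_of_territories(veg_height=0.2, lark_pop=0.9)
```

The point for learners is the same one the paper makes: the rule is stated in the language ecologists actually use ("low," "high"), and the imprecision is handled by the machinery rather than demanded of the modeler.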
In this section, we describe the Model-It program, and explain how each of the different scaffolding strategies were implemented in Model-It. Table 1 summarizes the three scaffolding strategies and briefly describes how each is implemented.
Scaffolding Strategy | Model-It Implementation |
Grounding in Experience and Prior Knowledge | Pre-defined high-level objects. Digitized, personalized photographs and graphics. Qualitative, verbal representation of relationships. |
Bridging Representations | Textual to graphical representations of relationships. Qualitative to quantitative definition of relationships. Concrete to abstract representations of the model. |
Coupling Actions, Effects, and Understanding | Direct manipulation of factor values while a simulation is running. Immediate, visual feedback of the effect of user's changes in factor values. |
Table 1: Scaffolding strategies and their implementation in Model-It
The primary task of creating a model is to recreate the phenomenon (or some part of it) in such a way that the structure and behavior of the model reflects the phenomenon itself. The objects and relationships that the learner sees and experiences in the world must somehow be re-represented within the modeling environment. To assist the learner in making the transition from what she already knows of the world over to computerized model representations, Model-It provides a set of pre-defined high-level objects (e.g. stream, macroinvertebrate population, golf course) with which she can build a model [1] . These physical objects provide a close conceptual match with the learner's knowledge representation of the domain, in contrast to an expert's knowledge representation which might consist of domain-independent input, output, function and state primitives. In effect, the modeling environment situates the model in the prior knowledge and experience of the learner.
Objects are represented visually with digitized photographs and graphics. Students can create their own objects and paste in their own pictures. Figure 1 shows the Simulation Window of Model-It, with the stream object already in place, and below, a palette of other objects that have been created in this model, and that can be added to the simulation. The stream is represented by a photograph of the actual stream the student measured, collected bugs from, and got her feet wet in. This personalized representation helps to create an authentic context through which the activity has meaning. In this way, the learning environment grounds the learner in her prior knowledge and experience.
[1]Model-It can be used to build a wide range of process flow models; for our preliminary classroom study we chose the domain of stream ecosystems. In our description of the program, we use examples from this domain.
Figure 1: Simulation Window, and Object Palette
Students create or select from a set of objects, and define factors (measurable quantities or values associated with the objects), for example, the total phosphates measured in a stream or the count of a population of bugs. Figure 2 shows the Object Window for the stream object, and the Factor Factory where the stream's phosphate factor is being defined.
Figure 2: Object Window and Factor Factory
Next, the student can define relationships between the factors (how the value of one factor affects the value of another). Model-It supports a qualitative, verbal representation of relationships, rather than requiring formal mathematical expressions. Students can define a relationship simply by selecting descriptors in a sentence, e.g., "As stream phosphate increases, stream quality decreases by less and less" (Figure 3) [1]. This is another example of grounding on a conceptual basis: learners create relationships simply by re-representing them on the screen as English-like sentences (presumably the language of their prior knowledge and experience). This scaffolding is important for learners because their knowledge structures and skills don't initially include the same quantitative command of the concepts that experts would have. Constructing relationships also provides opportunities for students to participate in the community of learners, as they discuss the best representation for a relationship (e.g., by gathering data, consulting experts, finding reference resources, etc.).
Figure 3: Qualitative relationship definition: Text View
[1]Stream quality refers to a standard index called the Water Quality Index (WQI) developed by the National Sanitation Foundation (Mitchell & Stapp, 1994). The WQI is determined by nine tests: dissolved oxygen, fecal coliform, pH, biochemical oxygen demand, temperature, total phosphate, nitrates, turbidity, and total solids. Associated with each test is a weighting curve chart which converts the value of the test into a 0-100 scale Q-value, indicating the impact of that test result on the health of the stream. The WQI is calculated as a weighted average of the Q-values for the nine tests, giving a measure of the overall stream quality.
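The weighted average described in the footnote above can be sketched as follows. The weights shown are the commonly published NSF values, included for illustration only; consult Mitchell & Stapp (1994) for the authoritative figures and weighting curves:

```python
# Sketch of the WQI calculation: each test value is first converted
# (via its weighting curve) to a 0-100 Q-value, and the index is the
# weighted mean of the nine Q-values. Weights are the commonly cited
# NSF values (illustrative; verify against Mitchell & Stapp, 1994).

NSF_WEIGHTS = {
    "dissolved_oxygen": 0.17, "fecal_coliform": 0.16, "pH": 0.11,
    "biochemical_oxygen_demand": 0.11, "temperature": 0.10,
    "total_phosphate": 0.10, "nitrates": 0.10,
    "turbidity": 0.08, "total_solids": 0.07,
}

def wqi(q_values):
    """Weighted average of 0-100 Q-values, keyed by test name."""
    return sum(NSF_WEIGHTS[test] * q for test, q in q_values.items())

# Since the weights sum to 1.0, perfect Q-values of 100 yield WQI 100.
perfect = wqi({test: 100.0 for test in NSF_WEIGHTS})
```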
Model-It also supports a similar qualitative definition of rate relationships which define how one factor sets the rate of change of another factor over time (Figure 4). Appendix A provides a detailed description of the mathematics behind both types of relationships.
Figure 4: Qualitative relationship definition of Rate Relationships
Model-It provides simultaneous, linked textual and graphical representations of relationships. Given a qualitative, textual definition, the software translates the text into a quantitative, visual representation; e.g., "decreases by less and less" is interpreted as shown by the graph in Figure 3. While the learner can easily create relationships grounded in internal understandings using the English-language representation, he is also presented with a corresponding, more abstract mathematical representation. This simultaneous representation establishes a bridge to the more expert-like representations of the modeling community.
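The translation from a qualitative descriptor to a curve can be sketched as a lookup from phrase to function shape. The particular functions below are stand-ins chosen only to have the right qualitative shape; Appendix A gives the actual mathematics Model-It uses:

```python
# Illustrative mapping from qualitative descriptors to curve shapes
# over a normalized 0-1 input. The functions are hypothetical stand-ins
# with the correct qualitative behavior, not Model-It's actual curves.
import math

SHAPES = {
    "increases at a steady rate": lambda x: x,                  # linear
    "increases by more and more": lambda x: x ** 2,             # accelerating
    "increases by less and less": lambda x: math.sqrt(x),       # saturating
    "decreases by less and less": lambda x: 1.0 - math.sqrt(x),
}

def relationship(descriptor, lo=0.0, hi=100.0):
    """Return f mapping a normalized cause value in [0,1] to [lo,hi]."""
    shape = SHAPES[descriptor]
    return lambda x: lo + (hi - lo) * shape(x)

# "As stream phosphate increases, stream quality decreases by less
# and less": quality falls quickly at first, then ever more slowly.
quality = relationship("decreases by less and less")
```

Viewed this way, the text view and the graph view are literally the same object: selecting a descriptor selects a function, and the graph is just that function plotted, which is what makes the two representations stay synchronized.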
The same principle applies to switching from the qualitative text view to a quantitative table view (Figure 5), in which the textual representation is re-represented as a table, which can then be tailored to a more accurate representation and understanding of the relationship.
Figure 5: Quantitative relationship definition: Table View
Model-It also provides bridging from concrete to abstract representations of the model. While the Simulation Window (Figure 1) provides a concrete, semi-realistic representation of the objects being modeled, the Factor Map presents a more structural and relational (and more abstract) representation of the model (Figure 6).
Figure 6: Visualizing abstract structure: Factor Map
Factors are represented by icons which include the object's picture; this provides a bridge between representations. The Factor Map helps students construct mental representations of the system they are modeling by providing a way to visualize the relational network of factors and relationships. This view is interactive: students can rearrange the nodes in a visually meaningful way and make changes (e.g., drawing an arrow to create a new relationship).
Once objects, factors, and relationships have been defined, the student can run simulations with his/her model (Figure 7). The student selects factors to view during a simulation using meters (vertical indicators for dependent factors, controls for independent factors) and graphs (displays of factor values as they change over time). During a simulation, these meters and graphs provide immediate, visual feedback of the current state of the simulation. Students can directly manipulate current factor values even while the model is running, and immediately see the impact. The learner both provides and experiences interactive feedback with the model. "What if?" questions are generated and answered nearly simultaneously; hypotheses can be tested and predictions verified within moments. This differs from other modeling environments such as STELLA, in which the user must schedule parameter changes as a batch, or in which the results from the simulation run are reported as a table of numbers or a printed graph (Costanza, 1987). This interactivity provides opportunities for students to refine and revise their mental models, by comparing the interactive feedback they initiate and receive with the feedback they expected to receive. It also supports students with low motivation and short attention spans, and provides opportunities for engagement for students who would otherwise be uninvolved.
Figure 7: Running a simulation
All of the interface components of Model-It are implemented through the graphical user interface (GUI) of the Macintosh. We use GUI components like windows, lists, pop-up menus, buttons, sliders, and editable text boxes. Sliders can be used to set initial values and change values while the model runs; new factors are automatically entered into lists and pop-up menus, object pictures can be cut and pasted, etc. In addition, the positioning of pop-up menus is carefully chosen, particularly in the Relationship Maker window, in which the menus are part of the sentence at the top of the screen that defines which factor affects which other, and the sentence going down the right of the screen that defines which qualitative relationship to use. Instead of having to remember and/or type in the names of factors repeatedly, students can quickly pop up menus to find what they're looking for. Instead of having to laboriously type in data points, they can quickly define a relationship's graph by clicking on the graph itself (Figure 5). We feel that the GUI components contribute significantly to the usability, learnability, and flexibility of Model-It.
Model-It is designed to be used within an authentic, project-based science classroom (Marx et al., 1994), a paradigm of science instruction that closely follows the learning theory discussed in this article. We are working with science teachers at a local public alternative high school who are developing a new project-based curriculum called Foundations of Science, in which computing technologies are routinely used, and the subject matter of earth science, chemistry, and biology is combined within the context of meaningful, long-term projects. The high school is alternative in the sense that community-based and innovative instructional techniques are encouraged, and that students must apply and be accepted in order to attend. The students in our studies are generally white, primarily middle- to upper-middle socioeconomic class, and of average to above average ability.
The ninth and tenth grade students who are taking this class have been engaged in a long-term project investigating the question How safe is our water? Specifically, they are studying a tributary of the Huron River that flows near the school, collecting a variety of data to determine the quality of the water. Since this water eventually ends up in their drinking fountains, the question is motivating and personally meaningful to the students. Their project investigations have also included using various technologies to conduct and report detailed biological, physical, and chemical assessments.
Model-It has been used three times with a Foundations of Science class of 22 students. First, we pilot tested the software with six ninth grade students from the class; we then used the software twice in the classroom, once as the ninth grade final project, and again by the same students early in the fall of their tenth grade. Table 2 describes the dates, students, time frame, and data collected for these studies to date.
Dates | Study | Students and Time Frame | Data Collected |
3/94 | Study 1, pilot | 6 selected ninth grade students (2 pairs, 2 individuals), 1.5 to 2 hrs. each | Video/audiotape of training and practice sessions; models |
6/94 | Study 2, classroom | 22 ninth grade students (working in pairs) for four 50 minute class periods. | Video/audiotape of students using Model-It; models; log files |
9/94 | Study 3, classroom | 22 tenth grade students (same students, working in pairs) for over one week. | Same as above, plus post-interviews |
Table 2: Model-It Studies, Subjects, and Data
In each case, the students used Model-It to construct and test models of the stream ecosystem that they had been studying for months. They collaborated with partners on open-ended projects in which they built models of their own design to represent their choice of particular stream phenomena, for example, land use practices (the impact of man-made structures such as golf courses or parking lots on stream quality) or cultural eutrophication and algae blooms. The Factor Map, as described above, had not yet been implemented for these three studies; however, students were encouraged to draw conceptual maps representing their models to help them visualize the overall structure of their model. A more detailed description of each study follows.
For the pilot study, we worked with six students individually or in pairs, for 1.5 to 2 hours each. We first asked the students to brainstorm about the objects, factors, and relationships in a stream ecosystem with which they were familiar. Then we briefly demonstrated the program to the students by showing them how to build and test a few simple relationships, and finally (for the majority of the session) suggested that they add onto the model, based on what they already knew about stream ecosystems. A researcher sat with the students to answer questions, to prompt the students to talk about what they were doing, to take notes, etc. Our main focus was to evaluate Model-It's learnability and to identify potential sources of confusion for students.
In the first classroom testing of Model-It, 22 students used the program for four class periods of 50 minutes each. The students used a guide during the first three days, to help them learn to use the program and make models. Working through the guide in groups of two, students created models with qualitative immediate and rate relationships, made predictions about the behavior of the resulting models, explored their model's behavior by manipulating independent factors, and wrote explanations about the behavior they observed. On the fourth day, they constructed a model of one of several proposed scenarios: benthic macroinvertebrates as indicators of water quality; stream phosphate and algae blooms; land use practices and their impact on the stream; or an open-ended design of their own choice. During the fifth and final class period, the students discussed their models with the class and with each other.
In the second classroom testing, the same 22 students used another prepared guide designed to introduce them to some added functionality of Model-It (specifically, the run-time graphs), and to give them some experience with creating and using population objects and setting up predator/prey models. All students worked for two days on the prepared guide, in groups of two, and then four groups of two and one group of three used Model-It for several more days to construct models of their stream's quality and how it had changed since last year.
Our method and data analysis to date have been formative; in analyzing the various data sources (models, tapes, interviews, and log files), we have attempted to identify themes related to models and modeling, and illustrations of scaffolding in action. These were our research questions:
Models: Did the students actually build reasonable models?
Modeling: Did the process of building and testing models help the students develop their understanding of complex systems?
Scaffolding: How did the scaffolding strategies support students in their modeling activities?
Did the students actually build reasonable models? We begin with an evaluation of the models that the students actually created with Model-It. For all three studies, we looked to see if the models that students created were generally reasonable, that is, if the relationships defined were appropriate, and if the model worked to illustrate the intended scientific phenomena. For Study 2, we also conducted a more formal evaluation, judging qualitatively for accuracy, complexity, and completeness.
To give the reader an idea of what kinds of models were constructed, Figures 8, 9, and 10 show Factor Map representations of example models from the three studies. In general, we found that students built reasonable models in a short amount of time.
Figure 8 shows a typical model from the pilot testing, in which students were shown how to create a few relationships (oxygen to quality, and phosphate to quality), and then invited to use the stream and other objects, and try building any other factors and relationships that occurred to them. The student who built the model in Figure 8 started with some chemical factors of the stream, then showed how they affected various populations of organisms that live in the stream (bacteria, mayflies, and midge flies). She created and tested this model in about an hour. There are a few errors (e.g., the relationship from midge fly count to stream quality is mistaken: macroinvertebrate counts are indicators of water quality, not components of it); overall, however, the model is sound.
Figure 8: Factor Map representation of a model created by a student in Study 1.
Figure 9 shows a model from the first classroom testing, on the fourth day of the study (one 50 minute class period), when students were asked to represent one of the proposed scenarios. The pair of students who created this model chose to represent the impact of a golf course on a stream ecosystem, indicating both the chemical impact of fertilizers used on the golf course, which can be washed into the stream (the factor size indicates that the larger the golf course, the greater its effect), and the fecal matter deposited by the geese that like to live on golf courses. They went on to show how this impact would affect the population of mayflies in the stream.
Figure 9: Factor Map representation of a model created by a pair of students in Study 2.
We conducted a more formal assessment of the models in this study, since we had a good sample size (twelve), and the models were the most varied and interesting. We evaluated the models and compared them with appropriate scientific models, judging qualitatively for accuracy and completeness on a four-point scale from excellent to poor: excellent (4), accurately represented the modeling task, without errors or misconceptions; good (3), mostly accurate, but incomplete; fair (2), some reasonable relationships, but incomplete or in error; poor (1), no accurate representations. The model in Figure 9 was judged an excellent model, because the relationships were reasonable and clearly demonstrated that group's chosen task.
Overall, we found that the models students created were accurate and reasonable. According to our assessment criteria, two-thirds of the groups in the class created reasonably good quality models (quality rating 2.5 or above). The average rated model quality was 2.6 with a low of 1.5 and a high of 4. Most groups were able to set up at least three reasonable relationships in their model. Some models contained errors in the way relationships were defined. For example, one common error was to confuse a population's rate of growth with its count. A few had relationships that were backwards, contradictory, or that made no sense; however, for the most part, students were successful in their model creation efforts.
Figure 10 shows a typical model from the second classroom testing, in which students spent several days on the teacher-assigned task of building a model of stream quality, its effect on various populations of macroinvertebrates, and an outside factor that would account for differences in measurements from last year to this year. Here, rain was defined as the outside factor, and the students indicated how the rain would cause more pollutants to be washed into the stream, thereby lowering water quality. This task was more constrained than the previous ones, so all the groups' models looked fairly similar and were similarly reasonable. The main variation between models was the accuracy with which students created the relationships to stream quality; groups that were most concerned with accuracy used the quantitative Table View in order to construct the relationships' graphs more precisely.
Figure 10: Factor Map representation of a model created by a pair of students in Study 3.
We examined the videotapes and interview transcripts for scenarios that were good examples of typical model building and testing activities, or in which a particular scaffolding strategy seemed to be particularly salient in the students' conversations and actions with the computer. We listened to what they were talking about and watched what they were doing as they constructed a model, watched and listened as they tested their models, and considered the supportive role of the computer in those activities.
We report our results as an illustrated commentary of students creating models in a scaffolded learning environment, and draw freely from all three studies for our analysis (indicating the source for each illustration in parentheses). Students are labeled with letters without regard to actual identity (however, gender identity is preserved). For each results section, we state the research question, some answers to the question, and then examples.
Did the process of building and testing models help the students develop their understanding of complex systems? We find evidence that building models leads to refining and articulating understanding, that building and testing leads to inspiration for model extension, that testing activities lead to the discovery of flaws in models or suggest refinements, and that building their own models is motivating for students. The following four examples illustrate these findings.
During the process of constructing models by implementing factors and relationships, students often found that they had to refine their understanding of a phenomenon in order to represent it. They might at first say just that X is related to Y, but the act of defining that relationship required its further articulation. To refine their understanding, students engaged in thoughtful discussion, and referred to the field manual to learn more about the phenomena they were trying to model.
Now I need to get oxygen in there. [looks in manual] Sunlight affects oxygen. Phosphates affects oxygen. 'Cause phosphates make plants grow out of control, and then they eat all the oxygen. I think. [reading from total phosphate description in manual] Also decreases the number of pollution... Okay. The dissolved oxygen level goes down as phosphates go up. So. (Study 1)
Given an open-ended project, we also saw that the process of building and testing a model inspired students to extend their model. For example, when a student and her partner lowered the phosphate to 0 while testing their model, she commented that the change in phosphate should affect all living things in their model, so she suggested that they add one of the macroinvertebrate populations to represent that phenomenon. This pattern is also illustrated in Study 2, when two students had been testing their model and, with several meters still displayed on the screen, student A suddenly had a little brainstorm in which she expressed a series of ideas about several more factors and relationships that could be incorporated into their model:
A: So we could make a direct relationship thing. We could put, we could add insects, and then we could make a relationship between phosphate and insects. ... Yeah! and then the more insects the better the stream quality, I suppose we could put, yeah.
B: It depends on what taxa.
Both examples illustrate that when students are engaged in building and testing a model, we see them coming up with ideas as to how their model can be extended or improved.
Data on students' testing strategies suggests that testing activities often led to possible refinements or to the discovery of flaws in their model. For example, one group had erroneously defined factors (bacteria and algae) that were already pre-defined as objects. When they tested their model, they observed behavior that didn't agree with how they thought it should work, and consequently discovered their error. Another group, while testing their model, realized that it would make sense to link two of their unrelated factors together by creating another relationship, so they went back and defined the new relationship. In both of these examples, the students discovered flaws or found areas for improvement while they were testing their model.
The open-ended modeling task assigned to the students gave them the flexibility to branch off and explore different topics, and to express their own understanding of various phenomena. For example, to demonstrate land use impacts, students C and D chose to put the golf course object into their model, and show how factors of the golf course might affect the stream and the organisms living in it:
D: Let's use that one.
C: The golf course?
D: Yeah, we haven't used that one yet.
C: How the golf course affects what, though?
D: How the golf course affects, um, bacteria.
C: Too hard.
D: It's easy. Because the golf course, a lot of geese are on the golf course, and the geese feces go in the water.
C: Oh, and it affects fecal coliform
D: Which in turn affects the bacteria, and the fecal coliform grows on bacteria.
C: Okay, where do you want the golf course?
D: Right there.
This opportunity to build their own models was extremely motivating for these students; they displayed excitement and enthusiasm for the project throughout the class period. Once they had completed their initial goal of representing the golf course impact, they branched out on their own to create more relationships, from the stream quality to the mayfly population. They expressed pride in their model, and called the teacher over to show it off to her. (Figure 9, above, shows the factor map of their final model.) The next day, in class discussion, they proudly described how their model worked:
[Teacher draws a map of their model as they talk]
C: The size of the golf course affected the geese, the number of geese...
D: The more land there is the more geese... And the more geese the more fecal coliform.
C: The golf course size affected nitrates and phosphates...because the bigger golf course has more fertilizer and fertilizer has nitrates and phosphates in it.
Teacher: Do you have any [relationships] going to quality?
C: Well I'm getting there, okay? This is complicated! Okay, fecal coliform goes to quality, phosphate goes to quality, nitrate goes to quality... And then the quality went to rate of growth.
Teacher: Why?
C: Because the better quality...
D: There is the more mayflies can grow. And then the growth went to count and the decay went to the count.
How did the scaffolding strategies support students in their modeling activities? Here we present a number of examples to illustrate how we saw the scaffolding strategies of grounding, bridging, and coupling supporting the students' modeling efforts in various ways. Each scaffolding strategy is discussed in terms of its Model-It implementations.
The digitized photographs of a familiar place and the graphics within the Model-It learning environment seemed to provide a grounding around which students could talk and think about the real stream ecosystem. Seeing real pictures seemed to help them think of relationships they could model. We also saw a number of instances of the students wanting to place objects in a visually appealing and realistic way. In the following three examples, we show how the digitized photographs and graphics help ground the students in their prior knowledge and experience.
Example 1: (Study 1) Students identified with the photorealistic graphics, and seeing those pictures seemed to inspire them to try modeling other things they already knew about. For example, one student saw the stream photo, and decided to build a model of "how our stream was when we tested it." She then proceeded to set the factors to the values she remembered from those real-world tests to see what would happen in her model. Her decisions about what modeling activities to pursue were grounded in her own prior knowledge of stream factors.
Example 2: (Study 1) Here's one more example of how the photorealistic graphics provide a grounding in prior knowledge. This student referred to the stream photo while brainstorming:
...Or since this one's [stream] by a road, roads can have pollution, they have things from the car like oil, lots of salt from the road can get in, and that can change what's in it,...
Example 3: (Study 2) In this conversation, students E and F are beginning a new model. They are manipulating the objects in the environment, placing them in visually appealing places, and commenting on the appearance once placed. Notice that student F comments that she wanted the object placed where it looks real, and then said the placement was pretty good. Student E had some fun placing the parking lot object in the portion of the stream picture where there was water.
E: [clicks on drainpipe object, moves cursor around on stream ecosystem picture]
F: Put it somewhere really nice ... so it looks real.
E: [still moving cursor around]
F: Where we had it before, click like right there, see what it looks like.
E: [Clicks on the right-hand side of the picture]
F: That looks pretty good.
E: I've always wanted to do this. [selects the parking lot object and places it in the water area of the picture]
F: Why?
E: Right in the middle of the stream. That's funny!
The video/audio data of other students showed a number of instances of students placing, removing, and re-placing objects in slightly different positions, often commenting on whether one place was better than another. This example indicates how closely students identify with things they see in the Model-It environment: the photorealistic images and graphics they see are grounded in their own experiences and observations.
The qualitative representation of relationships in the Model-It environment allowed all students the opportunity to construct relationships, because representing a relationship qualitatively is much closer to the way students seem to naturally think and express themselves than quantitatively or mathematically. Students seem comfortable expressing themselves qualitatively. The qualitative Relationship Maker helps them think about defining relationships and enables them to construct complex relationships quickly.
Example 1: (Study 2) In this example, the students find that reading the qualitative sentence verbally out loud helps them realize a mistake and think of a better way to define their factors. They also seem to think they understand what factors are for a little better. Students C and D have previously defined a golf course factor of the Golf course object, a definition that probably doesn't make sense. Here they are looking at the Relationship Maker view, and constructing a rate relationship between Golf course:golf course and Bacteria:count.
C: [reading from the screen] At each time step, add golf course to bacteria count
[pauses and moves cursor back and forth, tracing over the words Golf course:golf course]
C: Wait but it should say like golf course .. `size' or something. Wait we did that one thing wrong.
D: Yeah. [they go back to the Factor Factory and change the name of the `Golf course: golf course' factor to `Golf course: size']
C: [now in the Object Editor] So the object is a golf course, the factor of the golf course [pointing with the cursor to the word `size'] ... see, we could have different factors, like we could have `number of geese.'
Student C, reading out loud from the screen, suddenly realizes that the factor is not the object itself, but is something related to the object. He may not have reached this realization as quickly had he not been reading the qualitative, verbal representation of the relationship on the screen. Not only does he find his error, but later on he shows a better understanding of factors by proposing another golf course factor: geese.
Example 2: (Study 2) In this example, we see students C and D each contributing ideas to the model they're building, ideas that are incorporated into the model in short order. They seem to be comfortable expressing themselves qualitatively, and, using the qualitative definition of relationships, they were able to build complex relationships very quickly.
C: As geese increases fecal coliform increases at about the same. [saves geese -> fecal coliform relationship] And then if we want, do you want, it won't take long to put in nitrates.
D: Okay.
C: We can add that in. [closes Relationship Maker]
D: Cause that's part of fertilizer...
C: Cause that's part of fertilizer, yeah. So we go to stream [opens Factor Factory] okay...let's see...nitrates N I T nitrates. [types name into Factor Factory]
D: Lesser and lesser.
Within the next 2 minutes they proceeded to construct a relationship from `Golf course: size' to `Stream: nitrates' and one from `Stream: nitrates' to `Stream: quality.' This was accomplished by opening only 2 windows, pulling down 6 menus, and pressing 6 buttons. The relationships they understood from their prior investigations in the stream seem to translate easily and quickly into the qualitative representations in the modeling environment.
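The rate relationship the students read aloud earlier ("At each time step, add golf course to bacteria count") can be sketched as a simple accumulation loop. The sketch below is an illustrative toy, not Model-It's actual simulation engine; the geese-to-coliform pairing comes from the students' model, but the function name and the scaling constant are our assumptions.

```python
def run_rate_model(geese, steps, scale=0.1):
    """Toy accumulator: at each time step, add (scaled) geese to fecal coliform.

    The scale constant is a made-up normalization, not a Model-It value.
    """
    fecal_coliform = 0.0
    history = []
    for _ in range(steps):
        # Rate relationship: add a contribution at every time step
        fecal_coliform += geese * scale
        history.append(fecal_coliform)
    return history
```

Unlike an immediate relationship, which recomputes the target factor from the current input, a rate relationship accumulates over the run: doubling the geese doubles the coliform level reached after the same number of steps.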
The pre-defined high-level objects in Model-It provided students with simple and accessible manipulatives, as opposed to high-level language commands. Students had little trouble learning the object-oriented environment and the high-level primitives of objects, factors, and relationships; in fact, the object-oriented representation seemed to correspond well with the way students expressed their understanding and with what they already knew.
Example 1: (Study 1) Talking about objects, factors, and relationships seems to come naturally for students. In the preliminary study, in the brainstorming session before they had ever been introduced to the program, students could list factors of the stream (oxygen, total solids, insects, all those nine chemical tests) and describe relationships: colder water has more oxygen, and warmer water has less oxygen. We see then that the modeling vocabulary used in Model-It is grounded in what is already familiar to students, and thus serves as a scaffold for modeling.
Example 2: (Study 2) Students referred to multiple causal relationships in various ways, some as `chains,' others as `hooks.' Here students E and F are working on their model (drainpipe discharge affects stream phosphate which affects stream quality), and are in the process of writing down an explanation. Student E talks through a causal relationship, suddenly realizing that all of them are related causally.
E: High growth rate is caused by phosphorus which increases plant life which decays and they feed on it.
F: Yeah.
E: Whoa! We got a whole chain here.
F: That's it! That's it!
The student's reference to a chain is an indication that she is thinking about the objects in the stream as things that can be physically connected. She related her perception of the multiple relationships to her prior experience with chains and links, which gave her a way of thinking about Model-It relationships.
Providing both the text and graph views of absolute relationships can scaffold learning; we saw students using the graphical representation to interpret the text. This helped them connect unfamiliar representations with familiar ones, helping them make sense of the unfamiliar.
Example: (Study 1) This student, thinking out loud, is trying to decide which relationship to choose between two factors. She initially tries to think it through using the qualitative words, but, finding them inadequate, turns to the more quantitative graph at the right of the Relationship Maker. This helps her verbalize the relationship and make a decision.
Should we say more and more. Cause it gets more... I don't really understand that one. More and more. Should we say more and more? Well, let's look at the graph thing. If the stream temperature is like, really high, then it [oxygen] starts going down.
The simultaneous representation of the relationship both as a verbal sentence and a mathematical graph forms a bridge between a representation with which the student is less familiar, and one with which she is more familiar. In this case, the student used the graphical relationship to confirm her thinking that more and more might be an appropriate qualitative definition for the relationship.
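The bridge between the verbal sentence and the graph can be sketched as a mapping from qualitative descriptors to curve shapes over normalized factor values. The phrase-to-curve pairings and function names below are illustrative guesses, not Model-It's actual definitions; they only show how "more and more" could correspond to an accelerating graph like the one the student inspected.

```python
# Hypothetical mapping from qualitative descriptors to normalized curves.
QUALITATIVE_SHAPES = {
    "about the same": lambda x: x,        # straight line
    "more and more": lambda x: x ** 2,    # accelerating change
    "less and less": lambda x: x ** 0.5,  # decelerating change
}

def immediate_relationship(direction, shape, x):
    """Map a normalized input factor x in [0, 1] to a normalized output."""
    y = QUALITATIVE_SHAPES[shape](x)
    return 1.0 - y if direction == "decreases" else y

# "As stream temperature increases, dissolved oxygen decreases more and more":
# oxygen stays high at low temperatures, then drops off steeply.
oxygen = immediate_relationship("decreases", "more and more", 0.9)
```

Reading either representation answers the student's question: the sentence names the shape, and the curve shows where the change is gentle and where it is steep.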
The qualitative and quantitative definitions of relationships helped students make connections between commonsense and more abstract ideas. Students were able to use whichever representation suited their abilities or goals. When the verbal representation suited them, they used it to define their relationships; where they perceived it to be inadequate to express their understanding of the relationship (particularly as it related to accuracy), they used the more quantitative table view.
Example 1: (Study 2) In this study, students were not given any instructions in the guide as to how to make quantitative relationships with the table view; consequently, most constructed their models with the qualitative text view exclusively. However, one student was dissatisfied with the levels of accuracy possible with the text view (since he was trying to re-create graphical relationships pictured in his water quality manual), and, on his own, discovered the table view. With this more quantitative representation, he created relationships that more closely matched his understanding of those relationships.
Example 2: (Study 3) The same students who mostly used the qualitative relationships in Study 2 (Example 1 above) transitioned to the quantitative relationships in Study 3 in order to gain accuracy with their models. Students found it easy to use: "You just entered in the numbers that were in the book, and it just made the graph for you." In the follow-up interviews several groups of students said that they used the table view graphs to make the quality, as one student put it, "very, very accurate." Other graphs, by contrast, were "not definitely accurate but .. close enough." In other words, the table view allowed them to easily improve the quantitative accuracy of their models, while they also recognized its limitations.
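The table view behavior the students describe, typing in numbers from the book and having the graph drawn for them, can be approximated by piecewise-linear interpolation over the entered data points. This is a sketch under that assumption; the phosphate-to-quality numbers below are invented for illustration and are not taken from the students' manual.

```python
def interpolate(points, x):
    """Piecewise-linear lookup over sorted (x, y) pairs, as a table-defined
    relationship graph might evaluate an input factor value."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("x outside table range")

# Hypothetical phosphate (mg/L) -> quality-index table entered by a student
table = [(0.0, 100.0), (0.5, 60.0), (2.0, 20.0)]
quality = interpolate(table, 0.25)  # falls halfway between 100 and 60
```

Each typed-in row pins down one point on the graph, so a handful of values from the book is enough to make the relationship "very, very accurate" relative to a purely qualitative shape.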
Providing students with both concrete and abstract representations of models gives learners different ways to think about their model and helps them transition from novice to expert representations. Students sometimes arranged meters on the screen to represent the underlying structure of the model. Instead of placing meters randomly on the screen, they placed them in a (left-to-right) order that indicated how objects to the left caused changes in objects to the right.
Example: (Study 2) In this example, the initial random arrangement of meters on the screen didn't match this student's mental representation. She decided to try to create a closer match by rearranging them in the order of causality which, in effect, became for her an abstract representation of the model. Here, she has 6 meters up on the screen, randomly placed (oxygen, fecal coliform, bacteria, quality, algae and phosphate, placed somewhat lower than the others). She moves the phosphate meter up to the same level as the others. Then,
We should have this in the order that it goes. [begins moving meters] Drainpipe discharge [moves discharge to the far left position] Discharge affects phosphorus [moves phosphorus to right of discharge] Phosphorus which affects algae [moves algae to right of phosphorus] which affects bacteria [moves bacteria to right of algae] which affects oxygen [moves oxygen to right of bacteria] which affects .. we don't need fecal coliform for this.
While the student had already constructed the model, and probably knew what relationships she had built into it, she still discovered that she could use the arrangement of the meters to visually represent the structure of the model on the screen. The next logical step for this scaffolding strategy is to represent the model in a manipulable iconic form, i.e., the Factor Map, which has since been implemented.
The opportunity to manipulate the factor values directly while the model is running provides a link between the learner's cognitions and the model. Meters provide a speedy way to test out a hypothesis; students used information from the meters to explore how their model worked and to verify its operation, and also to construct new knowledge about it. We also saw that the visual feedback allowed students to evaluate their model and compare its behavior with their expectation or prediction of how the model should behave; in essence, they can compare the behavior of their Model-It model with their mental model.
Example 1: (Study 2) Sometimes manipulating a meter in an interactive testing process helped students to understand the model's behavior in a way that went beyond just understanding the individual relationships. In this example, two students have just been testing their stream quality model, watching how the quality affects the population of the mayfly larvae. This student is slowly increasing the size of his golf course (larger golf courses put more fertilizer in the stream which lowers the quality which adversely affects the mayfly population). He has an idea that the population of mayflies should thrive, until the golf course reaches a certain larger size, above which the mayflies will die off:
[The Golf course size is very small, and he increases it to 38 (on a `scale' of 1 to 100 acres). Various factors in the stream increase or decrease, but the mayfly population meter indicator stays high.] The mayfly count still goes up. You have to have a golf course over that many, over 50 acres [as he increases it to 51]. [The stream quality drops further, and the mayfly count immediately starts decreasing as the model runs.] Jeez, look at it go!
By manipulating the model he created in the Model-It environment, in just the space of a few moments he was able to form and verify a hypothesis that there existed a critical golf course size, above which the impact upon the mayfly population would be much larger.
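The student's testing strategy, nudging the golf course size upward until the mayfly trend flips, amounts to a threshold search over the model. The toy relationship below is a stand-in we constructed, not the students' actual model: the slope and cutoff are chosen so the sketch lands near the 50-acre threshold he observed.

```python
def mayfly_trend(golf_course_acres):
    """Toy stand-in for the students' chain: bigger course -> more
    fertilizer washed in -> lower stream quality -> mayflies decline.
    The 1.5 slope and 25-point cutoff are invented for illustration."""
    quality = max(0.0, 100.0 - 1.5 * golf_course_acres)
    return "growing" if quality > 25.0 else "declining"

# Sweep the sizes the student tried with the slider, 1 to 100 acres,
# and find the first size at which the population turns around.
threshold = next(a for a in range(1, 101) if mayfly_trend(a) == "declining")
```

What took the student a few slider drags is, in effect, this sweep: below the threshold the mayfly count keeps rising, and just above it the count starts to fall.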
Example 2: (Study 2) Manipulating meters and receiving immediate feedback led to students being able to verify their models very easily. In this example, two students have been creating relationships between drainpipe discharge, phosphate levels, and stream quality. Here is one student's interaction with the program, as she verifies that the model works as she thought it should:
See, look, watch. Now if I move this down, this will go down, and that will go up. [discharge and phosphate meters go down, stream quality goes up] Now if I move this up, this will go up and that will go down. [discharge and phosphate meters go up, stream quality goes down]
This short episode embodies all of the possibilities of Model-It. She phrases her statements in the form of predictions; using the meters provided by Model-It, she generates and tests four simple predictions as to how her model will work, and, receiving immediate feedback, confirms not only that her predictions were correct but that her model was working according to her own mental model.
Example 3: (Study 2) The students use meters and graphs in various ways to understand how their model works. Students G and H have already figured out that their model properly implements phosphate's and fecal coliform's effects on stream quality. They remove one meter (stream quality) in order to check other relationships and gain a better understanding of how their model works.
[Phosphate, quality, oxygen, and fecal coliform meters are showing on the screen.]
G: Say, let's close stream quality.
H: Um hm.
G: [The student removes the quality meter.] Now let's see what happens. It's running. It doesn't have any effect on them.
H: It does! Well, let's see, fecal coliform...
G: It's true because we're trying to see if it has an effect on these and it doesn't have any effect on this, see? [student displays slider for fecal coliform, and moves it up and down. None of the meters change]
H: You're right, you're right, it has no effect whatsoever. ... This isn't a very complex model, so far, because we have no other relationships besides to quality.
Simply by removing one meter from the screen and running one additional test with the meter, these students accomplished several things. First, they were able to focus their attention on three particular factors. Second, they were able to obtain information that allowed them to find out how one of those factors affected the others (and in fact, they found out that it didn't).
The real-time, visual feedback provided by the Model-It meters while the learner changes values provides a scaffold that allows the learner to visualize the behavior of the model as it runs. Often, in the process of testing their model, students were led to further exploration and expansion, which could be implemented quickly; the testing process revealed students' understanding of the behavior of the phenomenon in real life.
Example 1: (Study 1) In this example we see a student talking out loud while watching the meters as her model runs. (Figure 8, above, shows a representation of her model.) She's comparing what she sees happening on the meters with what she thinks should be happening, and evaluating the level of each factor as she reads it off the meter. She seems able to explain her reasoning as to why a particular factor is at a good, bad, or okay level. Implicit in her remarks is a stamp of approval on the whole model she has built; she's obviously happy with it. She explains what she sees on the meters as she runs her model:
The fecal coliform is at an okay level, so the oxygen is at an okay level. Oxygen's actually at a really good level. And that means the mayflies can thrive in it, so the mayflies are just having a picnic, and they're all growing. The stream quality is really good because there's lots of mayflies,[...] The midge fly count is 0 which is really good because midge flies show bad conditions. Bacteria count is going down to 0 and that's good, because, you don't need it, well, I mean, you need bacteria, but this is sort of an extreme. And the stream phosphate is down a lot, it's down to a decent level.
The ease with which she can play what if games emboldens her. Knowing that there's no real cost, she describes several experiments that would be impossible in real life, experiments that could be dangerous:
And then, this'll be fun, I know, I want to start playing around, and you can make just the most dangerous thing in the world, I mean just like, tons of midge flies, and lots of runoff, and... this'll be just totally dangerous.
So the real-time visual feedback is a scaffold for visualizing the model's behavior, and it also provides opportunities to experiment with what-if situations, even implausible ones.
Example 2: (Study 2) The real-time visual feedback allowed model testing and expansion to proceed without interruption, and stimulated model expansion. In this example, these students are testing a partially complete model. During their testing, they used the meters to try different values of golf course size, and in the process realized that the size of the golf course should also affect the number of geese on the golf course. So the testing process with immediate feedback helped them think of additional relationships that could be constructed. One says,
So, golf course size affects golf course geese. Yeah, we can do it. As golf course size increases, geese increases by about the same. [they subsequently put the relationship `Golf course: size' affects `Golf course: geese']
Example 3: (Study 3) At times the interactive nature of the meters and graphs resonated so strongly with the students' own experiences that they were immediately able to relate what they saw on the screen to their real-life knowledge. Students I and J were working through the guide, exploring rabbit population growth.
I: [reading from guide] `Start the simulation and raise the rate of growth to point 6.' OK
J: Go.
I: Wait. OK. Start. And I want it to be ... point 6. [student moves slider on Rabbit rate of growth to 0.6]
J: Yes. [as the model runs, both watch the graph of the Rabbit count go up sharply]
I: Whoa the rabbit count is--whoa they're multiplying like rabbits!
J: Ha ha.
The feedback they receive from their action is immediately related back to prior knowledge about how rabbits multiply in the `real world.'
Since this data was collected, several new features have been implemented that should increase the impact of the scaffolding strategies. Students can now create their own objects with pictures and graphics of their own, including replacing the stream picture and creating population objects. Also, the Factor Map has been completely implemented. We completed a new study of Model-It in January 1995, with these new features in place, and we are eager to see how they contribute to the effectiveness of the software scaffolding.
Other future changes to the program have been suggested by the classroom testing. For example, the data showed that frequent and iterative testing of models as they were built tended to result in better models. We therefore intend to redesign the interface to guide the learner towards these testing strategies, either by providing prompts to test each relationship as it is created or by making the building and testing cycle more explicit. We also noticed that predicting model behavior was useful for students, yet they often didn't make predictions before testing, so we intend to scaffold students in making predictions about their model, and in analyzing and explaining those predictions after testing. Other future goals include building in checks for common student errors, and supporting a wider range of modeling activities, from an even more qualitative starting point to more advanced types of relationships beyond the current level. In addition, we would like to support scientific argumentation by implementing ways for learners to keep track of model runs, both as illustrations of their model's behavior and for use in presentations.
Finally, there are more long-term goals for research with the software, in which we consider other tasks that the program might support. For example, we are looking into networking the software so that students can interact and share their models. Students might even work on different sections of the same model, so that their changes affect each other's model. For example, if the students upstream add pollution to the stream, the students downstream will see their organisms dying off. We also plan to integrate Model-It with real-world data, so that students can generate realistic relationships from real world data, and can validate their models by comparing simulation results with data from the real world.
Our research has shown that with Model-It, students can create working models of complex phenomena that they have observed, a task which is usually inaccessible to learners in high school science classrooms. Influenced by current trends in scientific ecosystem modeling, Model-It redefines modeling as an object-oriented, initially qualitative task. The students therefore spent minimal time on the mechanics of programming a model, and instead used their cognitive energies to concentrate on thinking about and modeling the phenomena. We have seen the value of this model building activity in helping students construct, test, and refine their understanding of complex systems, just as model building helps scientists test theories and explore their ideas.
Furthermore, with regard to research and development of interactive learning environments, this study demonstrates scaffolding strategies for learner-centered design, as applied to pre-college modeling software, that seem to hold promise for the design of educational software in general. The grounding strategy makes the program accessible by situating activities in learners' prior knowledge and experience. The bridging strategy, providing multiple graphical and textual representations at increasing levels of abstraction, helps learners make connections between what they know and what they ought to know in order to make progress toward more abstract modeling. The coupling strategy gives learners an interactive handle into their mental model of a phenomenon, allowing them to quickly generate and answer what if questions, make and test predictions, and run and revise both the model artifacts in the Model-It environment and their own mental models.
We would like to extend our great appreciation to the other members of the Highly Interactive Computing (HI-C) research group, and the teachers and students at Community High School in Ann Arbor, for their feedback and support. This research has been supported by the National Science Foundation (RED 9353481) and the University of Michigan.
American Association for the Advancement of Science (1993). Benchmarks for Science Literacy. New York, NY: Oxford University Press.
Blumenfeld, P. C., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26(3 & 4), 369-398.
Bobrow, D. G. (1984). Qualitative Reasoning about Physical Systems: An Introduction, Artificial Intelligence, 24:1-6
Brown, J., Collins, A., Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Clement, J., Brown, D. E., & Zietsman, A. (1989). Not all preconceptions are misconceptions: finding 'anchoring conceptions' for grounding instruction in students' intuitions. International Journal of Science Education, 11(Special Issue), 554-565.
Cochran, J. K., & Paul, B. K. (1990). QUAL: A microcomputer system for qualitative simulation. Simulation, November, 300-308.
Cognition and Technology Group at Vanderbilt (1990). Anchored instruction and its relationship to situated cognition. Educational Researcher, 19(6), 2-10.
Costanza, R. (1987). Simulation modeling on the Macintosh using STELLA. BioScience, 37(2), 129-132.
Draper, F. & Swanson, M. (1990). Learner-directed systems education: a successful example, System Dynamics Review, Vol. 6, No. 2, 209-213.
Ewel, K. C. (1989). Learning to simulate ecological models on a microcomputer. Ecological Modeling, 47, 7-17.
Feurzeig, W. (1992). Visualization tools for model-based inquiry. Paper presented at the Conference on Technology Assessment, Los Angeles.
Green, D. G. (1990). Syntactic Modeling and Simulation, Simulation, June, 281-286
Guerrin, F. (1991). Qualitative reasoning about an ecological process: Interpretation in hydroecology. Ecological Modeling, 59, 165-201.
Guzdial, M. (1995). Software-Realized Scaffolding to Facilitate Programming for Science Learning. Interactive Learning Environments, to appear.
Guzdial, M. (1993). Emile: software-realized scaffolding for science learners programming in mixed media. Unpublished Ph.D. dissertation, University of Michigan.
Hamming, R. W. (1962). Numerical Methods for Scientists and Engineers, New York, McGraw-Hill
High Performance Systems. (1992). Stella II: An Introduction to Systems Thinking.
Hutchins, E. L., Hollan, J. D., & Norman, D. (1986). Direct manipulation interfaces. In Norman, D., & Draper, S. (Eds.), User Centered System Design. Hillsdale, NJ: L. Erlbaum & Assoc.
Jackson, S., Stratford, S., Krajcik, J., & Soloway, E. (1995, March). Model-It: a case study of learner-centered software for supporting model building. Proceedings of the Working Conference on Technology Applications in the Science Classroom, The National Center for Science Teaching and Learning, Columbus, OH.
Karplus, W. (1983). The Spectrum of Mathematical Models, Perspectives in Computing, Vol. 3, No. 2, May, pp. 4-13
Kozma, R. (1991). Learning with Media. Review of Educational Research. 61(2) 179-211
Krajcik, J., Blumenfeld, P., Marx, R. W., & Soloway, E. (1994). A collaborative model for helping science teachers learn project-based instruction. Elementary School Journal, 94(5), 483-498.
Kreutzer, W. (1986). Systems Simulation: Programming Styles and Languages, Addison-Wesley, Wokingham, England
Mandinach, E., & Thorpe, M. (1988). The systems thinking and curriculum innovation project (Technical report).
Mandinach, E., & Cline, H. (1992). The impact of technological curriculum innovation on teaching and learning activities. Paper presented at the American Educational Research Association.
Marx, R. W., Blumenfeld, P., Krajcik, J. S., Blunk, M., Crawford, B., Kelly, B., & Mills, K. (1994). Enacting project-based science: experiences of four middle grade teachers. Elementary School Journal, 94, 517-538.
Miller, R., Ogborn, J., Briggs, J., Brough, D., Bliss, J., Boohan, R., Brosnan, T., Mellar, H., & Sakonidis, B. (1993). Educational tools for computational modelling. Computers in Education, 21(3), 205-261.
Mitchell, M. K., & Stapp, W. B. (1994). Field manual for water quality monitoring: an environmental education program for schools. (8th ed.). Dexter, MI: Thomson-Shore Printers.
Mokros, J. R., & Tinker, R. F. (1987). The impact of microcomputer-based labs on children's ability to interpret graphs. Journal of Research in Science Teaching, 24(4), 369-383.
Norman, D., & Draper, S. (Eds.). (1986). User Centered System Design. Hillsdale, NJ: L. Erlbaum & Assoc.
Papert, S. (1980). Mindstorms. (2nd ed.). New York: Basic Books.
Piaget, J. (1954). The construction of reality in the child. New York: Basic Books.
Resnick, L. B., & Glaser, R. (1976). Problem solving and intelligence. In L. B. Resnick (Ed.), The nature of intelligence. Hillsdale, NJ: Erlbaum.
Roberts, N. (1985). Model building as a learning strategy. Hands On!, 9(1), 4-7.
Roth, W.-M., & Bowen, G. M. (in press). Knowing and interacting: a study of culture, practices, and resources in a grade 8 open-inquiry science classroom guided by a cognitive apprenticeship metaphor. Cognition and Instruction.
Saarenmaa, H., Stone, N. D., Folse, L. J., Packard, J. M., Grant, W. E., Maleka, M. E., & Coulson, R. N. (1988). An Artificial Intelligence Modeling Approach to Simulating Animal/Habitat Interactions, Ecological Modeling , 44, 125-141
Salomon, G. (1990). Cognitive effects with and of computer technology. Communication Research, 17(1), 26-44.
Salski, A. (1992). Fuzzy Knowledge-Based Models in Ecological Research, Ecological Modeling , 63, 103-112
Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. Journal of the Learning Sciences, 3(3), 265-283.
Silvert, W. (1993a). Object-oriented ecosystem modelling. Ecological Modelling, 68, 91-118.
Silvert, W. (1993b). The distributed derivative: an aid to modular modelling. Ecological Modelling, 68, 293-302.
Soloway, E., Jackson, S., Klein, J., Quintana, C., Reed, J., Spitulnik, J., Stratford, S. J., & Studer, S. (1995). Learning theory in practice: case studies of learner-centered design. Paper submitted to Designing Interactive Systems, Ann Arbor, MI.
Soloway, E., Guzdial, M., & Hay, K. E. (1994). Learner-centered design: the challenge for HCI in the 21st century. Interactions, 1(2), 36-48.
Swartzman, G., and Kaluzny, S. (1987). Ecological Simulation Primer, Macmillan Publishing Company, New York, NY
Tinker, R. (1990). Teaching theory building: Modeling: Instructional materials and software for theory building. The Technical Education Research Centers, Inc.
von Glasersfeld, E. (1989). Cognition, construction of knowledge, and teaching. Synthese, 80, 121-140.
Vygotsky, L. S. (1978). Mind in society: the development of higher psychological processes. Cambridge, MA: Harvard University Press.
White, B. Y., & Frederiksen, J. R. (1990). Causal model progressions as a foundation for intelligent learning environments. Artificial Intelligence, 42, 99-157.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17, 89-100.
Model-It provides a constrained control structure: two types of relationships with which users can define how one factor affects another. The functionality of those relationships (immediate and rate) was carefully chosen to support the basic essentials for implementing models, so that students can achieve the goals associated with model building without the overhead of learning complex mathematics or a programming language.
Immediate relationships are of the form y = f(x); they are used to define a relationship in which the value of the affected factor is immediately calculated based on the value of the causal factor. Immediate relationships represent constraint systems whose mathematical analogs are collections of simultaneous algebraic equations (Miller et al., 1993). That is, for a simulation defined entirely by immediate relationships, the values of all dependent factors are calculated from the values of the independent factors (state variables), and define the state of the model. Nothing changes until the user gives a new value to an independent factor, putting the model into a new state. This form of modeling permits the simplest exploration of basic modeling concepts such as chains of relationships (e.g. the sunlight affects the photosynthesis level, which affects the oxygen production of the plants, which affects the oxygen level of the stream, which affects the quality of the stream), and combinations of relationships (e.g. the scenario, in which stream phosphate and stream oxygen both affect stream quality). Through the use of immediate relationships, middle and high school learners can easily define and explore models to gain an understanding of basic modeling concepts.
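To illustrate, a chain of immediate relationships behaves like ordinary function composition: every dependent factor is recomputed from the independent factor each time it changes. The following Python sketch is purely illustrative; the factor names and the (linear) functions are invented, since Model-It defines such relationships qualitatively rather than in code:

```python
# Illustrative sketch only: a chain of immediate relationships,
# each of the form y = f(x). All functions here are invented.

def photosynthesis(sunlight):
    # sunlight affects the photosynthesis level
    return 0.8 * sunlight

def oxygen_production(photo_level):
    # photosynthesis affects the plants' oxygen production
    return 0.5 * photo_level

def stream_oxygen(produced):
    # oxygen production affects the stream's oxygen level (capped)
    return min(100.0, produced)

def stream_quality(oxygen):
    # oxygen level affects the quality of the stream
    return oxygen

def evaluate(sunlight):
    """Recompute every dependent factor from the independent factor.
    Nothing changes until the user supplies a new sunlight value."""
    return stream_quality(
        stream_oxygen(oxygen_production(photosynthesis(sunlight))))

print(evaluate(50.0))   # the model's state for sunlight = 50
print(evaluate(100.0))  # a new state, after the user changes sunlight
```

The key property mirrored here is statelessness: the model simply sits in whatever state the independent factors determine, until the user moves a slider.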
Rate relationships define feedback equations of the general form y(t+1) = y(t) ± x. This equation can also be considered as a discrete time step approximation of the linear differential equation dy/dt = ±x, where x is the rate of change of y. Rate relationships are implemented slightly differently for objects that represent populations, such as mayflies. For rate relationships involving populations, we want to take into account the impact of each organism in the population, so that the form of the equation becomes dy/dt = ±(x * n), where n is the count.
For our example, to model the rate of growth (rg) of the population (n), the equation becomes dn/dt = rg * n. At each time step, multiply the population's rate of growth by the population's count, and add that product to the count. This is still a linear differential equation, but the function it defines is exponential. We present this concept to the user with a qualitative sentence such as: At each time step, and for each Mayfly, add the Mayfly rate of growth to the Mayfly count (Figure 4). The user chooses the sign of the relationship, either add or subtract.
Rate relationships provide support for exploring dynamic, time-based models. They represent the basic flow equations that are the basis for linear models, the simplest dynamic models used to represent ecosystems, and the first to be taught in a typical collegiate simulation modeling textbook (Swartzman & Kaluzny, 1987). The relationship between the rate of growth and count of a population, in particular, supports the exploration of the process of exponential growth, without the student having to first express it as an equation.
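As a rough illustration of how a linear rule stepped repeatedly produces exponential growth, consider this Python sketch; the starting count and the 10% growth rate are invented numbers, not values from the studies:

```python
# Illustrative sketch: a population rate relationship dn/dt = rg * n,
# stepped discretely as n(t+1) = n(t) + rg * n(t).

def step_population(count, rate_of_growth):
    """At each time step, and for each organism, add the rate of
    growth (times the count) to the count."""
    return count + rate_of_growth * count

count = 100.0
for _ in range(5):
    count = step_population(count, 0.1)  # 10% growth per time step

# Repeated linear steps yield exponential growth: count is
# approximately 100 * 1.1**5
print(round(count, 2))  # prints 161.05
```

This is exactly the behavior the student can observe on a meter or graph, without ever writing the equation down.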
To make calculations based on the relationships, we first convert any qualitative definitions into quantitative functions [1]. Specifically, the text-based immediate relationships are converted into the functions presented by their associated graphs, and scaled to the defined range of values for each factor; e.g. Figure 3 shows the curve associated with decreases by less and less scaled to the phosphate range of 0 to 10 and quality range of 0 to 100. The functions are stored internally as 11 data points (for the 10x10 grid used by the graph view). Linear interpolation between these points is used to draw the graph, and to calculate the function for arbitrary input values.
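The stored-points-plus-interpolation scheme might be sketched as follows. The function below is our own illustration of the idea, and the curve values are invented stand-ins for an actual `decreases by less and less' curve scaled to a phosphate range of 0 to 10 and a quality range of 0 to 100:

```python
# Illustrative sketch: evaluating a relationship stored as 11 data
# points (the 10x10 grid), with linear interpolation between points.

def interpolate(points, x_min, x_max, x):
    """Linearly interpolate among len(points) - 1 equal segments."""
    n = len(points) - 1                    # 10 segments for 11 points
    t = (x - x_min) / (x_max - x_min) * n  # position in grid units
    i = min(int(t), n - 1)                 # index of the segment start
    frac = t - i                           # fraction along that segment
    return points[i] + frac * (points[i + 1] - points[i])

# Invented curve: quality (0-100) falls steeply then levels off as
# phosphate (0-10) rises, in the spirit of `decreases by less and less'.
quality_curve = [100, 70, 50, 37, 28, 22, 18, 15, 13, 11, 10]

print(interpolate(quality_curve, 0, 10, 0))    # endpoint
print(interpolate(quality_curve, 0, 10, 0.5))  # halfway along segment 0
print(interpolate(quality_curve, 0, 10, 10))   # other endpoint
```

The same interpolation serves double duty, as in Model-It: it draws the curve in the graph view and evaluates the function for arbitrary slider values.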
Objects maintain lists of their associated factors, and factors keep track of their initial value and current value. Factors also keep track of which relationships they cause. When a simulation is running, the Modeler's central Controller cycles through a loop, such that in each cycle (represented as a time step), it executes two functions:
(1) Firing relationships: The Controller tells each object in the world to fire its relationships, and each Object passes the message on to each of its factors, which then fire all of their causal relationships. When a relationship is fired, it calculates a new value for the affected factor (based on the equations described above), and tells the affected factor its new value. The affected factor stores the new value in a list.
(2) Calculating new values: Once all relationships have been fired, the factors calculate their new value, if any, from the stored list of new values. If there is more than one new value in the list (because the factor was affected by multiple factors), the new values are resolved as follows, depending on whether the factor was affected by immediate relationships or rate relationships (the same factor cannot be affected by both immediate and rate):
If affected by immediate relationships, the factor sets its current value to the average of the new values. This means of combination is a constraint on the possible combinations of factors, but it is still useful for building simplified models where the emphasis is on exploring systems with many interacting factors, e.g., the water quality example described in the scenario, which is based on an average of the effects of the different chemical tests.
If affected by rate relationships, the factor adds or subtracts the new values to or from its current value. This approach, building up the differential equation from separately generated terms, is essentially the distributed derivative recommended by Silvert (1993b).
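The two-phase cycle and the two resolution rules might be sketched as follows. The class and method names are our own illustration, not Model-It's actual implementation:

```python
# Illustrative sketch of the two-phase update cycle: fired
# relationships post proposed values, then each factor resolves
# its stored list (average for immediate, summed deltas for rate).

class Factor:
    def __init__(self, value, kind="immediate"):
        self.value = value
        self.kind = kind      # "immediate" or "rate", never both
        self.pending = []     # new values posted during this time step

    def post(self, new_value):
        # phase 1: a fired relationship posts a proposed value
        self.pending.append(new_value)

    def resolve(self):
        # phase 2: combine the posted values into the new current value
        if not self.pending:
            return
        if self.kind == "immediate":
            # multiple immediate effects combine as an average
            self.value = sum(self.pending) / len(self.pending)
        else:
            # rate effects add (or subtract) their deltas
            self.value += sum(self.pending)
        self.pending = []

# Two immediate relationships affect stream quality in one time step:
quality = Factor(50.0)
quality.post(80.0)    # e.g. the effect of oxygen
quality.post(40.0)    # e.g. the effect of phosphate
quality.resolve()
print(quality.value)  # averaged: 60.0

# Two rate relationships (growth and death terms) affect a count:
mayflies = Factor(100.0, kind="rate")
mayflies.post(10.0)   # growth term adds 10
mayflies.post(-3.0)   # death term subtracts 3
mayflies.resolve()
print(mayflies.value) # 100 + 10 - 3 = 107.0
```

Deferring the resolution until every relationship has fired ensures that all relationships in a time step see the same consistent set of factor values, regardless of firing order.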
[1]The growing field of qualitative reasoning (see Bobrow, 1984 for an overview) suggests an alternative approach in which qualitative systems are simulated through artificial intelligence techniques such as constraint propagation and logic-based expert systems.