SECONDARY STUDENTS' DYNAMIC MODELING PROCESSES: ANALYZING, REASONING ABOUT, SYNTHESIZING, AND TESTING MODELS OF STREAM ECOSYSTEMS

Steven J. Stratford, Maranatha Baptist Bible College, Watertown, WI

Joseph Krajcik, HI-C Research, University of Michigan

Elliot Soloway, HI-C Research, University of Michigan

Introduction and Objectives

How often do students in our secondary science classrooms really have the opportunity to think about the content they are supposed to learn? David Perkins and others have suggested that learning is a consequence of thinking, and that understanding that goes beyond the information given (Bruner, 1973) comes about through using knowledge in performances of understanding (Perkins, 1992; Gardner, 1991). But how can educators provide students with opportunities to reflect upon science content? And what might processes of thinking about science content look like? These are questions that have driven our research into dynamic modeling in science classes. The research reported here, focusing on students' dynamic modeling processes (“Cognitive Strategies for Modeling”), is part of a larger study of dynamic modeling in secondary science classrooms (Stratford, 1996b; Stratford, Krajcik, & Soloway, 1997), in which we have investigated processes of students' dynamic modeling efforts, the products of those efforts, and relationships between process and product. In this paper we focus on the students' modeling processes, presenting results of a study of the cognitive strategies in which ninth-grade science students engaged as they used a learner-centered dynamic modeling tool (called Model-It) to make original models based upon stream ecosystem scenarios.

Problem and Rationale

Systems thinking was formalized many years ago (Forrester, 1968) and has often been promoted since then as a way of thinking about and understanding complex systems. Several computer tools were developed to support systems thinking through the creation of dynamic models (notably DYNAMO and STELLA). Attempts have been made to introduce systems thinking and dynamic modeling to middle and secondary school students (Roberts, Andersen, Deal, Garet, & Shaffer, 1983), but research is scarce. We do know that it is difficult and time-consuming to engage all students in dynamic modeling (Roberts & Barclay, 1988; Mandinach & Cline, 1994; Schecker, 1993). However, with microcomputers becoming more readily available to science students, with the increasing processing power of those computers (Soloway & Pryor, 1996), and with developments in the theory and implementation of learner-centered software (Jackson, Stratford, Krajcik & Soloway, 1996), the opportunity exists to try again.

Why is dynamic modeling in science classes a significant area of exploration? First, many science curriculum topics dealing with systems, such as ecology, weather, climatology, and biology, may be enhanced by creating, manipulating, and exploring computer models of those systems (Roberts, et al., 1983). Other curricula with systems-related content, such as history or economics, may also be enhanced with computer models. Second, creating dynamic models should engage students in combining isolated, fragmented, inert knowledge about poorly-understood concepts and relationships into larger, more clearly-understood constructs by allowing them to re-present, reconstruct, and explore that knowledge within a computer model. Third, creating models should provide students with opportunities to think about and discuss scientific phenomena: breaking them down into pieces, considering how (and why) those pieces are related, incorporating those pieces into computer models, and verifying those models by comparing their behavior to reality (Stratford, Krajcik, & Soloway, 1996). Finally, creating models may allow students to come face-to-face with fundamental issues concerning scientific models, such as their accuracy, limitations, and usefulness (Gilbert, 1991; Stratford, 1996a). All of these reasons suggest that constructing models, and in particular constructing dynamic models, may help students better understand the science content we want them to learn.

Our exploration of dynamic modeling has focused on the cognitive strategies in which students engage as they create dynamic models, strategies we have called “Cognitive Strategies for Modeling.” These strategies include analyzing, relational reasoning, synthesizing, testing/debugging, and explaining. In the process of creating a model, one would expect to see someone analyzing the phenomenon being modeled, breaking it down conceptually into relevant, related parts. Relationships (usually causal) between those parts have to be reasoned out, identified, and clearly defined. As the model is then created with the computer, those parts, and the relationships between them, are synthesized conceptually back together into a computerized representation of the phenomenon. To verify that the model works as intended, and that its behavior matches that of the phenomenon, the model should be thoroughly tested and debugged. And throughout the process of building and testing a model, explanations for why parts are related (that is, explanations for the mechanisms underlying the causal relationship between two parts) certainly exist in the mind of the builder(s); otherwise the model would be arbitrary. Such explanations may be articulated orally or documented in written form.

Research Questions

The research questions of this study, then, were as follows: In what Cognitive Strategies for Modeling do ninth-grade science students engage as they use a learner-centered dynamic modeling tool to create original models based upon stream ecosystem scenarios? And what are the characteristics and quality of those strategies?

Setting

The participants in our study were 16 ninth graders enrolled in a public school in a midwestern college town. They were chosen by their 3 science teachers for a range of characteristics (ability, gender, and race) roughly representative of the group of 100 ninth grade students from which they were drawn. They were also selected for being likely to cooperate with data collection procedures (daily videotape recordings and pre-/post-interviews), for being able to work well with others, and for being relatively “talkative” (to ensure rich videotape and interview data). The ninth grade science class in which they were enrolled followed a curriculum called Foundations of Science (Huebel-Drake, et al., 1995), with the goal of engaging students in long-term inquiry into non-trivial driving questions. In the three months prior to the research activities reported in this paper, students investigated an authentic, meaningful question: “Is our water safe?” It was authentic because they used a local creek for their investigatory activities, and it was meaningful because the water flowing in that creek was part of the watershed from which their town's drinking water was obtained. The class and curriculum were enhanced by ubiquitous computer technologies (portable computers, networks, and printers in the laboratory; digital data collection and display devices; off-the-shelf productivity software; and custom-designed and -programmed research software). Their investigations included chemical assessments (collecting and testing water samples), conducting biological and physical habitat surveys and assessments, and reporting their results to peers, to their teachers, and to the community. Most of their project work was done in groups using computers, so they were used to working with others and were reasonably proficient with computer operations. They did not, however, have any formal classroom instruction about models in science or about dynamic modeling.

Method, Categories, and Analysis Procedures

Method

Model-It, described in Appendix A and elsewhere (Jackson, et al., 1996; Stratford, 1996b), provided the dynamic modeling computer environment. Eight pairs of students (8 male, 8 female; 3 African American, 1 Asian, 4 Caucasian; 2 mixed-gender pairs; 3 male and 3 female same-gender pairs) were chosen as focus groups whose conversations and actions on the computer were videotaped throughout the study. Participants, as part of their classroom activities, used a written guide along with Model-It on the computer for 6 to 8 daily 50-minute class periods. [Note that a range is given here because some students finished working through the guide more quickly than others.] The purpose of the guide was to help them learn how to use the software to make dynamic models, and it was written in such a way as to require mindful, directed activity with the software. Then, during the following 2 or 3 class periods, they created models based upon their choice of five stream ecosystem scenarios (or a model of their own choice). For example, one scenario suggested a model of cultural eutrophication: “When excess phosphorus from human sources is added to a stream (cultural eutrophication), algae blooms can result. Build a model that includes algae and bacteria population objects, along with stream factors such as dissolved oxygen and total phosphorus. Also include a possible source of the phosphorus in your model.” In like manner, each scenario briefly described an ecological phenomenon (e.g., cultural eutrophication) and suggested a couple of objects or factors to help students get started in their analysis. Their teachers communicated an expectation that students should attempt to enter explanations and descriptions into the appropriate explanation and description boxes provided in the software, for all of the objects, factors, and relationships they included in their model. The videotape from the independent modeling sessions (about 11 hours of total footage) comprised the data for our study.
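
To give a concrete sense of the kind of artifact the eutrophication scenario asks for, the sketch below expresses a toy version of such a model in Python. It is our illustration only, not Model-It's implementation: the factor names follow the scenario text, but the 0-100 scales (Model-It's default range) and the update coefficients are assumptions chosen for readability.

```python
# A minimal, illustrative dynamic model of the cultural eutrophication
# scenario: phosphorus from a human source feeds an algae bloom, and
# growing bacteria draw down dissolved oxygen.
# (Our sketch, not Model-It's implementation; coefficients are arbitrary.)

factors = {
    "fertilizer_runoff": 60.0,   # a possible human source of phosphorus
    "total_phosphorus": 0.0,
    "algae": 10.0,               # population object
    "bacteria": 10.0,            # population object
    "dissolved_oxygen": 90.0,
}

def clamp(x):
    """Keep every factor on a 0-100 scale."""
    return max(0.0, min(100.0, x))

def step(f):
    """One time step: an immediate effect first, then rate effects."""
    # Immediate relationship: phosphorus tracks the runoff source.
    f["total_phosphorus"] = clamp(f["fertilizer_runoff"])
    # Rate relationships: more phosphorus -> algae grow faster;
    # more algae -> bacteria grow faster; more bacteria -> DO drops.
    f["algae"] = clamp(f["algae"] + 0.05 * f["total_phosphorus"])
    f["bacteria"] = clamp(f["bacteria"] + 0.05 * f["algae"])
    f["dissolved_oxygen"] = clamp(f["dissolved_oxygen"] - 0.05 * f["bacteria"])

for t in range(20):
    step(factors)
print({name: round(value, 1) for name, value in factors.items()})
```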

Categories of Analysis

The five Cognitive Strategies for Modeling we have associated with dynamic modeling are analyzing, relational reasoning, synthesizing, testing/debugging, and explaining. The modeling-related meanings for each of these categories are found in Table 1, along with examples of the kinds of behaviors we took as evidence for those strategies. Here we briefly discuss the contents of the table.

Analyzing strategies include statements or actions in which students divide the scenario or phenomenon into parts, identify important components, or in which they attempt to make sense of or pass judgment on their model's behavior. Analyzing strategies, then, are statements or actions such as: identifying factors or objects, creating Factors or Objects in Model-It, making judgments about the difference between parts, interpreting the model's behavior, drawing conclusions about the model, or critiquing what works and what doesn't.

Relational reasoning strategies consist of statements or actions related to reasoning about the relationships between parts of the scenario or phenomenon, reasoning about the relationships between factors or objects in the model, or making a reasoned prediction about the behavior of a model. Reasoning strategies include statements or actions such as: creating relationships with the Relationship Maker, making cause and effect statements, discussing or selecting relationships, and predicting what should happen when the model runs.

Synthesizing strategies are statements or actions related to viewing the content, behavior, or form of a model as a whole, or to making connections between previously unconnected ideas. This includes the following strategies: deciding how the model should work as a whole, discussing or commenting on the model's representation in the Factor Map or in a concept map, and making connections between ideas (e.g., realizing that factors are related).

Testing and debugging encompasses strategies related to verifying that a model works, or to figuring out why it doesn't work. It includes the strategies of testing the model using Model-It's built-in testing facilities, and of changing existing factors and relationships through additions, deletions, or modifications.

Finally, explaining strategies are associated with talking or writing about why a relationship exists, that is, about the reason(s) why one factor causes changes in another. So explaining strategies are those that involve telling why or how parts of a phenomenon are related (or typing them into an explanation box in Model-It), illustrating statements with examples, stating some supportive evidence or justifying an argument logically, elaborating on or demonstrating ideas, or giving witness to something they have personally experienced or observed.

Analyzing
- identifying factors or objects: Students talk about factors or objects, discuss which are relevant (or not relevant) to their scenario; they discuss relevant minimum, maximum, or initial values.
- creating factors/objects in the Factor Factory/Object Editor: Students create factors or objects using the Factor Factory or Object Editor.
- making judgments (comparing and contrasting): Students talk about how things are alike or different (e.g., discussing whether a relationship should be “about the same” or “more and more”); students compare their model to the real world.
- interpreting the model's behavior when testing: Students discuss what the behavior they're observing means, either in terms of the factors and relationships they created, or in terms of how the real world works.
- drawing conclusions: Students discuss an issue and come to some conclusion.
- critiquing what works: Students make comments like “it's not working” or “it's working” and talk about what they think is right or wrong with it; they make reference to whether their model is accurate or realistic.

Relational Reasoning
- creating relationships in the Relationship Maker: Students create relationships using the Relationship Maker (immediate or rate, text or table view).
- making cause and effect statements: Students say “this affects that” or “this makes that go up or down”; they use words like “increases,” “decreases,” “a causes b,” “makes more,” “makes less.”
- discussing or selecting relationships: Students select which relationships they should include in their model; they discuss possible relationships to include (or exclude); they discuss whether a relationship should be immediate or rate.
- predicting what should happen: Students say things like “it should do ... when we run it” or “it's going to ...”; or, they say “it didn't do what I thought it would.”

Synthesizing
- deciding how the model should work as a whole: Students discuss the model as a whole (e.g., “our model shows how weather affects stream depth”); they remind themselves about what their model is supposed to do.
- discussing or commenting on the model's representation in the Factor Map: Students look at their Factor Map and discuss the overall shape (e.g., “look, these make a long chain,” or “in our model the main factor is ...”) or the configuration of relationships (e.g., “look, this factor depends on everything else”).
- discussing/commenting on the Concept Map representation: Same as previous, only in reference to the Concept Map.
- connecting ideas: Students discover or think of relationships between factors that they hadn't considered before (e.g., “I wonder if there's a relationship between ...”).

Table 1. Cognitive Strategies for Modeling (CSMs): categories, criteria, and examples for analyzing, relational reasoning, and synthesizing.

Testing and Debugging
- testing the model: Students run their model with meters and/or graphs after they have constructed one or more relationships.
- trying possible solutions: Students are not satisfied with their model's behavior (either they want to improve it or they think something is wrong with it), so they modify something and re-test it.

Explaining
- explaining why or how parts are related (causally or correlationally): Students talk about how or why a relationship works, using words like “because” (e.g., “a affects b because c”).
- giving examples: Students give examples with their explanations (e.g., “things in the weather such as clouds, rain, sun, and wind affect the stream”).
- stating evidence: Students refer to data, experience, or common sense to support an explanation.
- justifying an argument: Students make a logical argument to support an idea or explanation.
- elaborating or demonstrating ideas: Students restate an idea or demonstrate it on the computer.
- describing what was observed: Students observe something and describe what they saw (e.g., “It's like ...” or “I saw it do ...”).

Table 1 (continued). Cognitive Strategies for Modeling (CSMs): categories, criteria, and examples for testing and debugging and explaining.

Each category of cognitive modeling strategies may be identified in students' conversations or actions according to various criteria. The third column of Table 1 shows the criteria for each strategy; what was taken as evidence for each Cognitive Strategy for Modeling is discussed in the following paragraphs.

Analyzing

Students analyze a phenomenon when they identify or create parts (factors or objects), when they compare and contrast parts, when they interpret what is happening as they test their model, when they draw conclusions, or when they critique their model. First, in Model-It, when students discuss potential factors or objects for their model, considering whether or not they are relevant to the scenario they are trying to model, they are analyzing the scenario. Also included in the analysis category are behaviors such as discussing or choosing minimum, maximum, or initial values, because these actions involve considering factors as parts of a model or parts of a scenario. So instances in which students talked about factors or objects in their model, discussed which were relevant or not relevant to their scenario, or discussed relevant minimum, maximum, or initial factor values were taken as analysis behaviors. Second, in Model-It, the Factor Factory or Object Editor is the tool used to create factors and objects to represent parts of a phenomenon in a model; therefore, when students use the Factor Factory or the Object Editor to create factors or objects, they are analyzing. Third, students might make comparisons between similar parts, such as different kinds of fertilizers, analyzing them as to their similarity or difference. So when students talk about how things are alike or different, discuss whether a relationship should be “about the same” or “more and more,” or compare their model to the real world, they are likewise analyzing the scenario. Fourth, as students run a model, they view the output on meters or graphs; they must interpret this information, analyzing it and relating it back to the parts of the model they defined that give rise to that behavior. Thus, when students discuss what the behavior they observe means, either in terms of the factors and relationships they created in their model, or in terms of how the real world works, they are analyzing. Fifth, during the process of building a model, students may deal with modeling issues and make decisions about, for example, how detailed their model should be, how accurate it should be, or how important one part is in relation to another. So occasions in which students discuss some issue related to modeling and reach some conclusion or decision may also be related to their analysis of the phenomenon. Finally, as students run their model, interpreting its behavior, they pass judgment on it by analyzing how it behaves and comparing it to their idea of how it should work. So when students make comments like “it's (not) working,” comment on what they think is right or wrong with the model, or make reference to whether their model is accurate or realistic, they are engaging in analyzing behavior.

Relational Reasoning

Students engage in relational reasoning cognitive strategies when they discuss, select, or create relationships, make cause and effect statements, or predict what will happen when a model runs. First, before actually creating a relationship with Model-It, students determine whether a cause-effect relationship exists between parts of a scenario. Prior knowledge upon which to base that determination may be obtained from any number of sources, such as textbooks, conversations with the teacher, or reference materials. Whatever the source, students must have enough information about the relationship to be able to tell which factor affects the other, whether increases in one lead to increases or to decreases in the other, and perhaps how effects vary over a range of possible values. In this data analysis, when students said that something “affects” something else, talked about how something makes something else go up or down, or used words like “increases,” “decreases,” A “causes” B, “makes more,” or “makes less,” they were engaging in relational reasoning. Second, as students work together to create their model, they carry on conversations about the relationships they are considering putting into their model. For example, they may discuss whether a relationship is even needed for the scenario they are modeling, or they may consider a number of possible relationships before making a decision. So when students select a relationship to include in their model, or when they discuss possible relationships to include or exclude, they are reasoning about relationships. Third, in order to create a relationship in Model-It, students must first create at least two factors and reason that one factor causes changes in another. Once they create the relationship, they have to reason out additional decisions, such as whether the first factor affects the second over time (as a rate) or immediately, or whether the pre-programmed immediate relationships are inadequate, necessitating the table view. So when students use the Relationship Maker in Model-It to create relationships (immediate or rate, text or table view), they are also reasoning relationally. Finally, as students create relationships in their model, they eventually test it. Sometimes they make a prediction about their model's behavior, based either upon how they reason it should work in order to model the scenario appropriately, or upon how they reason it should work given the way they created the model. Thus, when students say things like “it should do ... when we run it” or “it didn't do what I thought it would,” they are engaging in strategies of relational reasoning.
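
The immediate-versus-rate distinction that this reasoning turns on can be made concrete in a few lines of code. The sketch below is ours, not Model-It's implementation: an immediate relationship recomputes the affected factor from the causing factor's current value, while a rate relationship uses the causing factor to change the affected factor a little on every step.

```python
# Illustrative contrast between the two relationship types students
# chose between (our sketch; Model-It's internals may differ).

def run(kind, steps=5):
    cause, effect = 50.0, 0.0
    for _ in range(steps):
        if kind == "immediate":
            # The effect is recomputed from the cause on each step.
            effect = cause
        else:  # "rate"
            # The cause sets how fast the effect accumulates over time.
            effect += 0.1 * cause
    return effect

print(run("immediate"))  # stays at 50.0 however long the model runs
print(run("rate"))       # 25.0 after 5 steps, and still growing
```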

Synthesizing

As students engage in the process of creating a model, they have to focus their attention on how it will be constructed and how it will work: by making decisions about how the model should work as a whole, by looking at the Factor Map (or perhaps another conceptual representation such as a concept map) and contemplating its form or configuration, and by making connections between ideas. First, in this study, because students created models based upon scenarios (which they had learned about in FOS sometime prior to the modeling sessions), their model should mirror the behavior described in the scenario. A goal of the modeling task for each group, then, was to create “a model” that exhibited certain behavior; if students view their model as a conceptually complete (synthesized) entity, they also talk about it as a whole entity. For example, they might say that a certain factor was central to their model, that their model reminded them of something (such as a chain), or that it behaved in a certain way. They might also carry on conversations in order to decide the kind of overall behavior they would like their model to exhibit, before it was even constructed. Thus, when students discuss their model as a whole (e.g., “in our model the main factor is ...”) or talk about what their model is supposed to do (e.g., “our model shows how ...”), they are synthesizing. Second, in the task in this study, students were asked to first create a “concept map” to lay out the main concepts and relationships in their model. Then they took the ideas in their map and represented them in Model-It. The process of creating the map, examining it, and then incorporating some or all of their ideas into the model might generate discussion or commentary about their model as a whole. Additionally, as their model took shape and they viewed it with the Factor Map, similar discussions or comments might take place. The discussions or comments might be about the model's appearance in the Factor Map, or about how factors or relationships are configured spatially in relation to each other. So when students look at either of those representations (concept map or Factor Map) and discuss the overall shape (e.g., “our model looks like a long chain”) or the configuration of relationships (e.g., “this factor depends on everything else”), they are engaging in a strategy of synthesizing. Finally, as students analyze their scenario and create their model, they generate ideas about possible factors and relationships to include. Those ideas may be unrelated early in a brainstorming session, but later may be connected as concepts are recalled from memory or looked up. For example, students may initially omit a relationship between two factors, but after looking at the Factor Map may remember that one factor affects the other. Thus, when students discover or think of relationships between factors they hadn't considered before (e.g., “I wonder if there's a relationship between ...” or “oh, yeah, factor A affects factor B”), they are synthesizing.
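
The “shape” observations described here (a long chain, a factor that depends on everything else) correspond to simple structural properties of the factor graph. The following sketch, with an invented dependency map, shows the kind of check a Factor Map invites students to make visually:

```python
# Illustrative check of a model's overall shape, of the kind students
# eyeballed in the Factor Map (the dependency data here is invented).

# Map each factor to the factors that affect it.
affected_by = {
    "runoff": ["rainfall"],
    "phosphates": ["runoff"],
    "algae": ["phosphates"],
    "dissolved_oxygen": ["algae"],
}

in_degrees = [len(causes) for causes in affected_by.values()]
out_counts = {}
for causes in affected_by.values():
    for cause in causes:
        out_counts[cause] = out_counts.get(cause, 0) + 1

# "A long chain": every factor has at most one cause and one effect.
is_chain = (all(d <= 1 for d in in_degrees)
            and all(n <= 1 for n in out_counts.values()))
print("looks like a chain" if is_chain else "looks like a web")
```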

Testing and Debugging

Students engage in testing and debugging strategies when they test their model using the testing facilities built into Model-It, and when they find their model isn't working as they expect and try to figure out the problem. First, after creating a few factors and at least one relationship with Model-It, students test the model by displaying meters (to see the immediate values of a factor) and/or graphs (to see the values of a factor plotted over time). As the model runs, they manipulate independent factors (factors not affected by any other factors) by moving sliders on meters up or down and observing the effects on the dependent factors. Thorough testing might consist of running the model several times with each independent variable set at a range of values, whereas incomplete testing might include only a few factors or only portions of their ranges. Sometimes a particular test might be repeated in order to observe the model's behavior more carefully. Thus, when students ran their model using meters and/or graphs, they were engaging in testing strategies. Second, in the course of testing their model, students might encounter situations in which their model does not behave as expected. The sources of such a discrepancy are many: they may have inadvertently created a relationship that didn't match their conceptual idea; they may not have understood how to use the Relationship Maker to create, say, a rate relationship; or the combination of relationships working together as a model runs may produce unexpected results. These discrepancies essentially pose problems for students to solve by debugging their model. In order to solve these problems, they might review their relationships, make modifications, and then re-test; or they might remove, replace, or add relationships. Taking a model as it currently exists and trying to improve it, or adding different or additional behaviors, is also a problem that may be solved by debugging. Thus, situations in which students were apparently not satisfied with the behavior of their model (either wanting to improve it, or thinking that something was wrong with it) often led to a debugging strategy of modifying something (e.g., changing or adding a relationship, changing a factor's range, etc.) and testing the changes.
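
Expressed in code, thorough testing of this kind amounts to sweeping each independent factor across its range and observing where the dependent factors end up. A hypothetical sketch with a one-relationship toy model (students did this interactively by dragging sliders and reading meters):

```python
# Sketch of systematic testing: run the model repeatedly with an
# independent factor fixed at several points across its range, and
# record where the dependent factor ends up. (Illustrative only.)

def run_model(rainfall, steps=30):
    """Toy one-relationship model: rainfall drives runoff as a rate."""
    runoff = 0.0
    for _ in range(steps):
        runoff = min(100.0, runoff + 0.05 * rainfall)  # capped at 100
    return runoff

# Sweep the independent factor over its full 0-100 range.
for rainfall in (0, 25, 50, 75, 100):
    print(f"rainfall={rainfall:3d} -> runoff={run_model(rainfall):6.1f}")

# Incomplete testing (e.g., trying only rainfall=50) would miss the
# saturation at the top of the range, where 75 and 100 both yield 100.0.
```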

Explaining

As students create their model, they may reason about how parts are related, but their reasoning is based upon some underlying explanation for that relationship. They sometimes talk about, discuss, and even argue about how or why parts are related, providing evidence, making a logical argument, elaborating on their ideas, demonstrating what they mean on the computer, or describing what they have observed or are observing. First, Model-It provides an “explanation” text field in which students may type an explanation for a relationship. Sometimes they articulate explanations orally as they discuss relationships between parts, and sometimes they do so before or while they type something into the “explanation box.” Sometimes the explanation is articulated as they analyze their scenario or as they reason about relationships. Thus, they engage in explaining strategies when they make statements about how or why a relationship works, using words like “because” (for example, in a construct such as “A affects B because of some reason”). Second, in the process of articulating an explanation, students might enrich their explanation with some examples. Doing so has the effect of connecting the explanation to other relevant concepts or ideas, and might be considered a demonstration of integrated knowledge. So when students made statements in which they gave examples with their explanations, using words such as “such as” (e.g., “things in the weather such as clouds, rain, sun, and wind affect the stream”), they were engaging in explaining strategies. Third, as students articulate explanations, they may be compelled to state evidence in support of their explanation. For example, they may point out trends or patterns in data they collected; they might share an experience; they might use common-sense reasons as evidence. Thus, when students referred to data, experience, or common sense to support an explanation, they were adding to their explanation. Fourth, in addition to stating evidence, students may justify their reasoning by making an argument in support of it. In situations in which students made a logical argument to support an idea, line of reasoning, or explanation, using constructs of the general form “because X is true, then Y must be true,” they were also explaining. Fifth, in the course of giving explanations, students may find it necessary to restate their explanation for a relationship, for example, if the person they are talking with disagrees or doesn't seem to understand. Or, they may support their explanation with a demonstration of some kind, perhaps using the computer or drawing a figure to show what they mean. So restating an idea they had previously stated, or demonstrating an idea somehow (usually with the computer), is also an explaining strategy. Finally, as students work on their model, perhaps testing a relationship they just created, they may see something happen on the screen and describe what they saw to their partner. This allows them to confirm their observation with the other person. So in situations in which students verbalized what they saw happening on the computer screen, they were engaging in explaining strategies.

Analysis Procedures

The goal of the data analysis was to create narratives capturing the characteristics and quality of each focus group's Cognitive Strategies for Modeling. To that end, the analysis proceeded through several stages. First, in the descriptive stage, transcriptions of conversations were annotated to include non-verbal interactions with the computer and then divided into episodes according to shifts in modeling activity (such as when shifting from creating a relationship to testing the model). Then, for each episode, a descriptive account was composed to summarize the main happenings in that episode. Next, we iteratively analyzed each descriptive account for instances in which students engaged in Cognitive Strategies for Modeling, a process that involved writing, refining, and categorizing narratives, discussing analyses with colleagues, and producing summaries. In each narrative, the episode's happenings were interpreted in terms of the modeling strategies in which the students were apparently engaging. The goal of the final, synthetic phase was to identify patterns in the entire analysis, in order to compose a story about the strategies in which each pair of students engaged as they constructed their model. The stories were illustrated with examples from students' modeling sessions, drawing upon our episode descriptions and our analytic narratives as well as upon transcript data.

To help exemplify our analysis procedure, we have provided in Appendix B a short sample transcript episode and examples of how it was analyzed.

Results

The results of our analysis for each of the 8 pairs of students are summarized in Table 2 and discussed below. Because of length considerations, we expand on the results for only 3 of the pairs.

In this discussion, we elaborate on the results from Cory and Dan, Cathy and Connie, and Nicole and Mark, because we feel these cases give the reader a flavor for the range of Cognitive Strategies for Modeling in which the pairs of students engaged. Cory and Dan's strategies represent moderate quality; Cathy and Connie's represent high quality; and Nicole and Mark's represent a mix of high and low quality.

Cory and Dan

Cory and Dan were two students of average ability and achievement. They created a model of the impact of urban runoff containing human and animal waste on stream quality. Their analysis of the scenario was accompanied by some rather shallow causal reasoning, though they did carefully reason about several key causal relationships in their model. They used the Factor Map as a catalyst for synthesizing their model. They tested their model only a few times, and they wrote no explanations and articulated only a few oral ones while they created their model.

Cory and Dan (Scenario: D)
- analyzing: Analysis accompanied by shallow causal reasoning
- relational reasoning: Carefully analyzed and reasoned about certain key causal relationships
- synthesizing: Used the Factor Map as a catalyst for synthesis and analysis
- testing/debugging: A few instances
- explaining: No written and few oral causal explanations during data collection

Phil and Gary (Scenario: A)
- analyzing: No verbal analysis or discussion
- relational reasoning: Some
- synthesizing: Model never completely synthesized
- testing/debugging: Only one (limited) instance of testing
- explaining: Mostly reiterative explanations
- comment: Solo cognitive modeling strategies

Cathy and Connie (Scenario: B)
- analyzing: Carefully analyzed factors and relationships for realism
- relational reasoning: Reasoned about how factors should be related to each other
- synthesizing: Commented on or discussed the big-picture synthesis of their model
- testing/debugging: Engaged in mindful testing and questioning
- explaining: Articulated explanations to justify decisions and to help each other understand what they were observing

Nicole and Mark (own scenario: show how weather affects the stream depth and temperature)
- analyzing: Analyzed scenarios in depth
- relational reasoning: Discussed every causal relationship while creating it
- synthesizing: Evaluated proposed factors and relationships against the goal
- testing/debugging: None
- explaining: Supported arguments with causal and correlational explanations

Rachel and Sam (Scenario: A)
- analyzing: Engaged in analysis while referring to a science content source
- relational reasoning: Engaged in correlational and causal reasoning
- synthesizing: Referred to the Factor Map to provide a synthetic overview of the model
- testing/debugging: Engaged in unsatisfactory (to them) testing
- explaining: Formulated non-causal and causal explanations

George and Carl (Scenario: A)
- analyzing: Analysis focused on modeling techniques rather than content
- relational reasoning: Confusion about causality resolved by software capability
- synthesizing: Just-in-time completion and final testing for verification
- testing/debugging: Reckless relationship-building early on necessitated extensive testing
- explaining: Some explanations
- comment: Early mutual work, later solo work

Denise and Mary (Scenario: A)
- analyzing: Received help on scenario analysis
- relational reasoning: Received help on causal reasoning
- synthesizing: Tested the model as a synthesized whole
- testing/debugging: Tested the model as a synthesized whole
- explaining: Made both causal and non-causal explanations

Nancy and Andrea (Scenario: A)
- analyzing: Minimal scenario analysis as they created their concept map
- relational reasoning: Failed to causally analyze relationships
- synthesizing: Synthesized a concept map, not a model
- testing/debugging: None
- explaining: Made reiterative and factual rather than causal explanations

Table 2. Results of the analysis of Cognitive Strategies for Modeling, by pair. (The pairs treated in detail in this paper are Cory and Dan, Cathy and Connie, and Nicole and Mark.)

Shallow causal reasoning

Cory and Dan engaged in several rounds of analysis before coming to a decision about which scenario to model, during which they considered, in rapid succession, the chemical test, the macroinvertebrate/water quality, and the food chain scenarios. They settled on modeling the effect of rainfall runoff on a stream ecosystem. They sketched out a rough model that included components from several scenarios but drew primarily from the rainfall runoff scenario. However, their first day's conversation did not include any indication that they were considering causality at all; not until the very end of the first modeling session, when they started testing their model, did they begin to talk about how or why rainfall causes changes in runoff.

Factor Map used as a tool for synthesis and analysis

They asked a classroom helper why their model wasn't working; in fact, they had simply forgotten how to open a meter to test it. Once they tested their model, they opened the Factor Map and generated numerous ideas for extending their model, such as: gravity affects rainfall, salt affects total solids, dog and geese feces affect fecal coliform in the runoff, and so on. Viewing the Factor Map apparently helped them think of many ideas for their model, ideas they subsequently discussed and some of which they implemented.

Carefully reasoned about key relationships

The relationships between runoff, parks, animal feces, fecal coliform, and water quality formed the core of their model, and there were several occasions in which they explored those relationships in depth. For example, in one episode midway through their modeling sessions, they tested the model they had built so far and found that rainfall always equaled runoff; Dan argued that runoff should be less, “cause if it rains not all of that will go into [the stream].” They subsequently changed the relationship to more closely reflect Dan's understanding of it. In the brainstorming session mentioned above, they articulated several causal explanations, including: road salt enters the stream through runoff, making the water quality go down; and geese and dog feces are washed into the stream by runoff, making water quality decrease.

In another episode, Cory tried to create a relationship between animals and fecal coliform. Dan argued that “it's not just ‘animals.' It doesn't matter how many animals there are. ... It's about the animal waste.” They proceeded to create an “animal waste” factor and a relationship between “animal waste” and “fecal coliform.” Although they were not engaged in a deep level of causal reasoning, it was still evident that they were carefully analyzing their factors and attempting to link them in logically causal ways.

A few instances of testing

There were only a few instances of testing, one near the end of the first session, in which they forgot about using meters, and one near the beginning of the second session in which they asked for help and then went on to the Factor Map to brainstorm and create more relationships. Most of their testing sessions seemed satisfactory to them, or helped them analyze what else they needed to do to their model. However, their testing was inadequate because it did not reveal several conceptual flaws in their final model.

No written, few oral causal explanations

They did not type in any explanations during either of their modeling sessions. Neither did they express many oral explanations about the relationships they were modeling, particularly on the first day. They created numerous relationships without articulating any explanations about how or why the relationships worked the way they did. It wasn't until the second day that they made any oral explanations, and even then there weren't very many.

In summary, Cory and Dan represent a moderate quality of modeling strategies. They engaged in quite a bit of scenario analysis, but considerations of causality and underlying explanations were not very evident. The Factor Map seemed to help them view their model as a whole and to energize their analysis. Their testing seemed satisfactory to them, but they actually only tested superficially. Most importantly, however, they spent time reasoning about several key relationships, and took care to express them in reasonable and logical ways in their model.

Cathy and Connie

Cathy and Connie created a model of cultural eutrophication and algae blooms. Cathy was a high achiever, and Connie was average. Together, they carefully analyzed factors and relationships for realism; they reasoned about how factors were related to each other; they engaged in mindful testing and questioning; they commented on and discussed the big-picture synthesis of their model; and they articulated oral and written explanations to help each other understand what they were observing.

Carefully analyzed factors and relationships for realism

Cathy and Connie repeatedly discussed their scenario in order to select appropriate factors and relationships, all the while considering whether they were being realistic. They talked about whether “phosphates” and “nitrates” should be factors or objects; whether they should include either or both “bacteria” and “algae” as population objects; and how to define a “fertilizer” factor in a realistic fashion (Cathy: “there's no real measure for fertilizer runoff, so I guess I'll just leave it as 0 to 100 [the default] for now–100 gallons of fertilizer–that would be spectacular”), among other things.

In later episodes, they often expressed concern for realism. For example, they had some trouble getting their algae population to work the way they thought it should. In order to make the algae rate of growth high enough to make the population of algae grow, they had to raise rainfall (which increased fertilizer runoff into the stream) to what they felt was an unrealistically high level. Again, near the end of their modeling sessions, they discovered that when their model showed a level of 0 dissolved oxygen in the stream, there were still organisms living; Cathy observed that this was “totally unrealistic.” The idea that their model should be realistic was apparently never far from their thoughts.

Careful reasoning about relationships between factors

In many episodes, Cathy and Connie discussed how certain factors were related to each other, as they created relationships between them. For example, in the process of creating relationships between dissolved oxygen and organisms in the stream, Cathy realized she didn't understand how dissolved oxygen might affect different organisms differently, so she discussed it with Connie and with her teacher until she was satisfied that she understood. In a later episode, they discussed relationships between algae and bacteria, but Connie was confused about how to connect them. Cathy carefully explained how algae should affect bacteria: “No, but see the algae count makes bacteria grow faster. See, that's the thing–rate of growth is always going to affect bacteria count. How fast they are growing is what makes the population higher.” Her explanation showed that she understood not only the relationship between algae and bacteria (larger quantities of algae will lead to increased rates of growth of bacteria), but also how to model the relationship in the computer.
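
Cathy's distinction, wiring algae to the bacteria's rate of growth rather than to the bacteria count itself, produces visibly different model behavior. A minimal sketch (our illustration; the coefficient and values are arbitrary):

```python
# Two ways to wire algae to bacteria (illustrative; values arbitrary).

def wire_to_count(algae, steps=10):
    """Algae set the bacteria count directly (the confused version)."""
    bacteria = 5.0
    for _ in range(steps):
        bacteria = algae          # the count just mirrors algae
    return bacteria

def wire_to_rate(algae, steps=10):
    """Algae set the rate of growth, which accumulates into the count
    (Cathy's version: "how fast they are growing is what makes the
    population higher")."""
    bacteria = 5.0
    for _ in range(steps):
        bacteria += 0.1 * algae   # more algae -> faster growth
    return bacteria

print(wire_to_count(30.0))  # 30.0: a static copy of the algae level
print(wire_to_rate(30.0))   # 35.0 and climbing: a growing population
```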

Discussing the big-picture synthesis of their model

Each of the students made comments indicative of a “big-picture” view of their model. For example, early in their modeling session, when they were working on their concept map, Connie mused out loud about what their “main” factor might be, but Cathy said they didn't have a main factor, and proceeded to read from the map: “fertilizer runoff affects total phosphates and nitrates which increase algae and plants which decrease DO [dissolved oxygen] which kill everything.” [Note that some of her later comments reveal that she knew it was not dissolved oxygen that kills living creatures, but the lack of it.] Connie understood what Cathy said, because later she commented, “We don't really have a web, we have a chain.”

Mindful testing and questioning

When Cathy and Connie encountered situations in which their model didn't behave as they expected it to behave, they treated these situations as problems to be solved. Early on they encountered a problem in the level of nitrates in their model–something was causing it to drop to nearly zero. They ran numerous tests, modified the model several times, and asked for assistance from a classroom helper before finally discovering the source of the problem and fixing it. In another situation their algae population wasn't growing the way they thought it should grow. Cathy suggested that they remove a certain relationship and replace it with another. That didn't work, and they hypothesized that they needed to modify one relationship to make it somewhat stronger. They tried this solution, tested their model, and were finally satisfied with its behavior. At no time did they encounter unexpected or perplexing model behavior without trying to understand, explain, and correct it if necessary.

Articulating helpful explanations

Cathy and Connie justified and explained their decisions, plans, and observations to each other on numerous occasions. For example, to explain the decision to set the initial value of their nitrates factor at 0.1, Cathy explained, “It's where unpolluted streams are usually at. You see, 0.2 is going to be quite a bit more ... it's only found in really small levels.” Another time, Connie argued for an “increases more and more” relationship between rainfall and runoff, explaining, “If there's more rain, the rain would build up and then wash it all away...it would only do a little bit at first.” They both consistently attempted to help each other understand what they were thinking.
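
Connie's “increases more and more” choice corresponds to a convex rather than linear relationship curve. A small sketch of the difference (our curve shapes, not Model-It's actual preset definitions):

```python
# Linear ("increases about the same") versus convex ("increases more
# and more") relationship shapes, both mapping 0-100 to 0-100.
# (Illustrative curves; not Model-It's actual preset definitions.)

def about_the_same(rainfall):
    return rainfall                          # each extra unit adds the same

def more_and_more(rainfall):
    return (rainfall / 100.0) ** 2 * 100.0   # little at first, then a lot

for r in (10, 50, 90):
    print(r, round(about_the_same(r), 1), round(more_and_more(r), 1))
# At rainfall=10 the convex curve yields only 1.0 unit of runoff
# ("it would only do a little bit at first"), but 81.0 at rainfall=90.
```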

In summary, Cathy and Connie engaged in high quality modeling strategies. Their analysis was thorough, their reasoning was causal, their synthesis was driven by a mental picture of the model as a whole, their explanations were supportive and valuable, and they tested and debugged their model until it worked to their satisfaction.

Nicole and Mark

Nicole and Mark chose to design a model on their own to show how weather factors can affect the depth and temperature of a stream. Both students were of average achievement and ability. They substantially analyzed their scenario; they discussed (and sometimes argued about) causal relationships as they created them, supporting their arguments with both causal and correlational explanations; and they evaluated proposed factors and relationships against their overall goal.

Analyzed scenarios in depth

Nicole and Mark began their modeling session by analyzing several of the possible scenarios in great depth. In turn, they considered the water quality test scenario, the macroinvertebrate indicator of water quality scenario, and the fertilizer runoff scenario before finally deciding to do a weather model of their own. They discussed each scenario as if they were really making a model of it. For example, when they talked about water quality tests, they discussed several possible relationships between riffles, dissolved oxygen, and biochemical oxygen demand; when they considered the macroinvertebrate scenario, they looked up the pollution tolerance indices for various taxa. Even after finally settling on a model of how weather affects a stream, they generated, considered, and discarded many, many factors, including dissolved oxygen, phosphates, nitrates, fecal coliform, and even “rubbish.”

Discussing causal relationships

Every time they created a relationship, Nicole and Mark discussed it. For example, they had a discussion about how cloud cover is related to air temperature: Nicole claimed there was a causal relationship (“If cloud cover increases, there will be less sunlight”) but Mark countered with an exception (“Sometimes cloud cover keeps the hot air in”). Similar discussions occurred as they considered relationships between “cloud cover” and “rainfall rate” (Mark: “sometimes when the sky is cloudy, it doesn't rain, it doesn't always rain”), and between “stream temperature” and “water quality.” Sometimes their discussions were more like arguments, but they engaged in dialogue about every relationship they put into their model, and many more.

Evaluating factors and relationships against a goal

Because Nicole and Mark chose to create a model of their own that was not already described in a scenario, they found it necessary to constantly make sure they were making progress toward a final model. During their work together, Mark tended to suggest factors and relationships that weren't directly related to weather; Nicole often had to remind him that they were doing a weather model and persuade him that his idea didn't fit into the model they were constructing. For example, in an early episode, Mark suggested that they include a “trash” factor, and even went so far as to actually create a factor for it. However, eventually they decided to discard it from their model, because, as Nicole put it, “What does it [trash] have to do with the weather, Mark?” Nicole asked a similar question when Mark wanted to create a “ducks” factor; and when he wanted to include acid rain, Nicole said, “I don't think we should use acid rain because it doesn't have to do with anything we are doing. ... We are trying to see how weather affects temperature and the depth of the stream.” Nicole's comments helped keep them on track and ensure a coherently synthesized model.

Arguments supported with both causal and correlational explanations

Nicole and Mark supported their (numerous) arguments with a mixture of causal and correlational explanations. For example, Nicole argued (correlationally) that increased cloud cover would decrease air temperature because there was less sunlight. A similar correlational explanation was presented for the relationship between cloud cover and rainfall rate (more clouds make more rain). Mark explained somewhat correlationally later on that lower temperatures were good for water quality because macroinvertebrates prefer cooler temperatures; he explained causally that deeper streams have lower temperatures because “on deep stuff, the sunlight is hitting the top of the water and it doesn't always make it down to the bottom.” This explanation translated into the following in the explanation box: “The surface is warmer because the sun can reach it better, and if the stream is deeper, the bottom will be cooler.” Thus, their explanations were a mixture of both causal and correlational statements.

In summary, Nicole and Mark engaged in some high quality Cognitive Strategies for Modeling and in some of lower quality. They analyzed their scenario in depth, both in terms of relevant factors and in terms of how those factors were related. They remained focused on their final goal and produced a coherently synthesized model. However, though they articulated a substantial number of explanations, they did not distinguish between causal explanations and correlational explanations. Also, they did not engage in any model testing or debugging at all.

Summary of Findings

The findings from all 8 pairs of students can be summarized as follows. Most pairs engaged in an analysis of appropriate objects and factors for their model. Some pairs' analyzing strategies were limited to identifying and creating factors, whereas others' were richer, more substantial, and more mindful. Most engaged in relational reasoning about their factors, though again a range was evident: some discussed every relationship in depth, some concentrated on only the most important key relationships, and a few were either unaware of or confused about the difference between causal and correlational relationships. Most were able to synthesize a working model, employing a range of strategies: they used the Factor Map as a tool to aid their visualization, they focused on their goal, or they found ways to talk about their model's appearance or form. Similarly, most attempted to articulate explanations for their relationships, but sometimes explanations were shallow or even non-existent. Most tested their model, though some tested much more substantially and thoroughly than others. Only a few persisted in their debugging to fine-tune their model's behavior to match their expectations.

Discussion and Implications

These findings indicate that creating dynamic models has great potential for use in classrooms to engage students in thinking about the science content they are supposed to learn, particularly in those thinking strategies best fostered by dynamic modeling: analyzing, relational reasoning, synthesizing, testing/debugging, and explaining. This work builds upon other related work (Jackson, et al., 1996; Mandinach, 1989; Miller, et al., 1993; Roberts, 1981), expanding the base of dynamic modeling research at the secondary level (Stratford, 1997), providing a closer look at the thinking strategies employed by students as they create dynamic models, and informing software design and classroom instruction.

The STELLA research (Mandinach & Cline, 1994) explored how students may benefit from systems thinking and from creating models of complex systems, but not the cognitive strategies in which students engage. Miller and colleagues (1993) looked at just one type of cognitive strategy, reporting that students engaged in sophisticated causal reasoning as they created models with IQON. Our research extends this prior research in dynamic modeling by investigating and reporting on a much wider range of cognitive strategies. In this study, it was found that most students engaged in some or all of the Cognitive Strategies for Modeling at some time during their modeling sessions. Most engaged in analyzing their scenario, and were able to select and create appropriate factors. Most engaged in relational reasoning as they created relationships between factors. Most were able to synthesize a working model. Most tested their model, some to a lesser, others to a greater degree. Most groups engaged in explaining their relationships, though the depth of their explanations was sometimes rather shallow and sometimes more correlational than causal.

Thus, students do engage in Cognitive Strategies for Modeling when they create dynamic models. This supports our idea that dynamic modeling can be a performance of understanding (Perkins, 1992) for students in science classrooms, engaging them in analyzing, reasoning about, and synthesizing the content they are learning. Engaging in such activities allowed students to “go beyond the information given” (Bruner, 1973) in the scenarios, building upon what they knew to produce a synthesis of that knowledge in the form of a model. These results also show that, using Model-It, students engaged in the same general kinds of cognitive activities that systems thinking and system dynamics modeling are claimed to support (Forrester, 1968; High Performance Systems, 1992), though not in as rigorous a form.

Model-It's design provides strong support for dynamic modeling, structuring the task into easily understood subtasks. However, additional support is necessary if all students are to progress beyond making somewhat superficial relationship connections (often based on correlations) toward creating and articulating causal explanations for every relationship. In addition, since testing is critical for verifying a model's behavior, Model-It needs to support not only the act of testing but also convey its necessity. These revisions are being designed and implemented: the most recent version of Model-It (now renamed “Theory Builder” and currently being field-tested) includes, for example, scaffolds to support more in-depth analysis and to support testing and debugging.

In conclusion, this research demonstrates that creating dynamic models is a classroom activity that fosters students' engagement in higher-level thinking performances such as analyzing, reasoning, synthesizing, testing and debugging, and explaining. Constructing dynamic models provides opportunities for students to think about, use, and reflect upon the science content knowledge gained during classroom instruction and investigations.

ACKNOWLEDGMENTS

This research has been supported by the National Science Foundation (RED 9353481) and the Rackham School of Graduate Studies at the University of Michigan, Ann Arbor.

The authors may be contacted as follows: Stratford ([email protected], or 703 Western Meadows Dr., Watertown, WI 53098); Krajcik ([email protected], or School of Education Building, University of Michigan, 610 E. University, Ann Arbor, MI 48109); Soloway ([email protected], Advanced Technologies Laboratory, University of Michigan, 1101 Beal Ave., Ann Arbor, MI 48109-2110).

References

Bruner, J. S. (1973). Beyond the information given. New York: W.W. Norton.

Forrester, J. W. (1968). Principles of systems. Cambridge, MA: Wright-Allen Press.

Gardner, H. (1991). The unschooled mind: how children think and how schools should teach. New York: Basic Books.

Gilbert, S. W. (1991). Model building and a definition of science. Journal of Research in Science Teaching, 28(1), 73-79.

Huebel-Drake, M., Finkel, E., Stern, E., and Mouradian, M. (1995). Planning a course for success. The Science Teacher, 62(7), 18-21.

Jackson, S., Stratford, S. J., Krajcik, J. & Soloway, E. (1996). Making dynamic modeling accessible to pre-college science students. Interactive Learning Environments, 4(3), 233-257.

Mandinach, E. B. & Cline, H. F. (1994). Classroom dynamics: implementing a technology-based learning environment. Hillsdale, NJ: Erlbaum.

Perkins, D. (1992). Smart schools: from training memories to educating minds. New York: The Free Press.

Roberts, N. & Barclay, T. (1988). Teaching model building to high school students: theory and reality. Journal of Computers in Mathematics and Science Teaching, 8(4), 13-24.

Roberts, N., Andersen, D. F., Deal, R. M., Garet, M. S., & Shaffer, W. A. (1983). Introduction to computer simulation: a system dynamics modeling approach. Portland, Oregon: Productivity Press.

Schecker, H. (1993). Learning physics by making models. Physics Education, 28, 102-106.

Silvert, W. (1993). Object-oriented ecosystem modelling. Ecological Modelling, 68, 91-118.

Soloway, E. & Pryor, A. (1996). The next generation in human-computer interaction. Communications of the ACM, 39(4), 16-18.

Stratford, S. J. (1996a). Opportunities for thinking: constructing models in precollege science classrooms. Paper presented at the Annual Meeting of the American Educational Research Association. New York, New York, April, 1996.

Stratford, S. J. (1996b). Investigating processes and products of secondary science students using dynamic modeling software. Unpublished doctoral dissertation, University of Michigan, Ann Arbor.

Stratford, S. J. (1997). A review of computer-based model research in precollege science classrooms. Journal of Computers in Mathematics and Science Teaching, 16(1).

Stratford, S. J., Krajcik, J. & Soloway, E. (1997). Technological artifacts created by secondary science students: examining structure, content, and behavior of dynamic models. Paper presented at the annual meeting of the National Association of Research in Science Teaching, Chicago, IL, March, 1997.

Stratford, S. J., Krajcik, J. & Soloway, E. (1996). Cognitive strategies in dynamic modeling: case studies of opportunities taken and missed. In D. Edelson & E. Domeshek (Eds.), Proceedings of the International Conference on the Learning Sciences. Evanston, IL, July, 1996.

APPENDIX A

Description of Model-It Software

Model-It is based upon an “ecological modeling” simulation engine (Silvert, 1993) in which variables are paired using functional mathematical relationships; the simulation engine averages the effects of multiple functions to derive resultant output values. Using Model-It, the user creates objects with which he or she associates measurable, variable quantities called factors and then defines relationships between those factors to show the effects of one factor upon another. Relationships can model immediate effects or effects over time. Model-It provides facilities for testing a model and a “Factor Map” for visualizing it as a whole.
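
To make the averaging idea concrete, the following sketch (ours, not Model-It's actual code; all names are illustrative) shows one way such an engine might compute a single time step, in Python:

    # A minimal sketch of an averaging simulation step. Each relationship
    # is assumed to be a function from the causal factor's current value
    # to a suggested value for the affected factor; the engine averages
    # all suggestions to derive the resultant output value.
    def step(factor_values, relationships):
        # relationships: affected factor -> list of (causal factor, function)
        new_values = dict(factor_values)
        for affected, incoming in relationships.items():
            suggestions = [f(factor_values[causal]) for causal, f in incoming]
            new_values[affected] = sum(suggestions) / len(suggestions)
        return new_values

    # Example: depth is suggested to be low when suspended solids are high.
    values = {"solids": 20.0, "depth": 80.0}
    rels = {"depth": [("solids", lambda s: 100.0 - s)]}
    print(step(values, rels))  # depth -> 80.0 (the single suggestion)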

Creating Objects

Typically, objects are chosen to correspond with observable features of the system being studied: trees, fish, weather, people, water, golf courses, and so on. Model-It allows the user to associate a graphic, icon, or photograph with each “object,” so that each becomes visually associated with what it actually represents. Figure 1 shows a Model-It screen that represents the objects in a sample lake model: a picture of a lake, cattails representing plants, a fish graphic representing fish, sun/clouds/rain representing weather, and a faucet (somewhat whimsically) representing water runoff into the lake. It is also possible to specify whether an object is an “environment,” “individual,” or “population” object [1]; in the example model, plants and runoff might be singular, “individual” objects, whereas fish might be created as a “population,” giving it special preprogrammed behaviors and relationships [2]. Figure 2 shows how the fish population object is created with Model-It's “Object Factory.” Note that Model-It does not place any constraints upon the selection of objects or choice of domain; it is content-free, entirely directed by the preferences and choices of whoever is creating the model. The Model-It program is distributed with several sample environment pictures and a selection of graphics that can be used to create objects, particularly in the domain of stream ecosystems.

Creating Factors

Next, the user selects and creates “factors,” each one associated with a specific object. Factors are usually measurable: the temperature of the stream, the speed of the wind, the number of people, the size of the golf course. Factors can also be “calculable” (that is, mathematical constructs), such as the water quality of a stream or the rate of growth of a population. Distinguishing relevant from irrelevant factors is an important part of analyzing the problem. An object that seems to have only one factor at one level of analysis may in fact be decomposable into more factors at a deeper level of analysis. Figure 3 shows how one factor, pond depth, would be created with the “Factor Factory.” If the `minimum' and `maximum' values are unknown, Model-It supplies a generic, default range of 0 to 100. A deeper analysis of the pond depth factor itself might engage the modeler in a consideration of what values might be more realistic, of what additional information might be needed to determine those values, or of how values might differ between locales. The `initial value' of a factor may be an actual value, or may simply represent the modeler's choice of how high or low that factor should start when the model is run. The `units' and `description' fields let the user record relevant information in the model, but that information is not used by the simulation engine.
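
A factor, then, amounts to a small record of name, range, starting value, and annotations. As a hypothetical illustration (field names are ours, mirroring the Factor Factory's entry fields):

    from dataclasses import dataclass

    @dataclass
    class Factor:
        name: str
        minimum: float = 0.0     # generic default range of 0 to 100
        maximum: float = 100.0   # when actual values are unknown
        initial_value: float = 50.0
        units: str = ""          # informational only; ignored by the engine
        description: str = ""    # informational only; ignored by the engine

    pond_depth = Factor("pond depth", initial_value=75.0, units="feet")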

As an aside, factors intended to be used as rates (discussed shortly) should be created with relatively small ranges in comparison to the possible values of the variables they are likely to affect. If we assume for a moment that “suspended solids” in the pond might be defined with a range of, say, zero to five hundred pounds of solids, then the range of the factor representing the average rate at which solids enter the pond from the runoff should be quite a bit smaller, say, zero to three.
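
The arithmetic behind this advice (using the hypothetical ranges above) is straightforward:

    solids_max = 500.0   # upper bound of the affected factor
    big_rate   = 250.0   # a rate half the size of the affected range
    small_rate = 3.0     # a suitably small rate
    print(solids_max / big_rate)    # 2.0: maximum reached in two time steps
    print(solids_max / small_rate)  # ~166.7: a much more gradual change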

Defining Relationships–Immediate

After selecting and defining at least two factors of one or more objects, any two factors may be associated by creating “relationships,” defined by selecting from among two types and several variations [3]. One type is the “immediate” relationship, which works as follows: changes in the value of the causal factor are immediately reflected in the value of the affected factor, regardless of what happened in previous time steps [4]. Immediate relationships may be defined with one of two orientations (“increases” or “decreases”) and a selection of variations (e.g., “a little,” “a lot,” “about the same,” “more and more,” or “less and less”). These orientations and variations are selected with simple pull-down menus. By default, Model-It creates all immediate relationships as “increases about the same.” Figure 4 shows how to define an immediate relationship between the pond's suspended solids and its depth, using the “Relationship Maker.” An “immediate decreases about the same” relationship has been selected, following the reasoning that suspended solids eventually settle out and decrease the depth of the pond. Note that the Relationship Maker screen provides a simultaneous graphical representation of the variation selected by the user as a scaffold for making the selection. Figure 5 shows how an immediate relationship can be defined between the fish population's count and the quantity of plants in the pond, using a table of values, following the reasoning (notice the explanation in the lower left) that if there are few fish living in the pond, plants will probably maintain a stable level, but too many plant-eating fish may lead to a decrease in the quantity of plants.
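
As a rough illustration of how such menu choices might translate into functions (the curve shapes below are our guesses, not Model-It's actual definitions), consider factors normalized to a 0-1 range:

    import math

    # Hypothetical mappings from menu variations to curve shapes.
    VARIATIONS = {
        "about the same": lambda x: x,                  # linear
        "a little":       lambda x: 0.5 * x,            # shallow slope
        "a lot":          lambda x: min(1.0, 2.0 * x),  # steep slope, clamped
        "more and more":  lambda x: x ** 2,             # accelerating
        "less and less":  lambda x: math.sqrt(x),       # decelerating
    }

    def immediate_effect(causal, orientation="increases",
                         variation="about the same"):
        # "decreases" simply inverts the curve over the normalized range.
        y = VARIATIONS[variation](causal)
        return y if orientation == "increases" else 1.0 - y

    print(immediate_effect(0.25, "decreases", "about the same"))  # 0.75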

Defining Relationships–Rate

The other type of relationship is the “rate” relationship: at each time step, the value of one factor is added onto [or subtracted from] another factor's value. Figure 6 shows how to define a rate relationship between the average rate of runoff and the amount of suspended solids in the pond. The words on the left-hand side of the window verbalize the relationship, saying that at each time step, the value of the average rate of runoff will be added to the amount of solids in the pond. This is why (as mentioned earlier) the runoff rate's range needs to be much smaller than the total solids' range: if it were relatively large (say, one-fourth or one-half of the total solids' value), the total solids factor could reach its maximum value within a few time steps, a situation that might be realistic in catastrophes, but not particularly useful conceptually under more equilibrated conditions. When the user attempts to create a rate relationship between two such “mismatched” factors, Model-It produces a warning about the possible problem.
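
In pseudocode terms, a rate relationship is an accumulation clamped to the affected factor's range. A minimal sketch (the names and the clamping behavior are our assumptions):

    def run_rate(value, rate, maximum, steps):
        # At each time step, the rate factor's value is added onto the
        # affected factor's value, up to the affected factor's maximum.
        history = [value]
        for _ in range(steps):
            value = min(maximum, value + rate)
            history.append(value)
        return history

    print(run_rate(0.0, 3.0, 500.0, 5))    # [0.0, 3.0, 6.0, 9.0, 12.0, 15.0]
    print(run_rate(0.0, 250.0, 500.0, 5))  # saturates at 500.0 by step 2

The second call shows the mismatch problem: a rate half the size of the affected factor's range drives the factor to its maximum almost immediately.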

Another potential problem exists when the user tries to create an immediate relationship to a factor that is already affected by a rate relationship (and vice versa). The simulation engine is not able to process a factor affected by both kinds of relationships.

Relationships, conceptually speaking, may be causal or correlational. For example, a relationship between pond depth and fish rate of decay might be more correlational than causal, because the depth of the pond itself won't cause changes in the population, but might be correlated with those changes. Model-It, however, does not require the user to make the difference between causal and correlational relationships explicit, although the presence of a field in which “explanations” may be typed encourages the user to reflect upon how or why one factor affects another. Causal explanations should describe the mechanism behind a relationship; consider, for example, the explanation for the relationship between suspended solids in the pond and pond depth in Figure 4. It says “Suspended solids eventually settle to the bottom of the pond, reducing the pond depth.” The key is that more suspended solids eventually cause reduced pond depths by the mechanism of settling. On the other hand, an explanation such as “Fish die if the pond gets too shallow” doesn't articulate the mechanism relating fish populations and pond depth (fish deaths may be caused by reduced oxygen levels or by raised temperatures resulting from shallowness, but not by shallowness itself). So, although a relationship between fish rate of decay and pond depth may be observable and modelable, at best the relationship reflects a correlation, not a cause. What is important is that the user understand the difference.

Testing the Model

In Model-It, the user tests his or her model interactively using several graphical tools. One tool, called a “meter,” presents a continuously updated display of a factor's value at the current time step. Multiple meters may be displayed while testing. Meters have a special property: if a factor has no other factors affecting it (i.e., it is an independent variable), its meter is a “slider” whose value can be adjusted while the model runs. This allows the user to test his or her model while it is running, setting values for independent factors to see what happens, and perhaps generating additional questions or hypotheses for further investigation.
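
A slider, in effect, lets the user re-run the model under different fixed settings of an independent factor. The spirit of such a test can be sketched as follows (illustrative names and numbers only):

    def final_solids(runoff_rate, steps=20, maximum=500.0):
        # Accumulate suspended solids via the rate relationship.
        solids = 0.0
        for _ in range(steps):
            solids = min(maximum, solids + runoff_rate)
        return solids

    for runoff in (0.0, 1.5, 3.0):  # three slider positions
        print(f"runoff={runoff}: solids after 20 steps = {final_solids(runoff)}")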

A second tool, called a “Simulation Graph,” presents a line graph display of the changing values of up to five factors over many time steps. Figures 7 and 8 show two tests conducted by Chris, our modeler, using meters and graph windows. The first test (Figure 7) shows what happened when the model ran with the runoff factor set to zero: suspended solids stayed low, pond depth stayed high, the fish population grew, and plants declined and eventually stabilized at a low level. In the second test (Figure 8), the model ran with runoff initially set at a value of three. Notice that the amount of suspended solids gradually increased with each time step (because of the rate relationship between runoff rate and suspended solids), which in turn eventually caused the depth of the pond to decrease. Because the pond was becoming shallower, the fish eventually died off, allowing the amount of plants to increase.

The Factor Map Overview

Finally, Model-It provides an overview of the entire model, called the “Factor Map.” Every factor of every object in the model appears in the Factor Map with its name and a small icon-sized graphic of its associated object. Figure 9 shows a Factor Map. Relationships are represented by arrows between factors; immediate relationships are solid black arrows, and rate relationships are gray. Factors can be moved anywhere on the screen and adjusted so they are close to the factors they affect, so relationship lines don't cross, or so they form an aesthetically pleasing arrangement. The Factor Map doesn't provide visual cues for relationships beyond the arrows between related factors (e.g., whether a relationship was defined with “increases” or “decreases”); however, double-clicking directly on a relationship arrow opens the Relationship Maker window, where its definition can be viewed or modified. The Factor Map's visual display allows the user to view his or her model as a whole; it assists him or her, then, in planning the model and synthesizing objects, factors, and relationships into a conceptual unit. [5]


Footnotes

[1] The only difference between “environment” and “individual” objects is that the “environment” object's graphic occupies the background in the Simulation Window, whereas “individual” objects' graphics overlay the background. Typically a model has only one environment object, but it is not required to have any at all.

[2] Specifically, Model-It creates three factors and two relationships with each population. One factor is called “count” and keeps track of how many individuals are in the population; one is called “rate of growth” and the third is called “rate of decay.” The two “rate” factors are each related to “count” with special rate relationships so that if “rate of growth” is larger than “rate of decay,” the population count will grow, and vice versa. Thus the population may be affected by creating relationships that manipulate the rates of “growth” and “decay.”
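
In other words, each population's count is updated once per time step by the two built-in rate relationships, which we can restate arithmetically (our restatement, with illustrative numbers):

    # count <- count + rate_of_growth - rate_of_decay, once per time step
    count, growth, decay = 50.0, 2.0, 1.0
    for _ in range(10):
        count = count + growth - decay
    print(count)  # 60.0: a net gain of one individual per step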

[3] There are constraints upon the kind and nature of relationships that may be created between factors, because of the way relationships are internally represented mathematically. For example, multiplicative relationships, in which factors are multiplied together (as in, for example, a calculation of bank interest), are not possible in Model-It.

[4] When a model is run, a time counter ticks off arbitrarily sized “time steps.” Each time step may represent a minute, hour, day, or whatever interval is conceptually sensible to the user. “Immediate” relationships operate essentially independently of time steps. “Rate” relationships, on the other hand, are driven by the time counter, so that the rate value is added to the affected factor once every time step.

[5] The version of Model-It used by students in this study (version 1.0b12) had a partially functional Factor Map: although factors could be moved around and relationships examined, the spatial arrangement was not saved with the model. Thus, factors had to be rearranged each time a model was opened.


 

Figure 1. The Simulation Window, showing weather, rainfall, fish, and plant objects superimposed on a lake object.

Figure 2. Creating a Fish population object with the Object Factory.

Figure 3. Creating a “pond depth” factor with the Factor Factory.

Figure 4. Creating an immediate relationship between “pond suspended solids” and “pond depth,” using the Text View.

Figure 5. Creating an immediate relationship between “fish count” and “plants quantity,” using the Table View.

Figure 6. Creating a rate relationship between “water average rate of runoff” and “pond suspended solids.”

Figure 7. Running the pond model, showing Meters and Graph Windows (“weather rain” set to zero).

Figure 8. Running the pond model (“weather rain” set to three).

Figure 9. The Factor Map.


Appendix B: Sample Data Analysis

The sample data analysis follows the section of transcript below, which has been annotated from the video. The underlined passages are for later reference:

  1. {begin episode 10 creating rate affects depth, 51:20}
  2. M: OK, rainfall rate affects depth. Weather... stream ... [selects rainfall rate affects depth relationship]
  3. N: Rainfall rate affects stream depth. Is it rate?
  4. M: I don't know--Wait...why would it be, well [gets notifier message when they tried to choose rate] it could be both, but if it starts--as soon as the first...as soon as it starts to rain, it's going to change the stream depth. So how would that be a rate...never mind. I just don't understand how it could be a rate.
  5. N: Because the depth keeps increasing as it keeps raining.
  6. M: There's also an immediate. It affects it immediately, too.
  7. N: No immediate is where it never changes. ... Immediate is where it changes once and it never changes again.
  8. M: OK.
  9. N: And rate is when it changes once and keeps changing--that's what ...
  10. M: Fine, fine, fine. I'm looking for something else. Do that, and I'm gonna--cause I don't think we will have enough factors if we just do that.
  11. N: Well we have to think of the--you guys, stop it. [long pause, typing the explanation: “As the rainfall rate increases, the depth of the stream also increases.” They make it a rate relationship.]
  12. M: What?
  13. N: OK, what about a new relationship?
  14. {end episode 10}
  15. {begin episode 11 discussing factor map & looking up t rate of change, 53:05}
  16. M: New relationship,...wait, go to the factor map and see what we have so far. [goes to factor map] Wow. Oh, wow. OK You want to move that rainfall rate.
  17. N: Where?
  18. M: Just move it.
  19. N: Do you know what I'm going to do? I'm going to cool watch this
  20. M: But you see that still goes through rainfall rate. [moving stuff around on screen]
  21. N: Oh, I see
  22. M: Oh, that's cool. Air temperature affects cloud cover which affects stream temperature. [he read it wrong] Oh, look, that's like in the middle.
  23. N: Right. This affects both. [referring to cloud cover, which affects stream temp and air temp]
  24. M: OK. Now, so we're not going to do acid rain. [N: No.] So there's no use for it being there.
  25. N: Right. That's what I said. [deletes acid rain, confirms with notifier] I'm going to move it so it looks--so it looks like the arrow's not just going through it so it looks like the arrow is suppose to be like that.
  26. M: OK, um, T-rate of change? Temperature, well, yeah, we could do that. What affects the temperature rate of change?
  27. N: Well, the air temperature.
  28. M: The stream temperature, I guess.
  29. N: How did we do the rate of change last time? Why don't you look in here and find out how we did the t rate of change last time what affects the t rate of change.
  30. M: What packet would that be in?
  31. N: Oh, my god. I would have no idea. It's not going to be in that one.
  32. M: Yeah, you're right. Probably around 6 then. [looking in Guide packets] The T-rate of change affects...OK, air temperature affects T-rate of change.
  33. N: Air temperature affects...
  34. M: No, no, no.
  35. N: Oh I didn't pick that one. OK. [connects air temperature with t-rate of change to make a relationship]
  36. M: OK, wait, now we just have to find out why. OK, about the same, increases. [setting up air temp affects t rate in rel maker] OK, it means, OK, as the temperature gets higher, the water, OK. Oh, it's easy. As the water, or as the air temperature gets higher, the water changes temperature faster. Then if it gets lower, it changes. [typing explanation]
  37. N: Now wait, as the air temperature...wait, as the air temperature of the water changes...
  38. M: No.
  39. N: As the air temper...changes...
  40. M: Gets higher. [typing explanation: “as the air temperature gets higher, the water changes temperature faster”]
  41. N: Higher, the water changes...
  42. M: No, the water temperature changes, or the water changes temperature faster.[typing]
  43. N: Faster?
  44. M: Faster. And it's like as the air temperature gets lower, the water temperature, water changes temperature slower. The water changes the temperature slower. Slower, slowly. [types the rest: “as the air temp gets lower the water changes the temperature slower”]
  45. N: Slower. OK?
  46. M: Yeah. Enthralled.
  47. {end episode 11}

Listing: Transcript for Chapter II Sample Data Analysis

Episode Division: Divided into two episodes, at line 14 in the selection, where they switch from working out a relationship to looking at their Factor Map.

Episode 1 Descriptive Account: In this episode, they create and explain one relationship: rainfall rate affects stream depth.

Because Mark is now persuaded not to think about doing acid rain, he suggests the idea that rainfall rate affects stream depth. They immediately get stuck on the rate vs. immediate problem. Nicole argues for rate: “Because the depth keeps increasing as it keeps raining.” Mark argues for immediate: “as soon as it starts to rain, it's going to change the stream depth.” In the end Nicole's argument wins, as she demonstrates an understanding of rate vs. immediate that Mark can't counter: “Immediate is where it changes once and it never changes again....And rate is when it changes once and keeps changing.” Mark assents and they leave it at rate. One of them finishes typing the explanation.

Episode 2 Descriptive Account: In this episode, Nicole and Mark look at their factor map, then create a relationship from air temp to t rate of change.

First they go to the factor map to see what they have so far. After arranging a few factors, Mark notices the connection between air temp, cloud cover, and stream temp, but interprets it incorrectly. Nicole points out that cloud cover affects the others. Mark confirms that they're not doing acid rain; Nicole agrees, so they delete it. Mark notices t-rate of change and suggests that they create a relationship for it. He asks Nicole what affects it--she says air temp, he says stream temp. Mark looks up how they did it in the Guide, and finds that air temp affects stream t rate of change. Nicole connects the two and starts the Relationship Maker. Mark, having read the Guide, explains how it works: “As the water, or as the air temperature gets higher, the water changes temperature faster. Then if it gets lower, it changes.” So they make it `increases about the same' and type an explanation to that effect.

Episode 1 Event identification: constructing content--relationships

Episode 2 Event identification: constructing content--relationship; obtaining information--looking up conceptual and procedural

Episode 1 Analytical Narrative: Here they are confused about the rate vs. immediate concept. Both of them have reasonable ideas, but neither applies them to Model-It in sensible ways. Both attempt to justify their ideas and to make persuasive arguments; however, they don't go to the next level and try to find out which is right.

Episode 2 Analytical Narrative: Here they are generating another idea for their model. They realize that they lack information needed to connect t rate of change to the rest of their model, so they consult the Guide. Mark looks up more than just how to do it (increases about the same); he also looks up the explanation, and reads it to Nicole so they both know it and so she can type the explanation.

Episode 1 Cognitive Strategies for Modeling: Relational reasoning: Making cause/effect statements, creating relationships with Relationship Maker; explaining: How parts are related (causally/correlationally), justifying an argument.

Episode 2 Cognitive Strategies for Modeling: Relational reasoning: Creating relationships with Relationship Maker, discussing/selecting relationships, making cause/effect statements; synthesizing: discussing model's representation in Factor Map, deciding how model should work as a whole; explaining: how parts are related (causally/correlationally).
