Analyzing results: the tip of the iceberg
Author(s) -
Ebert-May Diane,
Weber Everett P.,
Hodder Janet,
Batzli Janet M.
Publication year - 2006
Publication title -
Frontiers in Ecology and the Environment
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.918
H-Index - 164
eISSN - 1540-9309
pISSN - 1540-9295
DOI - 10.1890/1540-9295(2006)004[0274:arttot]2.0.co;2
As faculty dive deeper into educational research, accountability, reliability, and validation will push them to analyze their classroom data in more objective ways. In the May issue of Frontiers, we described two research designs appropriate for classroom research: multiple-group and split-group comparisons. We used an example to analyze how students approach an ill-structured problem (Ebert-May et al. 2006). Here, in the final article in this series, we use assessment data from a single course in which we conducted a pilot study to illustrate an approach to research design and analysis. We begin by describing human subject approval for research and then show the initial analysis of results from the study that led to further investigation. As a final note, we offer ideas about the needs and directions of future ecological education research.

Human subject approval of research

Reasons for pursuing research into undergraduate learning depend on faculty goals, time, energy, and support (Batzli et al. 2006). Regardless of the reason, faculty are responsible for becoming knowledgeable about conducting research on human subjects and for abiding by federal regulations and policies, as implemented by their institutions. At universities and colleges, institutional review boards protect the rights, welfare, and privacy of human subjects who participate in research conducted by students and/or faculty.

In an earlier article in this series, we used concept maps to show how students can visualize their thinking by building models that enable them to arrange concepts hierarchically and connect new concepts to those based on prior knowledge (Novak 1998). Concept maps are useful tools that enhance meaningful learning and retention by allowing students to practice making connections among concepts (Ausubel 2000).
We designed this pilot study to test whether students who practiced using concept maps performed better than students who used another instructional tool on assessments designed to detect their ability to make connections. We implemented these tools in units on evolution and invasive species/ecosystem services. We chose the split-group design, randomly dividing the class into two groups (A and B). For treatments, we asked students to perform multiple representations (MRs) of concepts, a task similar to concept mapping. In MRs, students define each concept and then provide an example, an analogy, and a drawing or equation illustrating the concept. Students are not asked to make connections among concepts in MRs, whereas students who constructed concept maps specifically focused on making such connections. We believe that both concept maps …
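The random division of a class into two treatment groups described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual procedure; the roster names and the fixed seed (used so the assignment is reproducible and auditable, as classroom research typically requires) are assumptions for the example.

```python
import random

def split_group(students, seed=None):
    """Randomly divide a class roster into two groups (A and B) of
    as-equal-as-possible size, as in a split-group comparison design."""
    rng = random.Random(seed)          # seeded generator for a reproducible split
    shuffled = list(students)
    rng.shuffle(shuffled)              # random order removes any roster bias
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical roster of 24 students
roster = [f"student_{i}" for i in range(1, 25)]
group_a, group_b = split_group(roster, seed=42)
print(len(group_a), len(group_b))  # 12 12
```

Each student lands in exactly one group, so group A might receive the concept-map treatment and group B the MR treatment (or the units can be crossed so each group experiences both tools).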
