Agile Education: What We Thought We Knew About Our Classes, What We Learned, And What We Did About It
Author(s) -
Richard Whalen,
Susan Freeman,
Beverly Jaeger
Publication year - 2020
Publication title -
Papers on Engineering Education Repository (American Society for Engineering Education)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--3967
Subject(s) - agile software development , perspective (graphical) , variety (cybernetics) , mode (computer interface) , mathematics education , student engagement , computer science , work (physics) , psychology , medical education , pedagogy , engineering , artificial intelligence , human–computer interaction , medicine , mechanical engineering , software engineering
In a continuing effort to improve a first-year design course, a team of faculty has evaluated a variety of learning modes over a two-year period by surveying both the student and faculty populations on the learning potential of each of these modes and on the degree to which each mode is interesting or engaging. Following the first year of the study, efforts were made to address learning modes that were rated low in both categories: learning potential and level of engagement. This paper presents the results of the survey administered in the second year and assesses the effectiveness of changes made to some learning modes. In addition to the student survey results, the instructing faculty's personal opinions of the learning potential and level of engagement of each mode are included, along with faculty predictions of how the students would respond from their learner's perspective. The data were used to establish how well we as educators know our students. Results were evaluated to determine (a) whether our prediction for an activity makes a difference in how the students rate a learning mode for learning potential and level of engagement and (b) whether any mismatch exists between what we think and what they rate. This work provides examples of the student and faculty surveys, proposes solutions, assesses components and modes presently not hitting the mark, and discusses the results of the faculty opinion survey. The hope is that other educators may identify with these outcomes, use similar tests to judge their student population, and use the results to make helpful adjustments to course content.

Introduction

As educators, much of what we formulate and choose to apply in the classroom with regard to learning activities is usually measured against the potential of each method to teach a concept. In many instances, whether or not the activity will engage the student is secondary to the primary objective: retention of the lesson.
Of course, we would prefer to use activities that have a substantial level of engagement as well as a high learning potential, but this simultaneous effect is not always possible. Even learning modes with high engagement levels offer no guarantee that the experience will educate students in the most effective way. Therefore, for any course to evolve to its fullest potential, we must also assess each of the learning modes, or activities, used for its level of engagement as well as its potential for learning. The natural response to any educational assessment is to consider modifications in accordance with the feedback obtained. The original research initiative, conducted by a team of faculty at Northeastern University, established that our existing first-year design course format was effective from a learning assessment perspective. The course had passed through multiple iterations over an eight-year period, undergoing incremental changes with positive results. It was then time to take a new look at the course. Subsequent research was conducted and presented at ASEE, in which the same team of faculty investigated whether or not high classroom engagement with a variety of learning activities equated to a significant amount of learning for the student. On the survey, the engagement element was defined for the students as follows: "The Interest portion is not merely about how fun the activity is compared to entertainment, but how engaging or interesting it is compared to other classroom teaching options." Similarly, the concept of learning value was described as follows: "The learning rating [of a particular mode or activity] is not merely about the percentage or amount you learned, but how well it helped you to learn the concept/topic at hand." The learning activities in the study represented various modes of learning, which primarily included active learning, service learning, problem-based learning, and case-based learning.
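The learn/like pairing described above can be sketched as a simple two-axis classification. This is a minimal illustration, not the paper's actual analysis: the mode names and mean ratings are invented, and the choice of 3.0 (the midpoint of the 5-point Likert scale) as the high/low threshold is an assumption.

```python
# Hypothetical sketch: classifying learning modes into learn/like quadrants
# from mean Likert ratings. Mode names and scores are invented.

def quadrant(learn_mean, like_mean, midpoint=3.0):
    """Label a mode High/Low on each axis relative to the scale midpoint."""
    learn = "High" if learn_mean >= midpoint else "Low"
    like = "High" if like_mean >= midpoint else "Low"
    return f"{learn} learn / {like} like"

# Invented example data: mode -> (mean learn rating, mean like rating)
modes = {
    "case study": (4.2, 3.8),
    "lecture": (3.4, 2.6),
    "reading quiz": (2.5, 2.1),
}

for name, (learn, like) in modes.items():
    print(f"{name}: {quadrant(learn, like)}")
```

Modes falling in the Low learn / Low like quadrant would be the candidates for elimination, alteration, or repackaging.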
In the previous work, the effectiveness of each learning mode was obtained by surveying each student on the self-reported amount learned and on the degree to which each class experience was interesting or engaging. The survey is seen in Appendix A and in the initial publication for this research. A 5-point Likert scale was used in the rating system, providing the necessary quantitative analysis to determine the rank of learn (potential to convey concepts) and like (level of engagement) for each of the learning modes. As noted, the research focused on a first-year design course, and results revealed a wide array of learning and engagement level combinations for the activities used in the course. Each mode was profiled with a learn-like designation using a correlation metric. It was not surprising that many of the learning modes in the high learn/high like quadrant touched upon multiple learning styles, while those in the low learn/low like quadrant addressed only a limited scope of the students' learning styles. The options suggested for handling the low/low modes were to (1) eliminate the activity, (2) change nothing, (3) alter the activity, or (4) present or package the activity differently. In this follow-up paper, it is now time to look at what was done about the lower-scoring modes and to evaluate the results of the students' perspectives on these changes. The objective of this work is to first show the impact of the responsive changes in the design course components, specifically in the low learn/low like quadrant: what we did about it and if/how it worked. This further emphasizes the importance of continual assessment in a course. Second, prior to this analysis, data had also been collected from the recent instructing faculty for this course on their personal learn/like opinions for each mode and on faculty predictions of how the students would respond from their learner's perspective. This document is seen in Appendix B.
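The comparison of faculty predictions against observed student ratings can be sketched as a per-mode gap computation. All values below are invented for illustration; the paper's actual prediction data appear in Appendix B.

```python
# Hypothetical sketch: gap between faculty-predicted and student-observed
# mean ratings per learning mode. All numbers are invented.

faculty_pred = {"case study": 4.0, "lecture": 3.5, "reading quiz": 3.0}
student_rate = {"case study": 4.3, "lecture": 2.8, "reading quiz": 2.2}

# Positive gap: students rated the mode higher than faculty predicted.
gaps = {m: round(student_rate[m] - faculty_pred[m], 2) for m in faculty_pred}

for mode, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{mode:12s} gap = {gap:+.2f}")
```

Sorting by gap surfaces the modes where the faculty's mental model of the learners diverges most from the students' self-reports.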
Restated, how well do we as educators know our students? Does it matter if we get it wrong? Results were evaluated to determine (a) whether our prediction for an activity has any bearing on how the students rate a learning mode for learn/like and (b) whether any apparent mismatch exists between what we think and what they rate. The expectation is that other educators may identify with these results, test their alignment for the benefit of their student population, and make valuable adjustments to course content.

Review of Literature

Learning Styles and Learning Activities. Previous work by the authors describes how each of the learning activities was infused into a first-year Engineering Design course. These activities related to the following recognized dimensions of learning styles presented by Felder and Brent: (1) sensing (concrete, practical, oriented to facts) versus intuitive learners (conceptual, innovative, oriented to theory); (2) visual (pictures, diagrams, etc.) versus verbal learners (written and spoken); (3) active (tries things out, works with others) versus reflective learners (learns by thinking through, works alone); and (4) sequential (linear, orderly, learns in steps) versus global learners (holistic, systems thinkers, learns in large leaps). There were over 20 learning modes assessed in both the student and faculty surveys. These modes have been described in detail in the previous papers and are again summarized at the end of this paper. As such, a brief review of each learning mode that was surveyed in the design course, and its mapping to the above learning styles, is outlined for the reader's reference in Appendix C.

Methodology

Student Survey. As previously stated, a dual-component survey was administered to multiple sections of the student population in the Engineering Design classes in the fall semester of 2006 and again to a subsequent freshman class in 2007 (n = 232 and n = 191, respectively).
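The pooling of per-student ratings into aggregate per-mode results, with 'n/a' responses skipped for sections that did not use a given activity, can be sketched as follows. The data structure and values are invented for illustration.

```python
# Hypothetical sketch: pooling Likert ratings across course sections into a
# mean per learning mode, skipping 'n/a' entries. Data are invented.

from statistics import mean

def aggregate(responses):
    """responses: list of dicts mapping mode -> rating (1-5) or 'n/a'."""
    pooled = {}
    for r in responses:
        for mode, rating in r.items():
            if rating == "n/a":
                continue  # activity not used in that student's section
            pooled.setdefault(mode, []).append(rating)
    return {mode: round(mean(vals), 2) for mode, vals in pooled.items()}

# Three invented student responses from different sections
survey = [
    {"case study": 5, "lecture": 3},
    {"case study": 4, "lecture": "n/a"},
    {"case study": 4, "lecture": 2},
]
print(aggregate(survey))
```

Because the instructors were found to conduct the learning modes in a similar fashion, pooling across sections in this way is what justifies reporting a single aggregate mean per mode.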
The survey was administered in class during the last week of the semester in each year. The rating section of the survey used a 5-point Likert scale, and students were asked to rate various activities on the degree to which they helped them learn and on how engaging or interesting they were. Not all of the activities were used by every professor, in which case students were instructed to mark 'n/a'. While Engineering Design sections are taught by individual instructors, the course is conducted with a team-planning approach. All three of the authors were involved in co-coordinating this course over the semesters of interest. Team meetings were conducted for all instructors every two weeks throughout the semester. It was established that instructors of this course conduct the learning modes in a similar fashion. Accordingly, the results across the sections were combined to yield aggregate results for analysis. As noted above, the instructions, format, and ratings are shown in the questionnaire page in Appendix A. In addition to the ratings for each of the learning modes offered, three open-ended prompts were posed on the reverse side of the survey:

- Comments on what is not effective in your learning process
- Comments on what works well for you in learning about and applying engineering methods
- Suggestions for improving learning methods and/or ideas for other class-related activities

Results and Discussion

Presentation of the research findings will first review the comparisons between the results of the first survey, collected in 2006, and the subsequent ratings found in the second survey, collected in 2007. The student results will be designated as 2006 and 2007 for initial and follow-up, respectively. Next, the analyses will concentrate on correlations concerning how the faculty personally regard each of the learning modes, then on the faculty's predictions of the students' ratings.

Student Survey.
The plotted results of the Learn/Like survey from the Fall semester 2006 are shown in Figure 1. The graph is divided into four quadrants, with result discussions referring to Learn/Like pairings as High/High for those activities