Making the Case for Adopting and Evaluating Innovative Pedagogical Techniques in Engineering Classrooms
Author(s) -
Sohum Sohoni,
Scotty D. Craig
Publication year - 2016
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/p.25667
Subject(s) - engineering education, computer science, mathematics education, psychology, engineering
There is a large body of literature on effective teaching and learning, both within the American Society for Engineering Education's (ASEE's) conference proceedings and journals and in wider outlets. One primary goal for ASEE is to get engineering educators to adopt effective pedagogies in their classrooms. However, this is not happening at the rate or scale the community is hoping for. Furthermore, even where it is happening, there is often limited evidence of correct implementation (e.g., that techniques are implemented as intended or have the desired impact in the classroom). Our hypothesis is that, even for faculty who are interested in adopting innovative teaching methods, research findings and techniques from outside their specific sub-discipline are difficult to implement and evaluate. These would-be adopters might appreciate some of the theory or the general learning principles in these publications, but they are often unclear on exactly how the principles can be applied within their classrooms. More importantly, they may not know how to assess the impact of the changes. Setting up research studies involving human subjects, designing classroom evaluations, or simply writing the right questions to ask within assessments are activities most practitioners are not trained in. Many practitioners may also perceive that there is no time to implement these principles and evaluations. Thus, a gap exists even between most literature in engineering education and what can translate into classrooms. We believe that specifically focused, discipline-based, or even course-granularity guiding papers are necessary to give educators the tools and the confidence to employ effective teaching techniques and to evaluate the impact of those techniques.
This work, a collaboration between a computer architect who has expanded his research into engineering education and a cognitive psychologist who specializes in the learning sciences and educational technology, aims to provide an example of such a 'guiding paper'. As an illustration of the kind of specific information and tools necessary for broader adoption, we present details of an experimental design, the pre-post test questions, and a discussion of the choices we had and the decisions we made. In this example scenario, we propose to investigate the impact of an intervention in a computer organization course. By analyzing a previous experimental setup, we illustrate specific lessons learned that could facilitate the implementation and evaluation.

Importance of Translational Research

A number of books [1,2], research papers [3,4,5,6], and reports [7,8,9] point out the non-adoption or non-translation of research-based instructional practices, a phenomenon often known as the valley (or chasm) of death. Research on barriers to change [3,4,10,5,11] has revealed a variety of reasons for this valley of death for educational innovations, and there has been a strong push to investigate solutions to this problem. Indeed, the descriptions of programs sponsored by the National Science Foundation, such as Improving Undergraduate STEM Education (IUSE:EHR) and IUSE/Professional Formation of Engineers: REvolutionizing engineering and computer science Departments (RED), emphasize the need for translation and for investigating the causes of non-translation. From a broader scientific perspective, it is important that practitioners implement and test research findings within their specific areas. Learning-based research is conducted on a sample of the overall population of students.

Figure 1: Typical barriers to adoption of educational innovations, categorized into systemic problems, individual concerns, and issues stemming from the dissemination efforts of the innovators.
The practitioner applications work toward establishing the finding's robustness and add to the finding's causal generalization [12]. In essence, when practitioners take a research finding and implement it in their class, they extend the external validity of the finding by extrapolating it to a new set of students, classrooms, and domain areas. If adequately tested and reported, this provides evidence both for when a finding works and for when it does not. This information can be essential for enhancing understanding of the causal explanation underlying the effects.

Barriers to Research/Classroom Transitions

In general, the lack of translation of educational innovations can be attributed to various causes in three broad categories: systemic, individual adopter, and innovation ecosystem, as shown in Figure 1. On the systemic side, there is a faculty reward structure that typically promotes research over teaching, or rewards 'more' teaching instead of evaluating teaching quality; situational and environmental constraints on adoption, either curricular or related to infrastructure; push-back from students; and reluctance on administrators' part to support the innovation infrastructure, or sympathy with student complaints while an innovation is being perfected at their institution. From the potential adopter's perspective, there might simply be a lack of interest, skepticism about the effectiveness of an educational innovation, pressure from the reward structure to focus on research rather than teaching and learning, fear that the innovation might not be a good fit for their classroom, reservations about their ability to practice the innovation as intended, and concerns about the time and effort required for implementation.
From the innovator's side, there might be shortcomings with the innovation or the ecosystem surrounding it, such as unclear directions on how to adopt it, a lack of evidence showing its effectiveness, no information on the time, effort, and difficulties encountered while implementing it, a lack of (customer) support, and, lastly, no support for testing its effectiveness in new contexts. It is clear that factors from each of the three categories influence each other, and in the last few years various reports [13,14] and research studies [5,15,6] have called for changes at the systemic level in higher education, especially in the STEM fields. We welcome these initiatives, and we hope that they bring about systemic changes. However, we believe there is an important gap in the various efforts to bring about systemic change: these efforts should be supported by work that is discipline-specific or sub-discipline-specific, so that potential adopters have a support ecosystem that contextualizes educational innovations and facilitates their adoption and evaluation. Contextualizing at the course level will be especially effective and is, in our view, essential for facilitating the translation of educational innovations. For example, a well-designed pretest/posttest for a course in computer architecture, with built-in flexibility to incorporate specific course outcomes at different institutions, will serve the following purposes.

1. Provide an educational innovator with a measure for testing the effectiveness of the innovation, without having to invent and validate a new one.
2. Provide a potential adopter with the same measure to test the effectiveness in their institutional context.
3. Add credibility to the entire project by facilitating the validation and transferability of the innovation, providing a resource for all adopters to use.
4. Enable both the innovator and the adopter to publish classroom experiment results, thereby providing an additional reason for engaging in the innovation and its adoption. This is important because faculty are quite often faced with choosing between spending their time improving their teaching and working on research that will lead to publications.
