Assessing an Adaptive Expertise Instrument in Computer-aided Design (CAD) Courses at Two Campuses
Author(s) -
Michael Johnson,
Elif Öztürk,
Joshua A. Johnson,
Buğrahan Yalvaç,
Xiaobo Peng
Publication year - 2020
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--20972
Subject(s) - CAD, engineering education, engineering drawing, productivity, metacognition, cognition, knowledge management, mathematics education, psychology, engineering
In today’s highly competitive market, CAD tools are widely used and are thought to reduce time to market and increase engineering productivity. Realizing these putative benefits, however, requires proper use of CAD tools. Merely teaching declarative knowledge in CAD (particular keystrokes and button picks) is not sufficient; students should also acquire deeper procedural knowledge (design strategy), which allows them to develop expertise that is adaptive in nature. Recent research in engineering education distinguishes two types of expertise: routine and adaptive. Adaptive experts possess content knowledge similar to that of routine experts in the field, but also the ability to effectively apply and extend that knowledge. Epistemological beliefs, metacognitive skills, multiple perspectives, and learning orientations are among the constructs that can define adaptive expertise. This work describes the implementation of an instrument used to measure adaptive expertise in two courses at two universities. The instrument contains questions covering four dimensions: multiple perspectives, metacognitive self-assessment, goals and beliefs, and epistemology. At one university, freshman and sophomore engineering students were surveyed with the instrument; at the other, junior- and senior-level engineering students were surveyed. In addition to the student participants, practicing engineers from industry were surveyed. Participant demographic, education, and engineering-experience data were collected and used to examine the relationships between expertise-related responses and demographic variables. We report the factor analysis results, the reliability coefficients of the instrument, and the observed differences between students’ and engineers’ responses to the survey items.
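The abstract reports reliability coefficients for the instrument. For multi-item Likert-scale survey dimensions such as these, a common reliability measure is Cronbach's alpha; the sketch below shows how it is computed (the response matrix here is hypothetical illustration, not data from the study):

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = item_scores.shape[1]                         # number of items in the scale
    item_vars = item_scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses: 5 respondents x 4 items of one dimension
scores = np.array([
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
    [4, 4, 4, 4],
])
print(round(cronbach_alpha(scores), 3))  # → 0.969
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency for a survey dimension.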