Open Access
Construct Validity of the EPICS Scales Across Groups: A MIMIC Modeling Investigation
Author(s) -
Hong Tao,
William Oakes,
Susan J. Maller,
Carla Zoltowski
Publication year - 2020
Publication title -
Papers on Engineering Education Repository (American Society for Engineering Education)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--2784
Subject(s) - accreditation , construct validity , engineering education , teamwork , engineering management , engineering , psychology , psychometrics
Using the Multiple Indicators, Multiple Causes (MIMIC) modeling approach, this study investigated the construct validity of the Engineering Projects in Community Service (EPICS) program evaluation instrument. Possible differential item functioning (DIF) among the observed items was detected and described. The study analyzed the extent to which EPICS students’ gender and major are related to their evaluations on the professional skills and outcomes defined by the Accreditation Board for Engineering and Technology’s Engineering Criteria 2000 (ABET EC2000) Criterion 3. Results indicated that the instrument has acceptable construct validity evidence and that, in general, gender and major were not predictive of students’ noncognitive measures (e.g., communication and teamwork skills) on the EPICS program evaluation subscales.

Background and Theoretical Framework

First established at Purdue University in 1995, the EPICS program aims to integrate multidisciplinary teams of engineering undergraduates into local community service-learning projects. Within the EPICS program, teams of undergraduates design, build, and deploy real systems to solve engineering-based problems for local community service and education organizations 1 . The program now operates at 15 universities nationwide, with over 1,350 students participating 1 . The Accreditation Board for Engineering and Technology’s Engineering Criteria 2000 (ABET, 1999) Criterion 3 2 , Program Outcomes and Assessment, specifies the outcomes that graduates of accredited engineering programs are expected to know and demonstrate. The generality of the Criterion 3 objectives requires engineering programs to articulate desired program outcomes related to professional skills that participants can assess through self-report instruments.
In recognition of this complex task, the EPICS ABET EC 3 self-report instruments were developed by a team of engineering educators and psychometricians to measure students’ perceptions of their professional skills and performance, and whether an engineering design course effectively promotes the program and Criterion 3 outcomes 3 . Engineering educators benefit from understanding students’ professional skill levels, because this provides critical information about students’ overall perception of the program and a foundation for continuous improvement. Validity is a critical aspect of testing and measurement. It concerns the meaning of a test or instrument: what is the test supposed to measure, and how well does it do the job it claims to do? A “construct” is an informed, scientific idea developed, or constructed, to describe or explain behavior (e.g., intelligence, anxiety, self-esteem, or aggression). Construct validity asks to what extent the test measures the theoretical construct of interest. Test or item bias is a factor inherent within a test or item that introduces systematic error and prevents accurate, impartial measurement of the object or individual. A test or item is considered biased, and thus lacking construct validity evidence, if it favors or disfavors a certain group of individuals. Construct validity can be assessed through factor analysis using the Structural Equation Modeling (SEM) technique. The Multiple Indicators, Multiple Causes (MIMIC) model is a special application of SEM. The general form of a MIMIC model involves an unobserved latent variable “caused” by several x-variables and indicated by several observed y-variables 4 . The model equations are
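The equations themselves did not survive extraction. The general MIMIC model has a standard two-equation form; the notation below follows the conventional presentation (e.g., Jöreskog and Goldberger) rather than the original paper, so symbol names are assumptions:

```latex
% Structural part: the latent variable eta is regressed on the observed causes x
\eta = \boldsymbol{\gamma}'\mathbf{x} + \zeta
% Measurement part: the observed indicators y reflect eta through loadings lambda
\mathbf{y} = \boldsymbol{\lambda}\,\eta + \boldsymbol{\varepsilon}
```

Here ζ is the structural disturbance and ε the vector of measurement errors, conventionally assumed mutually uncorrelated and uncorrelated with x.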

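As an illustration only (not the authors’ analysis or data), the two parts of a MIMIC model can be simulated with NumPy. The covariates stand in for the gender and major groupings studied here, and all coefficient values and item loadings are hypothetical; setting the structural coefficients to zero mimics the paper’s no-DIF finding, where group membership does not shift the latent score:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical binary covariates (e.g., gender and major), coded 0/1.
x = rng.integers(0, 2, size=(n, 2)).astype(float)

# Structural part: eta = gamma' x + zeta.
# gamma = 0 means the groups do not differ on the latent skill score.
gamma = np.array([0.0, 0.0])
zeta = rng.normal(0.0, 1.0, n)
eta = x @ gamma + zeta

# Measurement part: y = lambda * eta + epsilon, for three observed items.
lam = np.array([1.0, 0.8, 0.6])
eps = rng.normal(0.0, 0.5, (n, 3))
y = np.outer(eta, lam) + eps

# With gamma = 0, item means should not differ meaningfully across groups.
diff = y[x[:, 0] == 1].mean(axis=0) - y[x[:, 0] == 0].mean(axis=0)
print(np.round(diff, 2))
```

In an actual MIMIC analysis the direction is reversed: γ and λ are estimated from the observed x and y, and a nonzero γ (or a direct x-to-item path) flags potential DIF.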