Open Access
Use of Student Portfolios for Outcomes Assessment of a Software Engineering Program
Author(s) -
James E. McDonald
Publication year - 2005
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--15171
Subject(s) - rubric, syllabus, accreditation, portfolio, software engineering, computer science, engineering management, software, mathematics education, engineering
For ABET accreditation of software engineering programs, it is necessary to have a process in place to assess the program outcomes that have been specified by the faculty. Monmouth University began its undergraduate software engineering program in 2000 and had its first graduates from that program in May 2004. To assess student achievement of program outcomes, the University put in place a process for students to build portfolios and for the faculty to review those portfolios. The first formal evaluation of the contents of completed portfolios was conducted in spring 2004. This paper describes the process, the results of the review, and the actions taken as a result of that review. Several program improvements were made, including changes to specific course syllabi, resequencing of courses, and changes to the curricular requirements. Methods for assisting students in constructing their portfolios, and faculty in reviewing them, have also been developed. These methods include more precise guidelines for what students should include in the portfolio, the availability of a sample portfolio, and scoring rubrics for the faculty to use during the formal review. The paper concludes with an outline of lessons learned and recommendations for other programs that are considering the use of portfolios for this purpose.

Introduction and Background

Portfolios have been used for many years in fields such as art and architecture to compile and publicize the capabilities of artists and architects. More recently, schools of engineering have become interested in using portfolios to evaluate student progress and program effectiveness. Panitz (1996) reported that a variety of portfolio formats had been designed for use at the five engineering institutions she investigated.
Olds (1997) described a portfolio program initiated at the Colorado School of Mines in 1988 that has since served as the basis for numerous small changes across a variety of programs at that institution. Brodeur (2002) outlined a portfolio-based assessment program developed for evaluating the outcomes of a revised curriculum in the Aeronautics and Astronautics engineering program at MIT. A number of authors have proposed and used portfolios to assess student progress in single courses and to assess achievement of specific outcomes across subsets of courses in engineering programs. Gunn et al. (1997) described how a portfolio was used to assess the effectiveness of a first-year integrated curriculum. In that approach, students were required to keep their work, review it periodically, and discuss ways of organizing it; at the end of the semester, students were required to select items that showcased their performance and write an introduction. Mourtos (1997) and Mullin (1998) examined how portfolios can be used within single courses to make students more responsible for their own learning. In their examples, students are given responsibility for demonstrating minimum levels of competence in basic skills while pursuing excellence in one of them. Mourtos' examples are closely tied to technical engineering disciplines, while Mullin's relate more to non-technical skills such as English, art, and sociology. Plumb and Scott (2000) discussed a process for developing performance-based outcomes for assessing engineering student writing, using portfolio collections of writing examples from 13 students. Most recently, a variety of engineering educators have been promoting the use of, and using, electronic portfolios to collect and review student work.

Proceedings of the 2005 American Society for Engineering Education Annual Conference & Exposition. Copyright © 2005, American Society for Engineering Education.
Reis (1998) described the Stanford University Electronic Learning Portfolios project, an effort intended to help individuals capture, organize, integrate, and reuse the results of learning experiences throughout their careers. Rogers (1998) discussed the experience of Rose-Hulman Institute of Technology in selecting and developing an electronic portfolio designed to document, assess, and evaluate student outcomes. Rogers and Williams (1999) state that the use of electronic portfolios at their institution was a significant departure from the hard-copy portfolios used at other engineering institutions; in a pilot of the process, both students and faculty members found the system to be reliable and easy to use. Faculty members did make several recommendations for changes to the performance criteria and reported that the wide range of student abilities was enlightening. Moore and Voltmer (2000) outlined one planned use of Rose-Hulman's electronic portfolio process to obtain both a horizontal view (through a particular course) and a vertical view (sophomore through senior) of an electrical and computer engineering departmental design sequence consisting of two one-quarter courses in the sophomore and junior years and a three-quarter sequence in the senior year. Upchurch and Sims-Knight (2002) reported on the use of an electronic portfolio by computer science and computer engineering students in a software engineering course; student interviews conducted by the authors revealed both advantages and disadvantages based on student perceptions.

During the 2000-2001 academic year, Monmouth University initiated an undergraduate program in software engineering. At that time, the faculty developed a set of program outcomes compliant with ABET Criterion 3 (EAC, 2004). We were then faced with deciding how to assess student achievement of those outcomes in a way that would also be compliant with the criterion.
The criterion states that "There must be an assessment process, with documented results, that demonstrates that these outcomes are being measured and indicates the degree to which the outcomes are achieved." As outlined above, for several years prior to 2000, numerous publications had appeared promoting the use of student portfolios as an assessment method. A white paper issued by the Joint Task Force on Engineering Education Assessment (1997) mentioned portfolios as one assessment method that correlated with a number of the ABET-required outcomes. The contents of that paper have gradually been worked into the ABET guidelines provided to institutions, team chairs, and program evaluators for interpreting the standards described in Criterion 3. Those guidelines currently say that possible evidence can include such things as student portfolios; subject content examinations; performance evaluation of work/study, internship, or co-op experiences; and/or performance observations. They further say that surveys and other indirect measures provide secondary evidence and should be used in conjunction with direct measures such as those above. Based on these constraints, and on this author's experience as an ABET program evaluator watching other institutions' engineering programs struggle with outcomes assessment, our faculty chose student portfolios as our primary method for assessing the outcomes we had specified. We decided to supplement the portfolio assessment mechanism with a senior exit survey that would not attempt to assess the learning outcomes directly, but would supplement the assessment findings. The next section of this paper describes our program outcomes.
The sections titled The Portfolio Building Process and The Portfolio Assessment Process outline how students develop their portfolios at Monmouth University and how the portfolios are reviewed and scored by the faculty. Results of the first round of analysis that was done in 2004 are contained in the section titled Assessment Results, Lessons Learned and Planned Process Improvements. Finally, we provide a short summary and some conclusions.
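The faculty review described above amounts to a simple aggregation step: each reviewer scores each program outcome against a rubric, and outcomes whose average score falls below some bar are candidates for curricular action. The following sketch illustrates that idea only; the outcome names, the 1-4 rubric scale, and the 2.5 threshold are hypothetical assumptions for illustration, not details taken from the Monmouth process.

```python
# Hypothetical sketch of rubric-score aggregation for outcome assessment.
# Outcome names, the 1-4 scale, and the threshold are illustrative only.
from statistics import mean

# Each outcome maps to the rubric scores assigned by individual reviewers.
scores = {
    "communicate effectively": [3, 4, 3],
    "apply design processes":  [2, 2, 3],
    "work in teams":           [4, 3, 4],
}

THRESHOLD = 2.5  # assumed bar below which an outcome triggers review


def flag_weak_outcomes(scores, threshold):
    """Return outcomes whose mean rubric score falls below the threshold."""
    return [outcome for outcome, s in scores.items() if mean(s) < threshold]


print(flag_weak_outcomes(scores, THRESHOLD))  # ['apply design processes']
```

In practice the interesting design questions are exactly the ones the paper goes on to discuss: what artifacts students must supply as evidence for each outcome, and how the rubric anchors are worded so that independent faculty reviewers score consistently.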
