Open Access
Using Student Portfolios To Evaluate And Improve An Engineering Writing Program: A Case Study At The University Of Washington
Author(s) -
C. Winfield Scott,
Carolyn Plumb
Publication year - 2020
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--7514
Subject(s) - portfolio assessment , writing assessment , engineering education , preparedness , engineering
Using portfolios of student writing to evaluate both writing programs and individual student performance has become popular at all levels of education. However, few (if any) engineering programs have adopted this method of assessment. The call from industry for engineers with better writing skills demands that engineering educators look to new tools to evaluate the effectiveness of writing instruction and the preparedness of students to write on the job. At the University of Washington, we have embarked on a portfolio assessment project that involves collecting writing samples and other indicators of the engineering student writing experience. Through this program, we hope to gain a better understanding of what students are learning about written communication; we also plan to use the data from the project to establish clearer performance outcomes for our writing program. This paper describes the goal of the project and the rationale behind our decision to adopt portfolio assessment. In addition, it describes the information being collected and the process being used to collect it. This paper will be helpful to other engineering educators who are grappling with the assessment demands of ABET 2000.

What Is the Goal of the Portfolio Assessment Project?

The College of Engineering at the University of Washington (UW) admits about 800 students into its ten departments and programs each year. To prepare these students for writing in their profession, the college offers a writing program that consists of three components: two dedicated technical communication courses, writing completed in many department courses, and writing completed at work or in co-ops. In spring 1997, the college embarked on an ambitious three-year portfolio assessment project to gather detailed and comprehensive information on the nature and effectiveness of its writing program.
The overall goal of this evaluation is to provide a baseline understanding of the program so that we can begin to establish a common approach for teaching and assessing writing in the college. To attain this goal, we will use student portfolios to meet the following objectives:

• Identify the writing status of students when they enter the college.
• Characterize the writing experience of students while they are in the college.
• Determine student writing status upon graduation.
• Create performance-based learning outcomes, establish criteria for assessing these outcomes, and propose changes in curriculum and instruction to promote these outcomes.

The procedure used in this evaluation can serve as an ongoing assessment tool to monitor the effects of any changes we make in response to what we learn from the baseline evaluation.

Why Are We Undertaking the Project?

Despite our efforts to prepare students for writing in their profession, feedback from industry indicates that we are not keeping up with demands in the workplace. Now, with the adoption of ABET's Engineering Criteria 2000, we will be required to demonstrate that we can.

Industry Wants Better Writers

For decades, industry has been saying that engineering students are not learning the communication skills they need on the job. 1-6 Writing has been at the center of these complaints. Engineering colleges have adopted measures such as writing practicums, 7 integration of more writing into existing courses, 8, 9 and team teaching. 10 But the gap between the writers that industry wants and the writers that academia produces persists. We think the gap persists because we do not collectively have a clear idea, nor a clearly articulated description, of "good engineering writing." As a result, engineering students are not learning to write because there is no common approach to teaching and talking about writing among the major players: students, faculty, and industry.
Regardless of the overall instructional model, the information and feedback that engineering students receive about their writing skills are inconsistent and fragmented, not only from one course to another but also from school to work.

ABET Requires Better Instruction and Assessment

Starting in fall 2001, our engineering program must meet ABET's new Engineering Criteria 2000 accreditation standards. These standards were developed in response to claims from employers that "engineering success today requires more than up-to-the-minute technical capability; it requires the ability to communicate, work in teams, think creatively, learn quickly, and value diversity." 11, p.1 Accompanying these standards are new ways to evaluate program compliance that rely less on the previous quantitative criteria and more on holistic ways to evaluate innovation in curriculum. Under the new standards, engineering programs will be required (1) to develop objectives (program outcomes) for a number of skills, including an ability to communicate effectively; (2) to design a curriculum that ensures achievement of these objectives; and (3) to implement an ongoing internal assessment process to demonstrate achievement of these objectives and to improve the effectiveness of the program. 11
