Scorecards: Tracking Progress In Senior Design Project Courses
Author(s) -
James Baker,
M.A. Yoder,
Bruce J Black,
R.D. Throne,
William Kline
Publication year - 2009
Publication title -
2009 Annual Conference and Exposition Proceedings
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--5571
Subject(s) - computer science, tracking (education), engineering management, engineering, psychology, pedagogy
Monitoring and evaluating the status of engineering design projects has traditionally been part art and part science. Weekly and monthly status reports, Gantt charts, design reviews, time logs, demonstrations, and presentations are often used to gain visibility into the progress of projects. Even with all these tools, it is often difficult to obtain a clear, definitive view of the status of a given project. In recent years, project dashboards and scorecards have been added to the list of tools employed in industry to give all stakeholders visibility into project status. These tools seek to concisely display key metrics that give a clear view of project status. In engineering project courses, both students and faculty are often challenged to assess the status and progress of a project. The traditional inputs of submitted homework, quizzes, and examinations are often not applicable, so assessment of progress tends to be more subjective, based on observations and on conclusions drawn from reading status reports and team presentations. This paper describes the development and application of project scorecards to traditional classroom senior design projects to help assess status and progress. During the 2007-2008 academic year, a weekly scorecard was developed and used by 20 senior design project teams in Electrical and Computer Engineering. The tool was designed to aid both the design teams and the faculty in honestly and clearly assessing weekly progress on design projects. The results of the study are discussed, including both the perceived benefits and drawbacks.

Introduction and Background

Dashboards and scorecards have been used in industry to make the status of business and development projects more transparent and visible to upper management and clients. In 1992, Kaplan and Norton introduced the concept of a "Balanced Scorecard" as a management tool, based partially on prior experiences at Analog Devices Corporation.3 The scorecard concept has also been applied in a variety of academic settings,1,2 and has recently been applied to the tracking of a distance-learning graduate program.6 A scorecard has also been developed to track student internship projects at Rose-Hulman Ventures.5 The project work in that program is carried out on a contract basis for external clients by teams of students guided by a full-time engineering project manager. The students are employed to work on the projects and receive pay, but not academic credit, for their involvement. The clients provide significant funding for the work and set high expectations for results. With 20 projects commonly active at one time, the scorecard provides a quick overview of project status and problem areas.

The scorecard was developed as a Microsoft Excel spreadsheet incorporating conditional formatting of key metric cells. The conditional formatting automatically highlights each metric cell in red, yellow, or green based on the value entered each week, compared against thresholds set in another region of the spreadsheet. The idea behind the highlighting is that reviewers of the scorecard, including the project team itself, can quickly identify areas that need immediate attention or additional diligence. By compiling weekly summaries of all project teams' scorecards, the management team has been able to quickly identify trends across all projects. A user's guide was also compiled to help users of the scorecard communicate project progress to the team members, client, and managers at Rose-Hulman Ventures. The scorecard has been used with over 50 projects over the past three years in the Rose-Hulman Ventures program and has proven itself a valuable management tool in that environment.2

A Project Team Scorecard in the Classroom

In the fall of 2007, this concept was adopted at Rose-Hulman to track and help drive progress on classroom projects in the senior design sequence in Electrical and Computer Engineering (ECE). Traditionally, the student project teams in ECE have been required to submit weekly one-page project status memos to their faculty supervisor in addition to maintaining individual time logs. Several times each quarter, the teams have also been required to complete peer reviews, make formal presentations on their progress to the group of faculty supervisors, and be available for informal "drop-ins" on team meetings by the faculty supervisors for demonstrations. The goals of introducing the scorecards were to provide additional concise, standardized weekly metrics on the status of each project to the team's faculty supervisor, and to help the teams honestly self-evaluate their own progress.

Four sections of the senior design course, ECE460/461, spanning 80 students divided among 20 student project teams and guided by four ECE faculty members, utilized a modified version of the previous scorecard and user's guide. The teams updated the scorecards and submitted copies for review by each team's faculty supervisor every week for up to 25 weeks, starting early in the fall quarter of 2007 and ending in the spring quarter of 2008.

The scorecard employed is shown on the left in Figure 1. The thresholds, embedded in the base spreadsheet and used to decide which highlight to apply to each cell, are shown in the table on the right of Figure 1. The threshold table is normally hidden and left fixed for all projects. Some cells require the entry of dates or other numerical data, which is compared against the threshold-table values to determine the highlight color. An example of this type of metric is the very first row in Figure 1, "When was the last scheduled meeting with the client?": if the last meeting was less than 14 days ago, the cell is automatically highlighted in green; if more than 21 days ago, it is highlighted in red; and so on. Other cells provide drop-down boxes from which the user selects an appropriate value. An example of this type of metric is the fifth row in Figure 1, where the user chooses a value for "How responsive is the client?" from a drop-down list of "Not", "Somewhat", and "Very" stored in the threshold table. The cell is again automatically highlighted in red, yellow, or green based on the selection.
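The two kinds of highlighting rules described above can be sketched in code. This is a minimal illustration of the threshold logic, not the actual spreadsheet's conditional formatting; the threshold values for the two example metrics come from the text, while the function names, the yellow band, and the drop-down color mapping are illustrative assumptions.

```python
from datetime import date

# Hypothetical re-creation of the scorecard's conditional-formatting logic.
# Thresholds for the numeric metric (14 and 21 days) are taken from the
# paper; the in-between "yellow" band and the drop-down color mapping are
# assumed for illustration.

def highlight_days_since_meeting(days: int) -> str:
    """Numeric metric: 'When was the last scheduled meeting with the client?'"""
    if days < 14:
        return "green"
    if days > 21:
        return "red"
    return "yellow"  # assumed: 14-21 days falls in a cautionary band

def highlight_client_responsiveness(choice: str) -> str:
    """Drop-down metric: 'How responsive is the client?'"""
    colors = {"Not": "red", "Somewhat": "yellow", "Very": "green"}
    return colors[choice]

if __name__ == "__main__":
    today = date(2008, 2, 15)
    last_meeting = date(2008, 1, 20)
    days = (today - last_meeting).days  # 26 days
    print(highlight_days_since_meeting(days))           # prints "red"
    print(highlight_client_responsiveness("Somewhat"))  # prints "yellow"
```

A spreadsheet implements the same idea declaratively, by comparing each cell against a hidden threshold table; expressing it as a small function makes the red/yellow/green boundaries easy to audit and adjust.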