SIE ABET Tools and Guidelines
The following evaluation tools and guidelines were developed by the COEM development team and by the SIE Department to support the measurement of student outcomes, primarily the Criterion 3 A-K outcomes. The development team consisted of COEM faculty, administrators, and representatives from the University Teaching and Evaluation Center. Some of the tools were also adapted from evaluation instruments used by our industrial customers. Descriptions are given below, and the forms themselves are provided in the Materials section of this site. Adobe Acrobat Reader is required to view selected figures in PDF format.
COEM-Developed Tools
- ASSESS: ASSESS is a database/spreadsheet program that enables simple evaluation of surveys. Each survey question must be measured on a numerical scale, and ASSESS computes the average score for each question over the sample. Respondents can also be asked to rate question importance, and ASSESS computes the average importance value for each question. The important features of ASSESS are its easy input format, its compatibility with ACCESS database files, and its ability to create plots of survey question response versus question importance for all questions. Improvement directions can often be identified by looking for questions with high importance and low response (the upper-left quadrant of the ASSESS graphs); an illustrative sketch of this style of analysis follows this list.
- Near Term Alumni Survey (1-5 years out): The COEM Assessment team, in cooperation with the University Teaching Center, developed this survey form for all alumni who graduated in the past 5 years (each year hereafter it will be sent to graduates 1, 3, and 5 years after graduation). The form is given in Figure 1D-1 (PDF format). The intent of the form is for alumni to rate their satisfaction with their education on the Criterion 3 A-K outcomes and to rank their education relative to that of their peers. The SIE Department added 2 questions to help measure our educational goals of professionalism and the ability to design an entire system. The rating scale runs from 1 to 7, where 7 represents "extremely satisfied" and 4 represents "satisfied." Scores were averaged over all respondents to get an average satisfaction for each criterion. Besides satisfaction, we also asked for the 3 most important criteria for success (considering only the 13 listed); we counted the number of times each criterion was selected and used this count to measure criterion "importance." Satisfaction and importance were graphed together to see whether there are important criteria where we have low satisfaction. We also asked questions concerning additional education, professional society involvement, co-op experiences, and overall feelings about the Department. Note that we are asking alumni for a self-evaluation, and hence the results may be biased high, since people who respond tend to have a good opinion of themselves. This survey can be analyzed using ASSESS.
- Graduating Senior Survey: The intent of this form is for students to rate their education on the 11 ABET criteria and to indicate where they would add more emphasis. The form is given in Figure 1D-2 (PDF format). The rating scale runs from 1 to 7, where 7 represents "extremely satisfied" and 4 represents "satisfied." Scores were averaged over all respondents to get an average satisfaction for each criterion. Besides satisfaction, we also asked for the 3 most important criteria for success (considering only the 13 listed); we counted the number of times each criterion was selected and used this count to measure criterion "importance." Satisfaction and importance were graphed together to see whether there are important criteria where we have low satisfaction. We also asked questions concerning advising, professional society involvement, co-op experiences, and overall feelings about the COEM and their department. This survey can be analyzed using ASSESS (see the analysis sketch following this list), and we have included a sample importance-satisfaction plot in Figure 2.2 of Criterion 2. This form will be given to all graduating seniors each semester.
- Faculty Importance Survey: The COEM Assessment team developed a survey form so that faculty could rate the importance of each of the 11 ABET criteria in Section 3. These criteria matched those used for alumni and seniors, so we can compare where faculty stand relative to where students stand on the different criteria. We considered both the current program and a "future ideal" program for the faculty analysis. Where the faculty and the students or alumni do not agree, there could be a mismatch between the efforts and expectations of the students and those of the faculty. The survey is depicted in Figure 1D-3 (PDF format). This form will be used every 3 years when the Educational Objectives are revisited.
- Facilities and Processes Survey for Continuing Students: The COEM Assessment team developed a survey for continuing students that helps measure the quality of advising, classroom facilities, lab facilities, computer facilities, and the general educational environment. Questions pertaining to the 11 ABET criteria are included, but not in the same manner as in the Graduating Senior Survey or the Near Term Alumni Survey. The portion on facilities can be analyzed using ASSESS, and the entire survey is depicted in Figure 1D-4 (PDF format). This form will be used every year in the Spring semester.
- Industry Advisory Council Survey: The COEM Assessment team developed a survey of employers of our students that was given to the COEM Industrial Advisory Council (IAC). The purpose was to evaluate COEM graduates on the 11 ABET criteria in Section 3 and to compare COEM graduates with those from other universities. The IAC members took the survey back to their companies for completion and discussion. The survey results can be analyzed using ASSESS, and the survey is depicted in Figure 1D-5 (PDF format). This form will be used every 3 years when the Educational Objectives are revisited.
- Guidelines for Portfolio Construction: The COEM Assessment team has constructed guidelines for developing student portfolios. The major use of the portfolios is to evaluate the progress of students over time. The portfolios should be constructed over time; should be manageable in size; should contain material that can be used to evaluate intellectual growth in engineering topics, growth in communication, and teamwork experiences; and should contain student self-evaluations of their education and their resumes. We anticipate that portfolios will be evaluated by a committee with members from all of our constituent groups: department faculty, COEM faculty, industry, and students. The specific guidelines can be used to varying degrees and are contained in Figure 1D-6 (PDF format).
- Guidelines for Pre-Requisite Testing and Evaluation: To evaluate material from pre-requisite courses, the COEM Assessment team developed guidelines for beginning-of-the-semester tests and homework assignments. These guidelines are intended to ensure fairness to students, a low-pressure environment, and a valid measure of pre-requisite knowledge. The best strategy is to place these evaluations at a few key points in the curriculum where pre-requisite material is most needed, minimizing effort and duplication on the part of students and faculty. The guidelines are contained in Figure 1D-7 (PDF format). Pre-requisite testing and evaluation is done every semester as appropriate.
- Course Classification Forms and the QFD Matrix: The COEM Assessment team has designed a form and presentation vehicle to analyze our courses and our curriculum relative to the 11 ABET criteria in Section 3. The form asks each faculty member for basic information about their course and asks them to rate the course on its content for each criterion (on a high, medium, low, or not-included scale). For each rating, the faculty member is also asked to describe the course material that justifies that rating. Generally, a course will have no more than 2 "high" criteria, and our guidelines suggest that faculty use a scale where "high" represents one credit hour's worth of work. These forms can be displayed on the department web site and can serve as a resource for students considering the course and for faculty who teach material related to it. A course classification form is included in Figure 1D-8 (PDF format). This form will be used every time a course undergoes a major revision and will be used to guide the revision process.
Once the classification forms were completed, they were compiled into a Quality Function Deployment (QFD) matrix (PDF format), where the columns (Hows) are the courses and other curricular content (professional societies, co-op, seminars, clubs and organizations, outside activities, etc.) and the rows (Whats) are the 11 criteria plus any additional criteria that a department believes are important to its program. The QFD matrix is a convenient tool for identifying criteria for which a program needs additional, or possibly less, coverage. By simply looking across a row, one can easily see the total amount of coverage; a small illustrative sketch of this coverage check is given below. If courses are sorted according to pre-requisite and curricular sequence, one can also easily see where in a student's program different criteria are addressed and emphasized.
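The survey analyses described above (ASSESS and the importance-satisfaction summaries for the alumni and senior surveys) follow a common pattern: average the 1-7 satisfaction ratings per criterion, count how often each criterion appears among a respondent's 3 most important, and plot the two together. The Python sketch below illustrates that pattern with made-up data; it is not the ASSESS program itself, and the criterion labels, responses, quadrant cutoffs, and use of matplotlib are all assumptions for illustration.

```python
# Illustrative importance-satisfaction analysis in the style of ASSESS.
# All data below are hypothetical.
import statistics
import matplotlib.pyplot as plt

criteria = ["3a", "3b", "3c", "3d"]
ratings = {            # satisfaction ratings per criterion (1-7 scale)
    "3a": [6, 5, 7, 6],
    "3b": [4, 3, 5, 4],
    "3c": [5, 6, 6, 5],
    "3d": [3, 4, 3, 4],
}
top3_picks = [         # each respondent's "3 most important" criteria
    ["3a", "3b", "3d"],
    ["3b", "3d", "3c"],
    ["3a", "3b", "3d"],
    ["3b", "3d", "3a"],
]

satisfaction = {c: statistics.mean(ratings[c]) for c in criteria}
importance = {c: sum(picks.count(c) for picks in top3_picks) for c in criteria}

# Criteria with high importance but low satisfaction (the "upper left quadrant"
# of the ASSESS-style plot) suggest improvement directions.
sat_mid = statistics.mean(satisfaction.values())
imp_mid = statistics.mean(importance.values())
targets = [c for c in criteria
           if importance[c] > imp_mid and satisfaction[c] < sat_mid]
print("Improvement candidates:", targets)

fig, ax = plt.subplots()
ax.scatter([satisfaction[c] for c in criteria],
           [importance[c] for c in criteria])
for c in criteria:
    ax.annotate(c, (satisfaction[c], importance[c]))
ax.axvline(sat_mid, linestyle="--")
ax.axhline(imp_mid, linestyle="--")
ax.set_xlabel("Average satisfaction (1-7)")
ax.set_ylabel("Importance (times chosen in top 3)")
plt.show()
```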
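The QFD coverage check can be sketched in a similar way. The course names, criteria, ratings, and numeric weights below are hypothetical (the guidelines only say that "high" corresponds to roughly one credit hour of work); the point is simply that summing across a row of the matrix shows a criterion's total coverage.

```python
# Illustrative QFD coverage check; data and weights are assumptions.
WEIGHTS = {"high": 1.0, "medium": 0.5, "low": 0.25, "none": 0.0}  # assumed credit-hour equivalents

# qfd[criterion][course] = rating taken from the course classification forms
qfd = {
    "3a": {"SIE 265": "high", "SIE 305": "high", "SIE 442": "medium"},
    "3c": {"SIE 265": "low",  "SIE 305": "none", "SIE 442": "high"},
    "3g": {"SIE 265": "none", "SIE 305": "low",  "SIE 442": "high"},
}

# Looking across a row (a "What") shows the total coverage that criterion
# receives from the courses and other curricular content (the "Hows").
for criterion, row in qfd.items():
    total = sum(WEIGHTS[rating] for rating in row.values())
    detail = ", ".join(f"{course}={rating}" for course, rating in row.items())
    print(f"Criterion {criterion}: total coverage {total:.2f} ({detail})")
```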
SIE-Developed Tools
- Senior Exit Exam on Modeling: Since our primary educational goal is for students to be able to develop mathematical models for a variety of problems, it is critical that we measure our success in this area. To this end, we developed a senior exit exam on modeling. The exam consists of 10 word problems, and the students are asked simply to explain the modeling technique they would use to attack each problem. The questions span most areas covered in both degree programs: deterministic and stochastic operations research, probability, statistics, and simulation. The exam is given in a low-pressure environment at the end of the senior design experience (SIE 442) and is designed to be completed in 30 minutes. Note that the students are not asked to solve the problems. We believe that if they can determine the appropriate modeling technique, then they can review the material necessary for model implementation and solution. Therefore, the exam measures recognition and classification of problems; the ability to implement the model is measured in the individual courses (or during a pre-requisite evaluation). Grading is done by 2 faculty members to help ensure consistent scoring standards (a small consistency-check sketch follows this list), and scores range from 0 to 10, with 1 point for each problem. The modeling exam is contained in Figure 1D-9. The exam is given to ALL graduating SIE seniors as part of the capstone design experience.
- Senior Exit Exam on Design Process: Another primary goal of the programs is the ability to design entire systems using the engineering design process. To measure achievement in this area, we constructed an exam with 3 open-ended design problems. The students are asked to describe what they would do to design a system to solve each problem (we do not effectively measure the translation from "need" to "problem," since the exam questions are stated as problems). The exam is designed for 30 minutes and is administered at the end of the senior design class. Exam problems come from a variety of application areas in both industrial and systems engineering. Each exam problem is graded by a single faculty member (generally the person who designed the question) and is worth 10 points (the maximum score on the exam is 30 points). The design exam is contained in Figure 1D-10. The exam is given to ALL graduating SIE seniors as part of the capstone design experience.
- Senior Design Client Survey: We have constructed a survey to measure project quality and student professionalism in the senior design experience. Each project is offered by a client external to the SIE Department. We ask the client to comment on the overall experience, the appropriateness of the solution to the need, and the general attitude of the team. These evaluations are then used to decide what material to include or amend in the next offering of the design course. The client survey is contained in Figure 1D-11 (PDF format). This survey is given to each capstone design project client every semester.
- Senior Design Evaluation of Presentations (Outside Review): We invite outside reviewers to view the final presentations for each of the senior design projects. The reviews address issues such as clarity of the presentation, appropriateness of the solution and problem to the stated need, professional attitude, and the ability to answer questions and stimulate discussion. Generally the outside reviewers are faculty from other departments in the COEM or SIE faculty not associated with the course. The presentation evaluation form is contained in Figure 1D-12 (PDF format). This is done as the availability of reviewers permits; we would like to do it for every capstone design project team.
- Pre-Requisite Evaluation Implementation: Since it is critical that our students retain knowledge from courses early in the curriculum, we have implemented the COEM pre-requisite guidelines in many of our courses. This evaluation represents a measure of the quality of our "work in process." SIE 305 (Probability and Statistics), SIE 340 (Deterministic Operations Research), SIE 350 (Linear Systems Theory), and basic mathematics (calculus material) are particularly critical as prerequisites to many of our courses, so knowledge in these areas is evaluated in courses such as SIE 321 (Stochastic Operations Research, which needs SIE 305), SIE 440 (Optimization, which needs SIE 340), SIE 462 (Production Control, which needs SIE 340), SIE 330 (Engineering Statistics, which needs SIE 305 and basic math), SIE 453 (Control Systems, which needs SIE 350), and SIE 270 (Numerical Methods, which needs basic mathematics). These courses are spaced throughout our degree programs and hence represent numerous opportunities to measure our educational process while there is still time to take corrective action. This is done every semester as appropriate for the particular course.
- Final Exam Evaluation: In courses with high content in Criterion 3A (application of math, science, and engineering to solve problems), namely SIE 265, SIE 270, SIE 305, SIE 340, SIE 321, SIE 350, SIE 453, and SIE 462, we use final exam results for student evaluation. The evaluation consists of filtering out the questions that relate to the criterion, measuring student scores on those questions, and then giving an overall evaluation of student performance (a small sketch of this filtering step follows this list). Poor performance by the students requires an explanation and an approach for improvement. The extent of the process depends heavily on the individual faculty member and his or her ability to calibrate exams to recognize and deal with poor performance. Final exams will be available for the reviewers at the site visit. This is done every semester as appropriate for the particular course.
- Project Report Evaluation: In courses that have significant design content and are high in Criterion 3C (design a system to meet a stated need), namely SIE 370, SIE 431, SIE 442, SIE 463, and SIE 474, final project reports and lab reports are used to evaluate student quality. Presently, each faculty member evaluates his or her own course. However, we are trying to design a system in which a group of faculty evaluates a set of courses and provides a more objective and comprehensive evaluation.
- Portfolio Evaluation Implementation: We have designed a set of material to be included in a student's portfolio. This material is listed in Figure 1D-13 (PDF format) and represents a specification of the COEM guidelines for the SIE Department. The portfolios represent a longitudinal study of a student's progress and hence must be evaluated periodically. Our current plan is to select a group of students to participate in the portfolio program and then evaluate their portfolios yearly. We have not yet developed evaluation guidelines, but we are considering the methods used at Rose-Hulman (a checklist system measuring coverage) and the more detailed evaluation approach used at the Colorado School of Mines. We will have portfolio material from students at various levels of their careers available for the reviewers at the site visit.
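The final exam evaluation described above (filter the questions that relate to a criterion, measure student scores on those questions, and judge overall performance) can be sketched as follows. The question tags, point values, student scores, and the 70% threshold are hypothetical assumptions; the actual evaluation is carried out by the individual faculty member.

```python
# Illustrative sketch of the final-exam filtering step; all data are hypothetical.
exam_questions = [
    {"id": 1, "criteria": {"3a"}, "points": 20},
    {"id": 2, "criteria": {"3e"}, "points": 15},
    {"id": 3, "criteria": {"3a", "3k"}, "points": 25},
]
# scores[student][question id] = points earned on that question
scores = {
    "student_01": {1: 18, 2: 10, 3: 20},
    "student_02": {1: 12, 2: 14, 3: 15},
}

def criterion_performance(criterion):
    """Average fraction of available points earned on questions tagged with `criterion`."""
    relevant = [q for q in exam_questions if criterion in q["criteria"]]
    possible = sum(q["points"] for q in relevant)
    fractions = [
        sum(earned[q["id"]] for q in relevant) / possible
        for earned in scores.values()
    ]
    return sum(fractions) / len(fractions)

performance = criterion_performance("3a")
print(f"Class performance on Criterion 3A questions: {performance:.0%}")
if performance < 0.70:  # assumed threshold for "poor performance"
    print("Document an explanation and an approach for improvement.")
```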
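As a smaller example, the two-grader scoring of the modeling exam could be checked for consistency along the lines below. The per-problem scores and the averaging rule are assumptions for illustration; the description above states only that two faculty members grade the exam to keep scoring standards consistent.

```python
# Illustrative consistency check for two-grader modeling exam scoring
# (10 problems, 1 point each); scores and the reconciliation rule are assumptions.
grader_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]  # grader A's per-problem scores for one student
grader_b = [1, 0, 0, 1, 1, 0, 1, 1, 1, 1]  # grader B's scores for the same student

# Problems where the two graders disagree are candidates for discussion
# before a final score is recorded.
disagreements = [i + 1 for i, (a, b) in enumerate(zip(grader_a, grader_b)) if a != b]
final_score = (sum(grader_a) + sum(grader_b)) / 2  # one simple reconciliation: average the totals
print(f"Score: {final_score}/10; problems to reconcile: {disagreements}")
```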
SIE ABET Information Site
The University of Arizona
October 30, 1998
Systems and Industrial Engineering
All contents copyright © 1998. All rights reserved.