Introduction to Engineering
I. Anonymous Journals & Course Evaluations (by students)
A. Purpose: to evaluate the course on a bi-weekly and quarterly basis
B. Process of Implementation (Site Administrator)
1. Create questions designed to evaluate whether the course has met the objectives set by the faculty.
2. Create the survey using Course Sorcerer, an OSU-grown web-based tool for course management.
3. Compile and review responses from the journal entries and evaluations to improve the course(s)
a) Short Term: discuss summaries at weekly team meetings to address current questions and concerns from students; address these concerns in class.
b) Long Term: improve future course materials, content, and teaching styles; evaluate course based on ABET criteria.
C. Lessons Learned
1. Students are more apt to participate in the evaluations and journals when they are given credit for completing them.
2. Students are extremely responsive when concerned about an issue, often writing a full paragraph in response to a single question.
3. Students will respond when encouraged to come up with solutions to their complaints about the course.
II. Anonymous Journals by Teaching Assistants (TAs)
A. Purposes: use the TAs’ close personal experience with the students to:
1. Create a TA lab training manual based upon their experiences.
2. Improve the lab training process.
3. Modify the TAs’ roles to improve student experiences.
B. Implementation Process (Site Administrator)
1. Create types of questions designed to highlight the experiences that a TA has with particular labs.
a) Questions specifically targeting possible problem areas in specific labs
b) General questions regarding student attitudes towards labs and the TA’s general experience
c) Questions in regard to the actual logistics of the course and the TA’s ability to manage a full schedule
2. Create the survey on-line using Course Sorcerer, an OSU-grown web-based tool for course management.
3. Use the compiled data from the journal entries and evaluations to improve the course(s) on a long-term basis.
III. Purdue Visualization Test (Pre & Post)
A. Purpose: to measure any change (increase or decrease) in students’ abilities to visually comprehend an object.
B. Implementation Process (Administrator, Faculty, Testing Services)
1. Students are given twenty minutes in-class time to complete this test.
2. Completed tests are then taken to the testing center, and results are returned electronically in report format within 24 hours.
3. Reports are compiled and given to instructors to use at their discretion.
4. After students have taken the post-test at the end of the quarter, the results are matched with the pre-test results, and the differences are calculated and compared.
5. Results from these tests are used to determine if the course is meeting its goal of teaching the fundamental concepts of engineering graphics.
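The pre/post comparison in steps 4 and 5 can be sketched as a small script. This is an illustration only, assuming scores keyed by a student ID; the field names and sample scores are invented, not taken from the actual reports.

```python
# Hypothetical sketch: pairing Purdue Visualization Test pre- and post-test
# scores per student and computing the change for course-level review.
# Student IDs and score values below are invented for illustration.

def score_changes(pre, post):
    """Return {student_id: post - pre} for students who took both tests."""
    return {sid: post[sid] - pre[sid] for sid in pre if sid in post}

def summarize(changes):
    """Average change across all students with both scores."""
    return sum(changes.values()) / len(changes) if changes else 0.0

pre = {"s1": 18, "s2": 22, "s3": 15}
post = {"s1": 24, "s2": 21, "s4": 19}   # s3 missed the post-test, s4 the pre-test

changes = score_changes(pre, post)       # {"s1": 6, "s2": -1}
print(summarize(changes))                # 2.5
```

Students missing either test are simply excluded from the difference calculation, which matches the idea of comparing only paired pre/post results.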
IV. Learning Styles Inventory
A. Index of Learning Styles Questionnaire, developed by Barbara A. Soloman and Richard M. Felder (Department of Chemical Engineering, North Carolina State University)
B. Purposes:
1. Allow students to discover where their strengths and weaknesses are in relation to how they internalize information.
2. Allow professors to get an accurate idea of the best ways to instruct their class, or what teaching styles to utilize more of the time.
C. Implementation Process (TAs, Administrator)
1. Students complete inventory on-line in class and get immediate results as to where they fall on the scale.
2. TAs collect these results, which are compiled into a large database to calculate and visualize the tendencies per section and for the entire class.
3. Faculty are then provided with the compiled information, to be used at their discretion.
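The compilation in step 2 amounts to averaging each learning-style dimension per section. The sketch below assumes one record per student as a `(section, scores)` pair; the four dimensions are those of the actual ILS instrument, but the record layout, section names, and scores are assumptions made for illustration.

```python
# Hypothetical sketch of step 2: compiling individual ILS results into
# per-section tendencies. Record layout and sample values are invented.
from collections import defaultdict

DIMENSIONS = ("active/reflective", "sensing/intuitive",
              "visual/verbal", "sequential/global")

def section_tendencies(records):
    """records: iterable of (section, {dimension: score}) tuples.
    Returns {section: {dimension: mean score}}."""
    by_section = defaultdict(lambda: defaultdict(list))
    for section, scores in records:
        for dim in DIMENSIONS:
            by_section[section][dim].append(scores[dim])
    return {sec: {dim: sum(v) / len(v) for dim, v in dims.items()}
            for sec, dims in by_section.items()}

records = [
    ("A", {"active/reflective": 5, "sensing/intuitive": 3,
           "visual/verbal": 9, "sequential/global": -1}),
    ("A", {"active/reflective": -3, "sensing/intuitive": 7,
           "visual/verbal": 5, "sequential/global": 1}),
]
print(section_tendencies(records)["A"]["visual/verbal"])  # 7.0
```

The same aggregation run over all sections together gives the whole-class tendencies mentioned in step 2.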
D. Lessons Learned
Many professors agree that the compiled section-level information should not be revealed to the students, since students who do not fit the pattern tend to feel as though they do not belong in Engineering.
V. Pittsburgh Freshman Attitudes Survey
A. The Pittsburgh Freshman Attitudes Survey is part of a research effort headed by the following researchers at the University of Pittsburgh:
1. Mary Besterfield-Sacre: email@example.com
2. Ray Hoare: firstname.lastname@example.org
3. Rob Shield: email@example.com
B. Our Purpose: to determine if any link exists between the attitudes of freshmen, our teaching approaches, and retention in Engineering.
C. Implementation Process (Administrator, Pitt Staff)
1. The Administrator at OSU compiles a list of students’ email accounts, together with the text of the pre-survey, reminder, and follow-up emails, and sends these to the Pitt Staff.
2. The Pitt Staff sends email to all of the students containing a username, password, and URL at which to take the survey. The email and website appear to be OSU based so as to gain the trust of the students.
3. The Pitt Staff compiles all data from the pre-survey and again for the post-survey at the end of the course sequence.
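The credential handoff in steps 1 and 2 could be sketched as follows. Everything here is hypothetical: the survey URL is a placeholder, and the username scheme and password generation are invented stand-ins for whatever the Pitt Staff actually used.

```python
# A minimal sketch, assuming a per-student invitation record: each email
# address is paired with a generated username, password, and survey URL.
# The URL and credential scheme are placeholders, not the real ones.
import secrets

SURVEY_URL = "https://survey.example.edu/attitudes"  # placeholder URL

def build_invitations(emails):
    """Return one invitation record per student email address."""
    invitations = []
    for i, email in enumerate(emails, start=1):
        invitations.append({
            "email": email,
            "username": f"osu{i:04d}",
            "password": secrets.token_urlsafe(8),
            "url": SURVEY_URL,
        })
    return invitations

invites = build_invitations(["a@osu.edu", "b@osu.edu"])
print(invites[0]["username"])  # osu0001
```

One record per student keeps the survey anonymous to instructors while still letting the Pitt Staff match pre- and post-survey responses by username.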
VI. Monitoring Grades of Current Students
A. Purposes:
1. Short term: identify students who are not doing well and give them the opportunity for help.
2. Long term: attempt to establish a link between patterns of behavior and performance in the first Engineering classes to retention statistics.
B. Implementation Process (Site Administrator, TAs, Faculty)
1. Use the grade book on WebCT, a commercial on-line program designed for course management, to monitor grades as they are entered.
2. Raise TA awareness of students who appear to be falling behind.
3. Check up on any student who receives less than a 60% in either 181 or 182 to see if they are continuing to remain in the Engineering program or continuing to take Engineering-related classes.
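The monitoring in steps 1-3 is essentially a threshold scan over gradebook data. The sketch below assumes grades exported as fractions per course; the export format, student names, and scores are assumptions for illustration (WebCT's own data layout is not described here).

```python
# Hypothetical sketch of steps 1-3: scanning exported gradebook entries
# for students below the 60% follow-up threshold named above.
# The data layout and sample values are invented.

THRESHOLD = 0.60

def flag_students(grades):
    """grades: {student: {course: fraction earned}}.
    Returns students below THRESHOLD in any listed course."""
    return sorted(
        student for student, courses in grades.items()
        if any(score < THRESHOLD for score in courses.values())
    )

grades = {
    "alice": {"181": 0.72, "182": 0.55},
    "bob":   {"181": 0.88},
    "carol": {"181": 0.41},
}
print(flag_students(grades))  # ['alice', 'carol']
```

The flagged list is what would be passed to the TAs (step 2) and used for the retention check-ups (step 3).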
VII. Longitudinal Tracking Database
A. Original Purpose: to monitor the pilot and control groups involved in the Freshman Engineering Honors (FEH) program and Introduction to Engineering (IE) program.
B. Current Purpose: track retention, quarter-to-major, and major trends over the past 10 years for FEH and the past three for IE; provide data for annual reports and publications.
C. Implementation Process (Administrator)
1. Each year, a new database must be created for incoming freshmen to the IE and FEH programs. Originally, control groups were established as well; now that both programs are out of the pilot stages, all freshmen entering the programs are included.
2. Fields in the database include, but are not limited to, math placement, grades for particular classes, cumulative grade point averages, majors, and retention status.
3. Each quarter, the database is updated to include the cumulative grade point averages and retention status; grades are often added in the summer when there is more time.
4. Reports are generated for an annual report and on an as-needed basis for journal articles, presentations, etc…
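The quarterly update in steps 2 and 3 can be sketched with a simple dict-of-dicts stand-in for the tracking database. The field names mirror those listed in step 2; the student IDs, quarter label, and values are invented for illustration.

```python
# A sketch of steps 2-3, assuming a dict-of-dicts stand-in for the
# longitudinal tracking database. All IDs and values are invented.

def quarterly_update(db, quarter, gpas, retained):
    """Append this quarter's cumulative GPA and retention status per student."""
    for sid, record in db.items():
        record.setdefault("gpa_history", {})[quarter] = gpas.get(sid)
        record["retained"] = retained.get(sid, False)
    return db

db = {
    "1001": {"program": "FEH", "math_placement": "Calc I",  "major": "ME"},
    "1002": {"program": "IE",  "math_placement": "Calc II", "major": "EE"},
}
quarterly_update(db, "AU2003",
                 {"1001": 3.4, "1002": 2.9},
                 {"1001": True, "1002": True})
print(db["1001"]["gpa_history"]["AU2003"])  # 3.4
```

Keeping a per-quarter GPA history rather than overwriting a single field is what makes the longitudinal trend reports in step 4 possible.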