CERI, Professional Practice, University of Cincinnati

Preliminary Evaluation Plan

Evaluation Plan for the University of Cincinnati Professional Practice Division’s ‘Developing a Corporate Feedback System for Use in Curricular Reform’ Project

(P116B040276)

I. Background

Over the years, Professional Practice (co-op) faculty advisors have contributed to college curriculum development through anecdotal evidence of the curriculum’s impact on student employability. The examples below illustrate the level of feedback available today; they represent, however, only the tip of the iceberg of the feedback that this project can capture:

  • In the architecture and interior design co-op program, co-op advisors were unable to secure positions with technical experiences for students in their first co-op quarter because the students had not yet mastered AutoCAD. The school responded to this information by moving AutoCAD earlier in the curriculum. These students no longer spend their first co-op work term gathering samples and doing research, but are involved in actual projects as functioning team members. This clear connection between course content and the types of positions that students are able to secure illustrates the importance of monitoring the curriculum to ensure that co-op positions offer meaningful work content.

  • In the materials science engineering curriculum, the course Introduction to Metallurgy was historically part of the sophomore curriculum, and co-op employers came to rely upon the understanding that students would gain from that course. When the department moved the course to the third year, co-op employers complained that they were no longer able to use these co-op students in their positions. In fact, one large co-op employer would not allow students to return to the company for a second quarter, despite good work performance, simply because they had not taken Introduction to Metallurgy. After learning of this situation through the co-op advisor, the department returned the course to its original place in the curriculum. This illustrates how rapidly changes in curriculum are felt and acted upon by our industrial partners in the form of co-op hiring and termination patterns.

  • In the mechanical engineering technology program, employer feedback indicated that approximately 20% of employers requested that students be trained in traditional drafting techniques. A simple analysis of this data suggested a curriculum change might be warranted. A closer inspection revealed that this employer group represented companies that were behind the times, using antiquated design technology. In this case the department determined that it was best to look for new co-op employers who were using the latest technology. This illustrates the importance of filtering data and not allowing statistics to substitute for critical reflection by knowledgeable experts.

  • In the late 1980s the University of Cincinnati created a two-year degree program in automated software technology, due in large part to the influence of one of UC’s largest, long-term co-op employers. This company was very interested in creating a program that would produce associate-level graduates with an understanding of robotics, and it was important that the program have a mandatory co-op component. The program was terminated after six years: by the time the curriculum was developed, the needs of industry had changed and there was no longer a market for hiring co-op students or program graduates. This was a significant lesson for the University of Cincinnati in the importance of listening to employer needs without allowing employers to determine the degrees it offers or the curriculum content those degrees should contain.

The above examples show the potential of cooperative education as a methodology for monitoring corporate feedback and making adjustments while students are still enrolled in the program. The downside of relying on anecdotal evidence is that it makes the adaptation reactive rather than proactive. The objective of this proposal is to develop a feedback system that identifies discrepancies between curricular offerings and stakeholder needs before they produce obvious negative impacts at the operational level. The proposed system not only analyzes the information gathered through comprehensive employer evaluations, but combines it with anecdotal information and focus groups to properly filter the information. The recurring, statistically valid feedback process that this project develops will be an invaluable asset for any institution that wants to develop its operation in harmony with both academic criteria and employer demand.

II. Project’s Purpose

The three-year project focuses on: a) identifying curricular activities exhibiting a strong correlation with student co-op work performance; b) designing and implementing processes allowing the systematic use of employer assessment in curriculum design; c) evaluating the impact of changes in curricular design upon student work performance; d) piloting and contrasting projects in different academic fields and at different colleges; and e) developing a set of best practices to be used for further refinement and dissemination of the process. Initial collaborators include the following University of Cincinnati (UC) academic units: the Department of Architecture (College of Design, Architecture, Art, and Planning); the Department of Civil and Environmental Engineering (College of Engineering); the Department of Civil and Construction Management (College of Applied Science); the College of Business Administration; and the Division of Professional Practice. The assessment data will be analyzed by the UC Evaluation Services Center. The Evaluation and Assessment Center for Mathematics and Science Education at Miami University is the external evaluator for the project. Schools accredited by, or subscribing to the attributes of, the Accreditation Council for Cooperative Education will act as a reference group, ensuring a transferable end process.

The ultimate objective of the project is to move schools engaged in cooperative education into a new era of market alignment. The aim is to build feedback structures that keep schools abreast of a rapidly changing environment, with processes designed to support an efficient allocation of educational resources. The inclusion of a wide array of programs and a large, diverse reference group supports building a process that can be effectively utilized by schools engaged in cooperative education across a diverse set of academic fields and educational levels. The goal is a process that can be successfully implemented in a community college as well as in a four- or five-year institution.

This evaluation will assess the project’s efficacy in developing methodologies that use assessment data on student work-term performance in curricular development, thereby continuously aligning experiential and cooperative education curricula with industrial needs.

III. Audience

Cooperative education departments within the University of Cincinnati, as well as universities nationwide with co-op programs, will benefit from data produced by this project. Successes and failures will be charted, and information concerning the program will be disseminated within the university system as well as nationally.

IV. Evaluation Questions

The University of Cincinnati Professional Practice Division’s project will attempt to demonstrate that cooperative education students in participating academic departments perform better according to industry standards than co-op students in non-participating academic departments.

The evaluation plan encompasses the University of Cincinnati Professional Practice Division’s three goals and leads to formative and summative evaluations (see Evaluation Plan). Miami University’s Evaluation & Assessment Center for Mathematics and Science Education will provide semiannual reports to the Principal Investigator. Data collected in Year 3 will provide summative information.
Formative data and analyses will address the following questions:

  1. How effective are Assessment Instruments I and II in assessing students’ performance in industry?

  2. How did participating programs implement changes based upon the Circular Upgrade Goals?

In addition to Year 3 findings for the above questions, summative data and analyses will address the following question:

How has the program been institutionalized at other departments, colleges, and universities?

V. Evaluation Approach and Data Collection Measures

The evaluation of the University of Cincinnati’s Circular Curriculum project will use a mixed-methods approach. Pre/post data analysis using Assessment Instruments I and II will provide a rich quantitative data source that can be used to track changes in student and employer outcomes for participating and comparison groups. The evaluation will also include qualitative measures, such as analysis of faculty focus group meetings and expert panel reviews of redesigned departmental syllabi. Expert panel reviewers will be chosen from nationally ranked universities.
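As a rough illustration of the pre/post tracking described above, the sketch below compares employer ratings from cohorts before and after a curricular change using Welch’s t-test. The file name, cohort labels, and column names are hypothetical placeholders, not the project’s actual instruments.

    # Minimal pre/post cohort comparison on one employer rating item.
    import pandas as pd
    from scipy import stats

    ratings = pd.read_csv("employer_ratings.csv")  # hypothetical export
    pre = ratings.loc[ratings["cohort"] == "pre", "overall_rating"]
    post = ratings.loc[ratings["cohort"] == "post", "overall_rating"]

    # Welch's t-test does not assume equal variances across cohorts.
    t_stat, p_value = stats.ttest_ind(post, pre, equal_var=False)
    print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")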

VI. Data Analysis

Quantitative data analysis will include univariate, bivariate, and multivariate statistical techniques. Sophisticated analyses such as multivariate regression and Item Response Theory (IRT) may be employed, as appropriate. Multivariate regression will control for extraneous factors that could influence outcomes and confound results; IRT will be used to analyze items within Assessment Instruments I and II. Reviews of interviews will provide qualitative data for this evaluation.
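To make the regression step concrete, the sketch below regresses an employer-rated performance score on a program-participation indicator while controlling for plausible confounders. All file and column names ("performance", "participating", "gpa", "work_term") are hypothetical; the actual covariates would come from the project’s data.

    # Illustrative multivariate regression, not the project's actual pipeline.
    import pandas as pd
    import statsmodels.formula.api as smf

    scores = pd.read_csv("assessment_data.csv")  # hypothetical export

    # C(work_term) treats the work term as a categorical factor rather than
    # a linear trend; the coefficient on `participating` is the quantity of
    # interest once GPA and work term are held constant.
    model = smf.ols(
        "performance ~ participating + gpa + C(work_term)", data=scores
    ).fit()
    print(model.summary())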

VII. Evaluation Plan

Goal 1: Develop baseline data, and develop and implement Assessment Instrument II based upon prior findings.

Activity 1: Establish a performance baseline using relevant statistical methods.
  • Evidence: Accumulated assessment data is collected and analyzed using Assessment Instrument I during Winter, Spring, and Summer 2004.
  • Benchmark: In Year 1, 100% of collected data covering 4 majors is analyzed.
  • Measures: Track number, demographics, and findings of Assessment Instrument I.
  • Data: Collect and analyze data gathered from Assessment Instrument I.

Activity 2: Design Assessment Instrument II, targeting detailed evaluation of curricular strengths and weaknesses.
  • Evidence: Assessment Instrument II, targeting detailed evaluation of curricular strengths and weaknesses, is created.
  • Benchmark: In Year 1, Assessment Instrument II is created and piloted.
  • Measures: a) validation of Assessment Instrument II; b) reliability check of Assessment Instrument II.
  • Data: a) expert review of Assessment Instrument II by supervisors and Professional Practice faculty, plus focus group interviews with faculty, employers, and students; b) collection and analysis of data from the piloted Assessment Instrument II questionnaire to check for reliability.

Activity 3: Assess student work-term performance using Instruments I and II.
  • Evidence: Items in Assessment Instruments I and II are highly correlated.
  • Benchmark: High correlations between items on both questionnaires (see the correlation sketch following this goal’s activities).
  • Measures: Correlations between items on each questionnaire.
  • Data: Collect and analyze aggregated data from both Assessment Instrument I and Assessment Instrument II.

Activity 4: Develop a redesigned Assessment Instrument II based upon feedback.
  • Evidence: The redesigned Assessment Instrument II reflects changes suggested by the expert panel, employers, and students.
  • Measures: Track development of the redesigned Assessment Instrument II questionnaire.
  • Data: Collect and review the questionnaire.

Activity 5: Use the newly created and validated Assessment Instrument II.
  • Evidence: Program students and employers use Assessment Instrument II as their student assessment tool.
  • Benchmark: 100% of participants use the questionnaire.
  • Measures: Track number, demographics, and findings of Assessment Instrument II.
  • Data: Collect and analyze data gathered from Assessment Instrument II.
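As a hypothetical illustration of the Activity 3 benchmark, the sketch below computes item-level correlations between the two instruments on matched student records. It assumes corresponding items share column names across the two files; the file and column names are placeholders.

    # Item-level correlation check between Assessment Instruments I and II.
    import pandas as pd

    inst1 = pd.read_csv("instrument_i.csv", index_col="student_id")   # hypothetical
    inst2 = pd.read_csv("instrument_ii.csv", index_col="student_id")  # hypothetical

    # Pearson correlation of each shared item, matched on student_id; low
    # values flag items the benchmark expects to be highly correlated.
    item_correlations = inst1.corrwith(inst2)
    print(item_correlations.sort_values())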

Goal 2: Develop and implement agreed-upon Circular Upgrade Goals for individual teams.

Activity 1: Develop and modify Circular Upgrade Goals.
  • Evidence: Each team reaches agreement on goals for the specific term.
  • Measures: Verify team goals.
  • Data: Compare annual faculty focus group interviews; review team meeting documents and the final report.

Activity 2: Develop syllabi for specific classes.
  • Evidence: Syllabi for specific classes reflect the agreed-upon goals for each team.
  • Measures: Verify content of syllabi.
  • Data: Review of course syllabi by the outside expert panel.

Goal 3: Institutionalize the Feedback Cycle process in other departments, colleges, and universities.

Activity 1: Assess quality of the Feedback Cycle.
  • Evidence: Students, faculty, and employers give positive ratings of the program.
  • Measures: Student, faculty, and employer program evaluation questionnaires and random interviews.
  • Data: Collect and analyze data from the evaluation questionnaires and random interviews.

Activity 2: Disseminate through publications, meetings, and conferences.
  • Evidence: a) program model information is presented to a wide audience; b) the program is continued beyond STEP funding; c) the program is continued beyond FIPSE funding; d) the University of Cincinnati increases its level of support.
  • Measures: a) track presentations; b–d) continued monitoring of program components and institutional support.
  • Data: a) collect records of presentations and publications from Professional Practice personnel; b–d) collect and review long-range program planning from Professional Practice personnel.

VIII. Evaluation Constraints

Anticipated constraints to the evaluation include problems related to low employer response rates. The project’s Principal Investigator reported that employer response rates for paper-and-pencil questionnaires were as high as 70–75%; however, in 2003 the Professional Practice Division moved from a paper questionnaire to an online questionnaire, and response rates declined to 40%. The evaluators have emphasized to the PI the importance of returning response rates to the 70–75% range. A second anticipated constraint may be in obtaining information from faculty, who may view this project as obtrusive. Finally, two of the five participating academic departments (civil engineering and architecture) have already undergone recent changes to improve their curricula; therefore, statistically significant changes might not be detected during the three-year project.
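To illustrate why the response-rate constraint matters, the sketch below estimates how the minimum detectable effect size grows as fewer questionnaires are returned. The number of surveys sent is a hypothetical placeholder, not a project figure.

    # Rough power sketch: smaller usable samples raise the minimum
    # detectable effect (Cohen's d) for a two-group comparison at
    # alpha = 0.05 and 80% power.
    from statsmodels.stats.power import TTestIndPower

    power_analysis = TTestIndPower()
    for response_rate in (0.75, 0.40):
        n = int(200 * response_rate)  # hypothetical: 200 surveys sent per group
        mde = power_analysis.solve_power(nobs1=n, alpha=0.05, power=0.8, ratio=1.0)
        print(f"{response_rate:.0%} response -> n = {n}, detectable d >= {mde:.2f}")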

IX. Budget

The budget for this project is approximately $35,000 a year, covering both an internal and an external evaluation (for a complete explanation, see the formal proposal).

X. Report Activities and Meetings

Internal and external evaluators will meet with the project team members monthly. Annual reports will be prepared by the external evaluator for the Principal Investigator, project member teams, and FIPSE project director. A final evaluation report will be disseminated two months after the completion of the project.