Evaluation Plan For The University Of Cincinnati Professional Practice Division’s ‘Developing A Corporate Feedback System For Use In Curricular Reform’ Project
Throughout the years, Professional Practice (co-op) faculty advisors have contributed to college curriculum development through anecdotal evidence of the curriculum's impact on student employability. The list below illustrates the level of feedback available today; however, this is only the tip of the iceberg of the feedback that can be gained through this project:
The above examples show the potential of cooperative education as a methodology for monitoring corporate feedback and making adjustments while students are still enrolled in the program. The downside of relying on anecdotal evidence is that it makes adaptation reactive rather than proactive. The objective of this proposal is to develop a feedback system that identifies discrepancies between curricular offerings and stakeholder needs before they produce obvious negative impact at an operational level. The proposed system not only analyzes the information gathered through comprehensive employer evaluations but also combines it with anecdotal information and focus groups to properly filter the information. The recurring, statistically valid feedback process that this project concentrates on developing will be an invaluable asset for any institution that wants to develop its operation in harmony with both academic criteria and employer demand.
The three-year project focuses on: a) identifying curricular activities exhibiting a strong correlation with student co-op work performance; b) designing and implementing processes allowing the systematic use of employer assessment in curriculum design; c) evaluating the impact of changes in curricular design upon student work performance; d) piloting and contrasting the process in different academic fields and at different colleges; and e) developing a set of best practices for further refinement and dissemination of the process. Initial collaborators include the following University of Cincinnati (UC) academic units: the Department of Architecture (College of Design, Architecture, Art, and Planning); the Department of Civil and Environmental Engineering (College of Engineering); the Department of Civil and Construction Management (College of Applied Science); the College of Business Administration; and the Division of Professional Practice. The assessment data will be analyzed by the UC Evaluation Services Center. The Evaluation and Assessment Center for Mathematics and Science Education at Miami University is the external evaluator for the project. Schools accredited by, or subscribing to the attributes of, the Accreditation Council for Cooperative Education will act as a reference group, ensuring a transferable end process.
The ultimate objective of the project is to move schools engaged in cooperative education into a new era of market alignment. The objective is to build feedback structures that keep schools abreast of a rapidly changing environment, supported by processes built for an efficient allocation of educational resources. The inclusion of a wide array of programs and a large, diverse reference group supports building a process that can be effectively utilized by schools engaged in cooperative education across a diverse set of academic fields and educational levels. The aim is a process that can be successfully implemented in both community college and four- or five-year institution settings.
This evaluation will assess the efficacy of the project in developing methodologies that use assessment data on student work term performance in curricular development, thereby continuously aligning experiential or cooperative education based curricula with industrial needs.
Cooperative education departments within the University of Cincinnati, as well as universities nationwide with co-op programs, will benefit from data produced by this project. Successes and failures will be charted, and information concerning the program will be disseminated within the university system as well as nationally.
The University of Cincinnati Professional Practice Division’s project will attempt to demonstrate that co-operative education students in participating academic departments perform better according to industry standards when compared to co-op education students not in participating academic departments.
The evaluation plan encompasses the University of Cincinnati Professional Practice Division's three goals and leads to formative and summative evaluations (see Evaluation Plan). Miami University's Evaluation & Assessment Center for Mathematics and Science Education will provide semiannual reports to the Principal Investigator. Data collected in Year 3 will provide summative information.
Formative data and analyses will address the following questions:
In addition to year 3 findings for the above questions, summative data and analyses will address the following question:
How has the program been institutionalized at other departments, colleges, and universities?
The evaluation of the University of Cincinnati's Circular Curriculum project will use a mixed-methods approach. Pre/post data analysis using Assessment Instruments I and II will provide a rich quantitative data source for tracking changes in student and employer outcomes for participating and comparison groups. The evaluation will also include qualitative measures, such as analysis of faculty focus group meetings and expert panel reviews of redesigned departmental syllabi. Expert panel reviewers will be chosen from nationally ranked universities.
Quantitative data analysis will include univariate, bivariate, and multivariate statistical techniques. Sophisticated statistical analyses such as multivariate regression and Item Response Theory (IRT) may be employed, as appropriate. Multivariate regression analysis will control for extraneous factors that may influence outcomes and confound results, and Item Response Theory will be used to analyze items within Assessment Instruments I and II. Reviews of interviews will provide qualitative data for this evaluation.
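The planned quantitative analyses could be sketched as follows. Everything in this example is an illustrative assumption, not project data: the sample size, variable names (`gpa`, `work_term`, `participating`), and effect sizes are hypothetical stand-ins for the employer-evaluation data the project would actually collect.

```python
# Hypothetical sketch: multivariate regression controlling for extraneous
# factors, plus an inter-instrument item correlation check. All data below
# are simulated; real analyses would use employer assessment records.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of co-op work-term evaluations

# Extraneous factors to control for (assumed covariates).
gpa = rng.normal(3.0, 0.4, n)
work_term = rng.integers(1, 4, n).astype(float)
participating = rng.integers(0, 2, n).astype(float)  # 1 = redesigned curriculum

# Simulated employer rating with a small hypothetical participation effect.
rating = (2.0 + 0.5 * gpa + 0.2 * work_term
          + 0.3 * participating + rng.normal(0, 0.3, n))

# Multivariate regression: rating ~ participation + controls (OLS via lstsq).
X = np.column_stack([np.ones(n), participating, gpa, work_term])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)
print(f"estimated participation effect: {coef[1]:.2f}")

# Correlation between parallel items on Instruments I and II (Activity 3
# benchmark): two noisy measurements of the same underlying performance.
item_I = rating + rng.normal(0, 0.2, n)
item_II = rating + rng.normal(0, 0.2, n)
r = np.corrcoef(item_I, item_II)[0, 1]
print(f"inter-instrument item correlation: {r:.2f}")
```

The regression isolates the participation effect from covariates such as prior GPA, which is the "control for extraneous factors" role described above; the correlation computation mirrors the Activity 3 benchmark of high correlations between items on both questionnaires.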
Goal 1: Develop baseline data, and develop and implement Assessment Instrument II based upon prior findings.
| Activities | Evidence & Benchmarks | Measures | Data |
| --- | --- | --- | --- |
| Activity 1: Establish performance baseline using relevant statistical methods. | Evidence: Accumulated assessment data are collected and analyzed using Assessment Instrument I during Winter, Spring, and Summer 2004. Benchmark: In Year 1, 100% of collected data covering 4 majors are analyzed. | Track the number, demographics, and findings of Assessment Instrument I. | Collect and analyze data gathered from Assessment Instrument I. |
| Activity 2: Design Assessment Instrument II targeting detailed evaluation of curricular strengths and weaknesses. | Evidence: Assessment Instrument II targeting detailed evaluation of curricular strengths and weaknesses is created. Benchmark: In Year 1, Assessment Instrument II is created and piloted. | a. Validation of Assessment Instrument II. b. Reliability check of Assessment Instrument II. | a. Expert review of Assessment Instrument II by supervisors and Professional Practice faculty, plus focus group interviews with faculty, employers, and students. b. Collect and analyze data from the piloted Assessment Instrument II questionnaire to check reliability. |
| Activity 3: Assess student work term performance using Instruments I and II. | Evidence: Items in Assessment Instruments I and II are highly correlated. Benchmark: High correlations between items on both questionnaires. | Correlations between items on each questionnaire. | Collect and analyze aggregated data from both Assessment Instrument I and Assessment Instrument II. |
| Activity 4: Develop a redesigned Assessment Instrument II based upon feedback. | Evidence: The redesigned Assessment Instrument II reflects changes suggested by the expert panel, employers, and students. | Track development of the redesigned Assessment Instrument II questionnaire. | Collect and review the questionnaire. |
| Activity 5: Use the newly created and validated Assessment Instrument II. | Evidence: Program students and employers use Assessment Instrument II as their student assessment tool. Benchmark: 100% of participants use the questionnaire. | Track the number, demographics, and findings of Assessment Instrument II. | Collect and analyze data gathered from Assessment Instrument II. |
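The Activity 2b reliability check could be sketched with Cronbach's alpha, a standard internal-consistency statistic for questionnaires. The pilot size, item count, and response data below are simulated assumptions; a real check would run on the piloted Assessment Instrument II responses.

```python
# Hypothetical sketch of a reliability check for the piloted Assessment
# Instrument II: Cronbach's alpha over simulated item responses.
import numpy as np

rng = np.random.default_rng(1)
n_students, n_items = 120, 10  # hypothetical pilot size and item count

# Simulate correlated item responses driven by one latent trait, so the
# items plausibly measure the same underlying construct.
trait = rng.normal(0, 1, (n_students, 1))
items = trait + rng.normal(0, 0.7, (n_students, n_items))

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total))."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

alpha = cronbach_alpha(items)
print(f"Cronbach's alpha: {alpha:.2f}")  # values of 0.70+ are commonly read as acceptable
```

An alpha well below the conventional 0.70 threshold on the pilot data would flag items for the Activity 4 redesign before the instrument is rolled out to all participants.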
Goal 2: Develop and implement agreed-upon Circular Upgrade Goals for individual teams.
| Activities | Evidence & Benchmarks | Measures | Data |
| --- | --- | --- | --- |
| Activity 1: Develop and modify Circular Upgrade Goals. | Evidence: Each team reaches agreement on goals for the specific term. | Verify team goals. | Compare annual faculty focus group interviews, and review team meeting documents and the final report. |
| Activity 2: Develop syllabi for specific classes. | Evidence: Syllabi for specific classes reflect the agreed-upon goals for each team. | Verify content of syllabi. | Review of course syllabi by an outside expert panel. |
Goal 3: Institutionalize the Feedback Cycle process in other departments, colleges, and universities.
| Activities | Evidence & Benchmarks | Measures | Data |
| --- | --- | --- | --- |
| Activity 1: Assess quality of the Feedback Cycle. | Evidence: Students, faculty, and employers give positive ratings of the program. | Student, faculty, and employer program evaluation questionnaire and random interviews. | Collect and analyze data from the evaluation questionnaire and random interviews. |
| Activity 2: Dissemination through publications, meetings, and conferences. | 2a. Evidence: Program model information is presented to a wide audience. 2b. Evidence: Program is continued beyond STEP funding. 2c. Evidence: Program is continued beyond FIPSE funding. 2d. Evidence: The University of Cincinnati increases its level of support. | 2a. Track presentations. 2b-d. Continued monitoring of program components and institutional support. | 2a. Collect records of presentations and publications from Professional Practice personnel. 2b-d. Collect and review long-range program planning from Professional Practice personnel. |
Anticipated constraints to the evaluation include problems related to low employer response rates. The project's Principal Investigator reported that employer response rates for paper-and-pencil questionnaires were as high as 70-75%; however, in 2003, the Professional Practice Division moved from a paper questionnaire to an online questionnaire, and response rates declined to 40%. The evaluators have emphasized to the PI the importance of increasing response rates to the 70-75% range. A second anticipated constraint may be in obtaining information from faculty, who may view this project as obtrusive. Finally, two of the five participating academic departments (civil engineering and architecture) have already undergone recent changes to improve their curricula; therefore, statistically significant changes might not be detected during the three-year project.
The budget for this project is approximately $35,000 a year, covering both an internal and an external evaluation (for a complete explanation, see the formal proposal).
Internal and external evaluators will meet with the project team members monthly. Annual reports will be prepared by the external evaluator for the Principal Investigator, project member teams, and FIPSE project director. A final evaluation report will be disseminated two months after the completion of the project.