Assessment and Program Review

2022 Assessment Practice and Discovery Mini Grant Award Recipients

COLLEGE OF AGRICULTURE

Program: MS in Agriculture, Nutrition and Food Science Option
Title: Assessment by Committee of NFS MS Thesis Proposals
Faculty Lead: Gabriel Davidov-Pardo

A committee of five faculty members from the NFS department was constituted to assess thesis proposals defended in Spring and early Summer 2022. The committee included four faculty from the nutrition discipline and one from the food science and technology discipline. Thesis proposals in our program had never been assessed in a rigorous manner before, which created inconsistencies.

A rubric was created to assess student proposals against the following criteria: Formatting, Abstract, Introduction, Literature Review, Methods (Research Design), Methods (Instruments and Procedure), Methods (Data Analysis), Potential Limitations and Possible Solutions, Timeline, and Overall Quality of Writing. Four criteria are evaluated in a binary way (met/not met or included/not included), and the rest are evaluated at five levels ranging from developing to exemplary.

After evaluating five proposals, the committee found that both the literature review and methods sections were generally strong and close to exemplary. The writing style was satisfactory. The introduction, however, was generally weaker, with disagreement on the topics to cover in that section. The remaining criteria scored low because some proposals omitted sections such as the limitations, abstract, or timeline.

The rubric will be shared with the rest of the department and with the instructors teaching proposal writing for the College, with the objective of standardizing expectations regarding the different sections of the document and the topics to be included in each. The committee also expects that the quality of the proposals will improve as a result of having a clear and useful rubric. The findings were presented to the faculty during the department's Fall retreat, and actions will be taken to address the identified areas for improvement.

COLLEGE OF EDUCATION AND INTEGRATIVE STUDIES

Program: BA Early Childhood Studies
Title: Helping to Close the Loop on Key Assessment Data
Faculty Lead: Denise Kennedy

The purpose of this project was to collect data that can be used for Early Childhood Studies (ECS) accreditation by the National Association for the Education of Young Children (NAEYC). We have five key assessments, based on rubrics, that faculty use in selected courses to assess NAEYC standards. One major issue has been that some faculty members (full-time and part-time) have changed the rubrics or do not complete and submit them. To address this, we created the rubrics in Qualtrics and assigned links to faculty for completion. This data will allow ECS faculty to meet, discuss the findings, close the loop on any issues the data indicates, and support our reporting for accreditation.

One full-time and one part-time faculty member worked together this summer to create the key assessments in Qualtrics and spoke with several faculty who teach the courses where these assessments are required to ensure ease of use. The process was straightforward and successful; we still need to create links for each section, as not all have been assigned yet. The key assessment data will allow us to examine individual student data, course data, standard data, and much more, which we have not been able to do before. It is exciting to be able to gather data for program and course improvement, report writing, and student outcomes.

COLLEGE OF ENGINEERING

Program: MS Aerospace Engineering 
Title: Revising Assessment Plan for the Hybrid MSAE Program
Faculty Lead: Navid Nakhjiri

The MS in Aerospace Engineering program was under the umbrella of the MS Engineering program for many years, so its PLOs had to be general and flexible enough to satisfy all disciplines under the MSE program. In Fall 2021, MSAE became discipline-specific and was separated from MSE; it is now offered as a hybrid program starting Fall 2022. These changes required us to review the program's entire assessment plan, PLOs, and SLOs.

We reviewed the PLOs of existing MS programs across CPP and of MS Aerospace Engineering programs within the CSUs and UCs. A set of three revised PLOs was developed for the MSAE program that reflects the core values of our program, our target student demographic, and the outcomes expected by industry partners. These PLOs were submitted to the department faculty for review and evaluation before being finalized.

We also reviewed the SLOs, identified areas no longer applicable to the MSAE program, and emphasized the areas of importance for our graduates. We revised the SLOs into four statements more appropriate to the level of thinking expected in a graduate-level program. These SLOs were shared with faculty, revised, and approved.

For the assessment plan, we created a schedule for collecting direct and indirect data so that we can review the program and student outcomes. The new schedule was necessary because of the change in course modality. We identified the best courses in which to assess the new SLOs and a review cycle that allows us to evaluate and close loops in the program. The next step is to finalize the rubrics developed for each SLO. The assignments and rubrics are implemented online so that all students (in-person and virtual) have access and the graduate coordinator can collect and track data over the years. The framework was created over the summer but needs to be finalized before being implemented in the ARO courses.

Program: BS Computer Engineering; BS Electrical Engineering
Title: Closing the loop - SO7 (Acquire, learn and apply new knowledge)
Faculty Leads: Anas Salah Eddin and Hyoung Soo Kim

The main goal of the project was to finalize the analysis and close the loop on the assessment of Student Outcome 7 (an ability to acquire and apply new knowledge as needed, using appropriate learning strategies). SO7 is used for both the computer engineering and the electrical engineering programs. The outcome was assessed using three performance indicators: A) “Acquire”, B) “Learn”, and C) “Apply”. “Acquire” and “Apply” were assessed using in-class projects and exams. “Learn” was assessed using a survey that measured students’ propensity as lifelong or autonomous learners using two cross-listed scales.

The assessment data was analyzed and summarized using flag heuristics. The results and summary were presented during the department’s Fall 2022 retreat. A closing-the-loop discussion followed, with further discussion planned for a future department meeting. Students in both programs (computer engineering and electrical engineering) performed similarly. “Acquire” and “Apply” raised a yellow flag, whereas “Learn” did not raise any flag. A yellow flag indicates potential underperformance and that further data collection and analysis are warranted. The closing-the-loop discussion suggested that the team direct resources to reassessing the “Acquire” and “Apply” performance indicators.

The team was also able to assess and close the loop on SO4 (an ability to recognize ethical and professional responsibilities in engineering situations and make informed judgments, which must consider the impact of engineering solutions in global, economic, environmental, and societal contexts) and SO5 (an ability to function effectively on a team whose members together provide leadership, create a collaborative and inclusive environment, establish goals, plan tasks, and meet objectives). Assessment results for both SO4 and SO5 showed no flags, and as a result no action was needed.

Our next steps are to continue assessment activities following our department’s assessment plan and to hold a department meeting summarizing all closing-the-loop action items for student outcomes assessed during this assessment cycle.

Program: MS Engineering Management
Title: Closing the loop on assessment findings of MS Engineering Management-PART II
Faculty Lead: Shokoufeh Mirzaei

This mini-grant helped build upon the work that was done last year for the EMT program. During Fall 2021, the MS Engineering Management program underwent an external review which revealed a lack of evidence that 1) the curriculum content is reflective of current debates, trends, technologies, and the latest important developments; and 2) the program uses feedback from students/alumni/employers for its improvements and goals.

Last year, surveys were designed and conducted to solicit feedback from: 1) Industry advisory council (IAC) regarding the current debates, trends, technologies, and latest developments in the field; 2) graduating students regarding their general sentiment toward the program; and 3) alumni regarding the connection between the program curriculum and their current job’s required skills. Using the mini-grant, the results of these surveys were analyzed and improvement areas were identified.

The alumni survey results showed that coverage of research methods must be improved in the program. One underlying reason for the gap in research methods knowledge could be the small number of full-time students in the program, which leads to a greater emphasis on projects, rather than theses, as the culminating experience.

The IAC survey results identified the following as important trends: Probiotics manufacturing, Cybersecurity, Data Science and AI, Bioreactors and other biologic processing, Cell & Gene Therapy, Antibody-Drug Conjugates, and 3D Printing for dental use. Although 67% of the IAC survey participants are from pharmaceutical companies, and the results are therefore somewhat biased toward that sector, the current faculty search included additive manufacturing and bio-manufacturing among the preferred qualifications for hiring.

The graduating students' survey results showed room for improvement in course scheduling. Although the overall sentiment for course availability was positive, it wasn’t as strong as in other areas studied in the survey (e.g., faculty and culture). There were also comments on improving contract preparation and processing.

The EMT committee will review the results of these surveys to define action items that further improve the program.

COLLEGE OF ENVIRONMENTAL DESIGN

Program: BA Art History
Title: Art History Program Assessment Restart 
Faculty Lead: Karlyn Griffith

The purpose of this project was to update the Art History program's assessment process, plan, and documents, but first I needed to understand how our program relates to discipline standards.

I reviewed and studied the current Art History assessment plan and assessment documents in light of recent feedback and in comparison to other Art History programs. Surprisingly, there is great variety among Art History degree programs, and few post their student learning outcomes on their websites. Our current SLOs, however, reflect NASAD standards well and even go beyond them. I was nevertheless able to uncover a number of gaps in our assessment process and program learning outcomes while reviewing these materials. I drafted modified SLOs reflecting feedback from previous assessment reviews and assembled ideas on how to adjust our SLOs to fit better with university learning outcomes. I also identified a significant issue in our current assessment plan: we have only one course that assesses at the mastery level. This is especially important because SLOs 4 and 5 assess advanced skills necessary for the job market.

I was able to update and draft a new alignment matrix based on the provided template, which we were lacking, and composed a timeline and list of next steps for updating the Art History assessment plan, including improving our SLOs 4 and 5. 

Our next steps include refining and finalizing SLOs 4 and 5 after receiving faculty feedback; redoing our curriculum matrix; identifying courses, new and old, that can provide opportunities for mastering SLOs 4 and 5; and discussing with the dean and faculty the development of new classes that assess at the mastery level.

Program: BS and MS Urban and Regional Planning
Title: Assessing the development of knowledge about equity and inclusion in URP core courses
Faculty Lead: Gwen Urey

This project explored how three URP courses include considerations of DEI in the work required of students. The project began with an enumeration of SLOs from all URP core courses in both the undergraduate and Master’s programs. About half of the core courses include SLOs related to DEI, with an expectation that student work would reflect DEI content. Of the three courses selected for a deeper dive into actual artifacts of student work, DEI content stood out strongly in only one course.

In URP 2010/L, students responded to instructions to focus on a specific social group defined by age. They were also required to engage in site analysis that included demographic background. Neither prompt led students to consider how their designs might relate specifically to the needs of racial or ethnic groups, facilitate equity in a neighborhood context, or embrace notions of inclusivity beyond age. The design assignment could easily embrace these ideas. In the case of URP 3310/L, many students chose topics relating to policies with potentially differential impacts on different groups by race or ethnicity, or on conditions with similar differential dimensions. Again, the assignment could easily embrace DEI ideas.

These results were presented informally to URP faculty during a “brownbag” meeting in September 2022. Suggestions included further assessment and further engagement with faculty on how to design assignments that will ensure student learning about DEI and its applications in the planning field. The expanded course outlines (ECOs) for most courses (excepting URP 1051, which was created as a course to meet the Ethnic Studies requirement) were created in 2015-16, so discussion about reviewing more recent syllabi as well as updating ECOs also came up.

Program: BS and MURP Urban and Regional Planning
Title: Developing and testing a tool to assess critical thinking
Faculty Lead: Richard Willson

During our first 3-year assessment cycle, we faced several challenges in collecting, assessing, and interpreting data due to broadly and vaguely written program SLOs.

This project reports on the assessment of a Multimedia Learning Object (MMLO) developed to support URP faculty in assessing critical and reflective thinking among graduate and undergraduate students. The MMLO was developed with the support of CAFE in 2021 and beta tested in the author’s classes and in professional workshops in the last year.

The MMLO provides the student with a brief inquiry case study and asks them to engage in a reflective process to make a “practical judgment” of how to proceed. They do this individually and then discuss their choices in groups.

The MMLO assessment considered the distribution of choices made, outcome likelihood assessments, reasoning processes employed, user satisfaction with the MMLO, and a rubric-based assessment of a follow-on assignment. It revealed that students did choose different practical judgments for the same case and had good reasons for doing so. The MMLO also facilitated grounded group discussion and follow-up assignments.

The MMLO assessment process identified refinements that are currently in progress with CAFE and the development of additional cases. A paper entitled “Deliberate – Decide – Discuss: Flipping Case Learning with Practical Judgment Simulations” was prepared to explain this effort for presentation at the Association of Collegiate Schools of Planning conference this November.

The next step is to present the revised MMLO and the paper to URP faculty, and encourage them to create additional cases. The intention is to have the MMLO used in a variety of classes and develop methods of compiling data from its various uses to address metrics in the learning outcomes defined by the Planning Accreditation Board.

COLLEGE OF LETTERS, ARTS, AND SOCIAL SCIENCES

Program: BS Communication 
Title: Enhancing the Assessment of Student Learning Outcomes: Development of Assessment Surveys for Incoming Students
Faculty Lead: Kang Hoon Sung

The Communication Department has conducted regular exit surveys but had never developed an official entry survey, a gap highlighted by the external reviewers during our program review.

During the summer, we developed a Qualtrics entry survey that not only provides important information about incoming students but is also expected to give us more insight into students' improvement and growth in conjunction with subsequent assessments and our final exit survey. In this entry survey, we also included questions that complement our assessments of the Critical Thinking and Diverse Perspective SLOs.

Our next steps include: 1) analyzing the entry survey data comprehensively and presenting the findings to the department faculty; 2) using the findings to improve our future student learning outcome assessments; and 3) comparing the entry and exit survey data to understand our strengths and areas of improvement.

Program: BA Theatre
Title: Program SLOs and Goals Review & Promotion
Faculty Lead: Sarah Krainin

Information gathered through our AY 20-21 assessment activities and program review process showed that students and alumni need help identifying the learning outcomes of the program. The project sought to review the language and specificity of our existing SLOs, and then re-design and re-launch the portion of our website aimed at communicating them to students and the general public.

Faculty met, reviewed the existing SLOs, and established four benchmarks for our work: 1) SLOs are specific and measurable; 2) SLOs are relevant to multiple types of student interests and pursuits, whether professional technical work, graduate school, or other paths; 3) SLOs invoke the University value of workforce readiness by focusing on the outcomes most desired by potential employers and on transferability of skills; and 4) SLOs align well with University core competencies.

We identified three potential student paths that we wanted our SLOs to support: 1) career readiness, 2) graduate school or advanced training readiness, and 3) transferability of skills to other industries. We interviewed professional industry contacts and faculty in theatrical graduate programs to identify the skills they prioritize when selecting new employees or students. This data helped us develop new SLOs that will support the paths students are likely to take after graduating from the program.

Integrating the new SLOs into our students’ learning materials is temporarily on hold while we consider whether a potential curricular redesign may necessitate another iteration of review to optimize alignment. Once that is complete, our next steps are to implement the new SLOs by 1) developing new alignment matrices and assessment plans; 2) updating our website and promoting it on social media; and 3) integrating the SLOs into student learning via updated ECOs and a department faculty engagement campaign aimed at illuminating outcomes for students for metacognitive purposes and ensuring that CLOs are in proper alignment throughout the program.

COLLEGE OF SCIENCE

Program: BS Computer Science
Title: Closing the Loop: Analyzing the Assessment Data and Preparing an Annual Assessment Report for ABET Evaluation
Faculty Lead: Hao Ji

The ABET-accredited BS in Computer Science program carried out the second year’s assessment activities under the revised ABET assessment plan proposed in Fall 2020. The purpose of this project was to review assessment data collected in AY 2021-22 and prepare an annual assessment report for ABET evaluation. Over the summer, we analyzed both direct and indirect assessment data collected through several major core courses and graduating student surveys. We also drafted the annual assessment report with recommendations for action. The evaluation results of this project will be discussed and approved by the department in early Fall for the program's continuous improvement.