Closing the Loop: Assessment of the Psychology Curriculum from 1995-2010

November 1, 2010

The Psychology Department has addressed assessment through two primary mechanisms: formal program review and annual outcome assessment.

Formal Program Review.  Evaluation of the Psychology major has been a primary focus of the formal five-year program review process.  In each of the last three program reviews (1995, 2003, and 2008), both the department self-studies and the recommendations of external reviewers have provided clear sets of goals.  For example, the 1995 reviewer suggested that we examine the content of the statistics/methods sequence and that we develop a senior capstone experience.  Statistics and methods instructors met regularly over the course of a year to develop an outline of the concepts and skills we expected students to learn in every section.  During a lengthy and careful process of redesigning the major, we developed a creative reconfiguration of our statistics/methods sequence (250, 251, 252) by replacing 252 with a senior capstone course (352).  By placing a redesigned course later in the sequence, we gave students a "culminating experience" that still provided advanced work in statistics and methods.

The 2003 reviewers also noted that our redesign of the core into four "content" areas provided an organizational scheme that appropriately covered the major subfields of psychology, and that more closely matched the typical undergraduate curricula at other colleges.  In 2008, the external reviewers recommended adding a required course with a clinical (applied) focus.  As the department develops its proposal for transitioning to a possible four-course load model, this recommendation will be considered.

Our department has considered, and continues to consider, several other suggestions from external reviewers.  These include increasing our contribution to the college general education core; adding History & Systems as a required course; creating additional courses on diversity issues; offering separate courses in developmental psychology for majors and non-majors; and adding a new interdisciplinary major in neuroscience.  The department has made proposals to address some of these recommendations (e.g., adding Introductory Psychology to the Social Science core; hiring a new faculty member to support development of a neuroscience major).  However, restrictions on the number of faculty and other resources have precluded adopting all of these recommendations at this time.

External reviewers have also provided useful suggestions for our ongoing annual outcome assessment procedures.  For example, one reviewer suggested revising our assessment techniques to map more closely onto our learning outcomes.  Another suggested taking a multi-year approach to assessment, focusing on one or two learning outcomes each year.  Comparing newer students with more advanced ones was also suggested as a way to assess improvement, particularly in writing.  Because of constraints on resources, we have been unable to pursue another suggestion: using a commercial testing service (ETS) to provide standardized assessment of outcomes.

Annual Outcome Assessment.  The Psychology Department has actively and regularly conducted outcome assessment of its curriculum on an annual basis since the 2002-2003 academic year.  Much of the assessment effort during the early part of this period was directed toward evaluating the impact of the redesign of our major requirements implemented in the latter part of the 1990s.  The transition to the new curriculum was largely completed by about 2001.  As already noted, the centerpiece of the new major was the introduction of a senior capstone course (Psyc 352), designed to provide students with an in-depth, research-based experience on a specific topic in psychology.  We began offering Psyc 352 in 2001.  Our 2002-2003 assessment compared the empirical-report writing of students who had completed Psyc 352 with that of students from the late 1990s.  The results from this assessment were encouraging.

Since then, the department’s annual outcome assessment program has tended to focus on psychology-related language/presentation skills.  One reason for this focus is that the rubrics we have used to assess those skills have measures of critical thinking in psychology embedded within them.  For example, the rubrics we have used to assess the quality of empirical reports and literature review papers in our capstone Psyc 352 class go beyond the evaluation of writing style: they include evaluations of students’ understanding of research design and statistical analysis.  Thus, while the focus of our assessment program appeared to be entirely on the language skills of our majors, it also implicitly assessed critical thinking skills.  As part of this strategy, we tended to examine the work of students in Psyc 352 because we felt it would give us the best indication of the impact of our program as a whole on student development.

The results from our assessment efforts are available online through the assessment wiki.  A review of these results across the years since 2002-2003 shows that our students consistently perform well on a variety of language-based assessments.  Over the years, we have assessed students on empirical report writing, literature review writing, poster presentation of research, and oral presentation of research.  Given these results, and the fact that our current major resulted from a significant and lengthy re-examination of our curricular requirements in the 1990s, the department has not made any significant changes to our curricular offerings that are directly tied to assessment results.  There are instances, however, where assessment results have served to remind faculty of the importance of specific skills.  For example, one year the assessment program revealed a weakness in the abstracts that students had written as part of their papers; this information was shared and discussed with department faculty.

Last year, we began exploring a new approach to assessment based on a statistical analysis of students at all levels of progress through our program.  This approach holds the promise of showing whether completion of specific requirements contributes to the development of specific discipline-related skills.  A detailed report of this effort is available on the assessment wiki.  In connection with this effort, the department’s assessment committee is currently redesigning the department’s learning outcomes for the major.  As we move forward with more clearly defined objectives, our intent is to focus our assessment efforts on an empirically based approach that examines the impact of each component of our program on each intended learning outcome.  The results of these statistical analyses will provide specific feedback about which parts of our program significantly contribute to students’ success in achieving the intended learning outcomes, and will help us identify areas where program modifications may be needed to increase students’ mastery.  In this way, a continuous feedback loop will allow us to monitor the relationship between our academic program and students’ achievement of our intended goals.  Such an empirical approach may also aid us in redesigning the curriculum as part of a possible transition to the four-course model.
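The kind of analysis described above can be sketched as an ordinary least-squares regression of an outcome measure on indicators of requirement completion. The sketch below is purely illustrative: the course names, rubric scale, and all data are hypothetical and simulated, not the department's actual records or method.

```python
import numpy as np

# Hypothetical data: each row is a student. Binary flags record whether
# the student has completed a given requirement; the outcome is a score
# on a writing rubric (0-100 scale). All values are simulated.
rng = np.random.default_rng(0)
n = 200
took_250 = rng.integers(0, 2, n)  # statistics/methods I
took_251 = rng.integers(0, 2, n)  # statistics/methods II
took_352 = rng.integers(0, 2, n)  # senior capstone
# Simulated "true" contributions, with the capstone weighted most heavily.
score = 60 + 3 * took_250 + 4 * took_251 + 8 * took_352 + rng.normal(0, 5, n)

# Design matrix with an intercept column; ordinary least-squares fit.
X = np.column_stack([np.ones(n), took_250, took_251, took_352])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)

# coef[1:] estimate each requirement's contribution to the outcome --
# the feedback the text describes for identifying which parts of the
# program drive achievement of a learning outcome.
for name, b in zip(["intercept", "Psyc 250", "Psyc 251", "Psyc 352"], coef):
    print(f"{name}: {b:.1f}")
```

In practice such a model would use real transcript and rubric data, and the coefficient estimates (with their uncertainties) would indicate where curricular modifications are most likely to matter.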
