
Summary
The institutional effectiveness process at Geneseo produces clear, direct, useful, and reasonably reliable evidence that learning and administrative outcomes are being met. Academic programs and administrative units are closing the loop on this evidence by using it to advance the college's mission of transformational learning. In order to achieve a more thorough vertical integration of the effectiveness process, work remains to be done at both ends of the scale: learning outcomes must be better communicated to students, and assessment data must play a more systematic role in strategic planning and budgeting.



Standards addressed in this chapter
7 — Institutional Assessment
14 — Assessment of Student Learning

Introduction

In "Campus Change for Learning: Leading a Category Shift in Liberal Education," Lynn Swaner writes:

Given the demands of scarce resources and the competitive environment in which colleges and universities function, it is a recognized challenge for the institution to prioritize its resources in order to create a context in which engaged learning and its documented outcomes are dominant objectives of the institution, its faculty and staff, and its programs and opportunities.

Dr. Swaner, who serves on the advisory board of the Bringing Theory to Practice project at the Association of American Colleges and Universities, continues:

While recognized as a challenge, it is also clear that the only sustainable path is to categorically shift priorities and internal resources. It is unlikely that individual campuses can meet the challenge and make the needed shift in priorities without collective support and encouragement, and without there being a shared perception in the academy of the value of doing so.

The institutional effectiveness process at SUNY Geneseo, including student learning outcomes assessment, supports transformational learning in two ways: first, by helping to establish the "shared perception" that the college must "prioritize its resources" so as to promote "engaged learning and its documented outcomes"; and second, by providing the data — about "the institution, its faculty and staff, and its programs and opportunities," as well as about the "outcomes" of student learning itself — needed to prioritize and allocate resources intelligently and effectively.

The institutional effectiveness process at Geneseo has room for improvement, but it possesses the characteristics necessary to serve the role described above. Specifically, it is:

  • a process that people at the college understand
  • a process that produces clear, direct, useful, and reasonably reliable evidence
  • a process that closes the loop by feeding this evidence into discussion and decision-making
  • a process that is adequately supported by the college
  • a process that foregrounds the dimensions of transformational learning

Overview: institutional effectiveness process

As described in Geneseo's wiki space for Academic Assessment, the institutional effectiveness process at Geneseo feeds information from programs, offices, departments, and committees to groups or individuals responsible for oversight at the unit level (Academic Affairs, Advancement, etc.) and from there to the College Assessment Advisory Council (CAAC).

The council responds downstream to unit-level oversight groups and reports upstream to the Strategic Planning Group (SPG), which uses the information for both its ongoing review of college goals and objectives and its formulation of recommendations to the Budget Committee.

The process is designed to ensure that the activities of programs, offices, departments, and committees inform and are informed by planning and budgetary decisions.

Each unit in the diagram above maintains its own recursive process of data collection, reporting, reflection, and (when appropriate) decision-making. In keeping with Planning Goal 6, the college has maintained this as a decentralized process, without a campus-level assessment coordinator, in order to emphasize the point that assessment is the responsibility of every department and office. However, recognizing the need for someone to coordinate data for accreditation processes, the college has recently created such a position.

Student learning outcomes assessment process

The learning outcomes assessment process comprises academic program assessment and general education assessment and is organized under the auspices of the Academic Program Assessment Committee (APAC). All academic programs have learning outcomes, and each program assesses some number of these outcomes each year. Similarly, each general education area has outcomes; every third year, each area assesses all of its outcomes. Until very recently, general education assessment results were not only reviewed on campus but forwarded yearly, together with reflection, to the Office of the Provost at SUNY System Administration.

On the course level, learning outcomes are included on a guide syllabus required for each new course proposed for adoption. On the program level, academic programs are free to change their learning outcomes in response to new priorities, reflection on assessment, or feedback from the Academic Program Assessment Committee. In general education, learning outcomes may be changed through the campus governance process.

Assessment results are recorded online according to a publicized timetable. Each year's salient activities in program and general education assessment are announced on pages titled, respectively, This Year in Program Assessment and This Year in Gen Ed Assessment.

The word recorded in the paragraph above is used purposely. Recording in this context stands in contrast to reporting. The difference is important to the culture of assessment at Geneseo.

A reporting model of assessment implies the transfer of information through a process of submission. I write a report and submit it to you. It has left my hands (or office, or computer) and landed in (or on) yours. If I do not share the document with colleagues in my department, they must now obtain it from you. I may decide to delete the document from my computer, or I may have difficulty retrieving it if I decide to consult it later on. The report is a discrete entity over which I may feel minimal ownership once it has left my hands. Indeed, once submitted it is "out of my hands" — that is, no longer, in some basic sense, my business.

By contrast, a recording model implies continuous responsibility for and ownership of information. I do not "submit" the information through the transfer of a document; instead, in a process resembling that of journaling, I log the information in a virtual location to which I can return at any time, and to which I can provide others (colleagues, a committee, an office, an administrator, a visiting team from Middle States) access. They now see it, but it has never "left my hands."

Recording assessment results is the responsibility of a designated assessment coordinator for each academic program or, in the case of general education, the chair of each area's oversight committee.

Once recorded, assessment results are available for review by the Academic Program Assessment Committee, an oversight group chaired by a faculty member serving as assistant to the provost and composed of faculty. The group includes representation from students and administration, and is aided by the director of the institutional research office. Assessment coordinators and general education area chairs may request a meeting with the committee for advice, guidance, or feedback, but the main venue for communication from the committee is the same one where the results are logged: the committee records "notes" in the wiki designed to help programs improve their assessment process. The notes (such as these on biology) speak to the continuity and consistency of each program's assessment process, the measurability of outcomes, the methods of assessment, and the quality of reflection and revision (that is, loop-closing) in the program. The notes speak only to matters of process; the assessment committee does not concern itself with content and pedagogy within programs.

In a typical page of assessment results, a program will record the outcomes chosen for assessment in the previous year and indicate the method(s) of assessment used, the criteria for judging student success relative to the outcomes, and (by way of closing the loop) reflection on the results. Programs are free to organize these pages in whatever way serves them best, as long as outcomes, methods, criteria, and reflection are clearly marked. (Some programs continue to organize their pages using the template of a superseded "assessment report" form.)

A handful of programs use multiple measures for evaluating student learning on each outcome. When Geneseo's assessment process was first instituted around 2000, the assessment committee required multiple measures, but the requirement was set aside while the campus worked to establish a culture of assessment and programs climbed the learning curve, coming to understand the purpose of assessment and concepts such as validity and reliability. The college has finally reached the point where it makes sense to begin insisting on multiple measures once again.

As already noted, the student learning outcomes assessment process "closes the loop" by incorporating reflection in the results that programs record each year. In reflecting on their results, programs may refer to changes they have made to content, pedagogy, or other aspects of their curriculum, and they often consider possible future changes.

In addition to the reflection contained in each year's results, programs have recently begun keeping track of their loop-closing conclusions and changes on dedicated pages of the assessment wiki. An example would be this page on how the program in geological sciences has closed the assessment loop.

A special case: general education assessment

Assessment of general education runs parallel to academic program assessment. As mentioned above, the chairs of the various general education oversight committees serve as assessment coordinators for their areas, with each chair recording results (and reflecting on them) once every three years.

This three-year assessment cycle owes its existence to a 1999 resolution of the SUNY Board of Trustees adopting system-wide learning outcomes in general education. Beginning in 2002, SUNY campuses were to report assessment results on each general education area to system administration every third year. From 2006 to 2010, campuses were also required to conduct so-called Strengthened Campus-Based Assessment on three outcome areas: critical thinking, communication (writing), and mathematics.

In July 2010, the SUNY provost's office issued a memorandum replacing the reporting process for regular and "strengthened" assessment with the instruction that each campus "maintain records of its assessment plans, findings, and resulting actions and their impact, and share them, as appropriate, with campus constituencies, regional and programmatic accrediting bodies, the Provost of the State University of New York, and external auditors."

The SUNY general education assessment process has never been an entirely comfortable fit with Geneseo's general education curriculum, since the adoption of system-wide outcomes required the college to "map" its previously adopted internal learning outcomes to those developed in Albany. (The alternative would have been for the college to sacrifice some degree of intellectual autonomy, and perhaps integrity, by replacing its more rigorous and fine-grained outcomes with the more generic system outcomes.) In addition, reporting and loop-closing have involved an additional layer of bureaucracy that has made general education assessment feel at times less than fully integrated into the overall student learning outcomes assessment process. Nevertheless, general education assessment at Geneseo comprises the elements of regular data collection, recording of results, and reflection.

Administrative assessment

Each of the college's four administrative divisions has its own assessment process.

  • Following the model of academic program and general education assessment, administration in Academic Affairs manages assessment in a wiki space. Most of the ten units within this division have placed an assessment plan in the wiki, and over half have recorded results from one round of assessment.
  • In Student and Campus Life, the seven department heads have embedded in their performance program an annual expectation that a function or specific service under their purview will be examined. An array of assessment strategies is provided by the vice president and includes methodologies such as benchmarking, full program reviews utilizing national standards and/or external visitation teams, participation in national studies, and site visits. Results of these initiatives are tabulated and selected findings are reported annually to the College Assessment Advisory Council (CAAC). Planned actions and their progress are also documented. The divisional approach to assessment is practical in nature, and is based on the expectation that assessment leads to action that closes the assessment loop.
  • In Administration and Finance, assessment has been conducted since 2001. Each department within the division is expected to assess one or more functions or services within its area annually and to use the results to evaluate and improve the quality and effectiveness of its services. In June of each year, every department submits an annual report to the vice president that includes a summary of its assessment activities; the division's assessment committee then collects assessment reports from each unit in July. A summarized divisional assessment report is prepared and submitted to the vice president and CAAC on September 1 of each year.
  • In Enrollment Services, the division reaches out in multiple ways to prospective students and their families, school counselors, and transfer admissions personnel at community colleges. An extensive array of print and electronic promotional materials is in place and is reviewed and edited regularly in consultation with external marketing consultants. Illustrative examples of the assessment of promotional efforts include several recruitment-related surveys and focus groups used to determine the extent to which Geneseo's preferred message and market perception are being received. Numerous systems support the generation of annual applicant pools (travel, campus tours and visits, and admitted-student preview days, to name a few). The details of the application process and the review of candidates for admission are also examined on a regular basis. Perhaps the best measure of the effectiveness of these efforts is the set of statistics for Geneseo's first-year retention rate and six-year graduation rate, benchmarked against national figures.
  • In College Advancement, the performance of each department is measured at regular intervals. For example:

    • The Office of Alumni Relations tracks the number, attendance, and variety of on- and off-campus events to measure its effectiveness in engaging and connecting alumni and friends to the college and to each other. The office also compares Geneseo to nine other institutions (SUNY and non-SUNY) through an alumni engagement initiatives survey that measures alumni on record; the number of on-campus and off-campus events; event attendance; regional chapters; online community registration; and alumni giving participation rates. Geneseo engages more constituents than all but one college on that list.

    • The Office of College Communications assesses its effectiveness by tracking the quantity and quality of printed publications and news stories generated by proactive outreach. This includes the number of news releases and advisories issued and stories placed; web page visits, via Google Analytics; issues of Encompass (the faculty/staff newsletter) and the Scene (the alumni magazine) produced annually, along with feedback to the editor via email, phone calls, or personal communication; newsletters and email communications sent to alumni, parents, and donors, and the feedback they generate; invitations and marketing support materials designed and printed; photography requests; and social networking comments. The office also reviews media stories daily to determine tone, fairness, and accuracy.

A process that the college understands

Transparency

The processes described above and the purposes they serve are transparent to the SUNY Geneseo community. They are "communicated" in the conventional sense of being referenced in memos, emails, and meetings; moreover, in the case of learning outcomes assessment, and to some degree academic affairs administrative assessment, the process takes place in a publicly visible space that incorporates information about its purpose.

The wiki space for learning outcomes assessment is part filing cabinet, part how-to manual, part news source, part water cooler. The pages where academic programs and general education maintain their learning outcomes are open to the world's view. So are the pages where general education areas record and reflect on assessment results. The results pages for academic programs, together with the feedback they receive from the Academic Program Assessment Committee, are closed to the outside world but visible to the entire Geneseo community, so that programs can learn and borrow from one another's assessment strategies and experiences, and the community is assured of every program's accountability. A standard feature of wiki software is its ability to preserve a history of revisions to every page; as a result, a member of the community interested in how the learning outcomes in psychology, for example, have changed over time, need only select Tools > Page History to trace the development of those outcomes.

Some programs and general education areas use the "comment" feature in the wiki for conversation about assessment results, thus making the wiki as much a medium for shared reflection as a record of it. Scroll down in the English department's 2008-09 assessment results, for instance, to find an example of the vibrant discussion that assessment is designed to generate.

The wiki page that explains the importance of vibrant discussion, together with pages that explain the institution's assessment structure, the meaning of the assessment loop, and other important assessment concepts, helps to ensure that the community understands how and why we do learning outcomes assessment at Geneseo. Other pages, such as the one providing a timeline of events related to SUNY-wide general education assessment, provide meaningful context for the assessment process. Nuts-and-bolts information about tasks and due dates is published yearly on the pages for This Year in Program Assessment and This Year in Gen Ed Assessment. At the other end of the spectrum, a built-in blog provides the chair of the Academic Program Assessment Committee with a medium for explaining the intellectual value of assessment.

Participation

One measure of how thoroughly the campus understands the process and purpose of assessment is simply participation.

As the table below shows, a majority (60 percent) of academic departments recorded assessment results for at least one learning outcome in the most recent year, 2009-10. Since 2002-03, participation in program assessment has never fallen below 50 percent; the low point came in the first year of the process, 2002-03, at 52 percent.

Percentage of Departments that Assessed, by Year

Year         Departments that assessed
2009-2010    60%
2008-2009    80%
2007-2008    84%
2006-2007    72%
2005-2006    80%
2004-2005    84%
2003-2004    80%
2002-2003    52%

These figures do not tell the whole story about participation in assessment, however. In many cases, a program that does not record results for a given year is engaged in a college-mandated program review (these take place on a five-year cycle) or has conducted assessment for a national accreditation body such as NCATE or AACSB. (Brian Morgan, a faculty member in the School of Education, has compiled some of Education's NCATE assessment materials in his personal space in the Geneseo wiki.)

As mentioned above, general education assessment has been required by SUNY system administration since 2002, so areas have participated in triennial assessment at a rate of 100 percent.

Communication to students

Every course proposed for adoption or revision at Geneseo must be accompanied by a guide syllabus that includes learning outcomes for the course (as indicated on this course proposal form). However, the College's official syllabus policy, included in the classroom policies overseen and regularly distributed by the Office of the Dean of the College, says nothing about including learning outcomes on syllabi disseminated to students. A review of 107 randomly selected courses revealed that learning outcomes were listed on 95 percent of syllabi.

The Undergraduate Bulletin does not publish the learning outcomes of academic programs. Neither does it list general learning outcomes for students, although it does contain a statement, now some two decades old, about the Principles and Goals of a Geneseo Education. It contains broad descriptions of purpose for each general education area but not the outcomes for each area, and it does not list outcomes for general education as a whole. The web pages for academic programs typically refer to the "missions" of these programs but do not list learning outcomes.

Nevertheless, a recent survey of faculty revealed that 70 percent "devote class time to explaining or discussing learning outcomes in the courses [they] teach," suggesting that, informally at least, the language of outcomes is being shared not only among faculty and administration, but also between faculty and students.

Clear, direct, useful, and reasonably reliable evidence

Most academic programs at Geneseo, and all general education areas, use course-embedded methods of assessment, evaluating student work such as essays, problems, examination questions, and capstone projects against locally developed rubrics. A few programs use nationally normed exams; chemistry, for example, uses its students' scores on the Major Field Assessment Test in chemistry and on exams from the American Chemical Society.

Inspection of student work constitutes a direct measure of learning, and the use of rubrics helps make the measurements reliable. Reliability is greatest in those programs where a representative sample of student work is evaluated by a group of raters who first practice using the rubric together. Reliability is still reasonable, however, in the many programs that make continuous use of an internally devised or adopted rubric applied by faculty to their own students' work.
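To make the idea of inter-rater reliability concrete, the sketch below shows one common way a program might quantify agreement between two raters who score the same sample of student work against a rubric, using simple percent agreement and Cohen's kappa. This is a minimal illustration in Python; the scores, rubric scale, and function names are hypothetical, not data or tooling from any Geneseo program.

    # Hypothetical sketch: quantifying inter-rater reliability for rubric scores.
    from collections import Counter

    # Rubric levels: 1 = not meeting, 2 = approaching, 3 = meeting, 4 = exceeding.
    # These scores are invented for illustration only.
    rater_a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
    rater_b = [3, 4, 2, 2, 3, 1, 4, 3, 3, 4]

    def percent_agreement(a, b):
        """Fraction of work samples on which both raters assigned the same level."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def cohens_kappa(a, b):
        """Observed agreement corrected for the agreement expected by chance."""
        n = len(a)
        observed = percent_agreement(a, b)
        counts_a, counts_b = Counter(a), Counter(b)
        expected = sum(counts_a[k] * counts_b[k] for k in set(a) | set(b)) / (n * n)
        return (observed - expected) / (1 - expected)

    print(f"Percent agreement: {percent_agreement(rater_a, rater_b):.2f}")  # 0.80
    print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")           # 0.71

A kappa near 1 indicates strong agreement beyond what chance would produce; a value near 0 would suggest that the raters need another norming session with the rubric before their scores are used.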

As reliable as any one measure may be, the overall reliability of an assessment process requires multiple means of measurement. This is one area where Geneseo could do better. A few programs apply more than one measure to some of their outcomes, but, as noted earlier, the present norm at the college is a single, direct measure per outcome. In the early years of assessment at Geneseo, the challenge of establishing a culture of assessment (a challenge increased by the imposition, at about the same time, of the SUNY Board of Trustees learning outcomes and assessment regimen in general education) proved significant enough to warrant a slow, developmental approach. In 2010-11, the faculty chair of the Academic Program Assessment Committee and the associate provost in charge of assessment visited most academic departments and explained the need to strengthen the reliability of Geneseo's assessment process through the adoption of multiple assessment measures; there was not a single objection to the idea. The time has arrived, and the implementation of multiple assessment measures is now a top priority for the college.

A changed culture at Geneseo is only one of the reasons that this priority should be easy to achieve. The other is that, in fact, many programs already employ indirect measures of their impact on students that could quite reasonably serve as secondary means of assessment. The geography department, for example, is one of many at Geneseo that routinely survey students or keep track of their participation in activities relevant to their intellectual and professional development. It is not so much the measures that are missing, then, as the systematic and appropriate integration of these measures into the routine recording of assessment results and the reflection that closes the assessment loop.

The development of a culture of assessment at Geneseo should probably be attributed, in large measure, to the evident usefulness of the results that programs have collected. The best evidence of that usefulness lies in the way that programs have closed the loop on assessment.

Closing the loop on learning outcomes assessment

Academic programs

How have academic programs and general education areas actually used the evidence of assessment to improve curriculum and pedagogy? As indicated above, programs and areas typically describe these improvements when "reflecting" on a given year's assessment results. Listed below are a few examples of loop-closing changes. In addition, each program and area has recently summarized how it has closed the assessment loop on a page dedicated for that purpose: the academic program pages are aggregated here, the general education pages here.

  • Anthropology — After discussing assessment results, the department intends to change one of its rubrics so that style and content can be addressed separately rather than in combination in the coherence and form category.

  • Art History — The department recently decided to try a multiple-choice format for midterm and final exams in one set of 200-level classes in order to compare it with the paired-images-with-essay format and the essay-only format. The multiple-choice format will provide a more objective assessment of student learning and knowledge than the essay or comparative-essay format does. Depending on these results, the department will consider changing other aspects of the program. In addition, the research paper change made as a result of assessment, whereby students are introduced to advanced research methods at the 200 level instead of waiting until they reach the 300 level, was found to be working. More specifically, assessment of the change showed that students are now much more aware earlier in the major of what is expected in researching and writing a term paper, and they develop skills for argument and abstract thinking faster once they understand the basic tools for doing the research.
  • Biology — Based on results from the 2008-09 assessment, the department agreed to continue offering both Biol 117 and Biol 119 in 2008-09 and to use the PRS (Personal Response System) in lectures. Furthermore, it decided to de-register any major who enrolled in the second course after obtaining a grade of D or E in the first, and to require any student who obtained a D or E in a course to repeat it in order to continue as a biology major. Finally, any student who repeated a course and failed to obtain a C- would be required to leave the major.
  • Business — In light of assessment results, the school decided that it would be worthwhile to develop an advisement handbook, as an overwhelming majority of students had expressed interest in one. The Curriculum Committee proposed developing the handbook over a period of two to three years, and the department held a retreat in October 2006 to re-evaluate educational outcomes and related criteria.
  • Chemistry — This department found that the results of an administered survey did not correlate well with the results of a standardized exam. Students rated their level of preparedness in physical chemistry much lower than in analytical or inorganic chemistry, although on the MFAT they achieved similar results in the three categories. Given these results, the department will continue its efforts to improve the preparation and performance of majors. The department notes that one approach begun in organic chemistry three years ago seems to be paying off: instruction in organic chemistry was consciously altered in the hope that students would better retain knowledge in this subdiscipline, and compared to previous years, the indicators for organic chemistry have been improving, especially this year and last. The overall indicators show weaker performance than in the last few years but are not lower than data collected four to five years ago.
  • English — The department noticed a slight decline in its literary interpretation outcomes across the board, suggesting that faculty may wish to discuss how to incorporate aspects of literary interpretation into their teaching and to consider ways of helping students hone these skills.
  • Sociology — Based on recommendations from the Academic Program Assessment Committee, the department has begun to use pre-test and post-test measures for assessment.

General education

In accordance with a process established by the SUNY Office of the Provost and overseen by the SUNY General Education Assessment Review group (GEAR), Geneseo conducts ongoing assessment of general education learning outcomes. Geneseo assesses all its general education areas on a three-year rotation using assessment measures that are for the most part locally developed.

From 2006 to 2010, SUNY required all campuses to conduct "Strengthened Campus-Based Assessment" (SCBA) in three general education areas: Critical Thinking, Basic Communication (written), and Mathematics. Assessment in these areas was understood to be "strengthened" by the requirement that campuses collect assessment data comparable to either national or SUNY-wide norms. It was nevertheless "campus-based" because campuses were permitted to choose among several methods for meeting this requirement and were not required to administer standardized tests. Geneseo chose to adopt, with slight modification, rubrics developed by faculty panels assembled through GEAR. The rubrics, still in use at Geneseo, can be found on the GEAR wiki page.

General education assessment has produced changes in curriculum and rubrics and is strengthening the programs in a number of ways. For example, in 2009-10, as part of the new "strengthened campus-based" assessment initiative, Critical Writing and Reading adopted a SUNY-developed rubric in order to assess writing and revision practices more precisely. This rubric combines trait-analysis categories and descriptions from SUNY-developed rubrics for basic communication and critical thinking. Assessment data from 2004-05 demonstrated that approximately 50 percent of students were meeting or exceeding standards for writing and revising coherent texts. By 2008-09, this number had improved marginally through such efforts as: (1) individual faculty mentoring by the co-chairs for Critical Writing and Reading (including the provision of sample syllabi and assignments to new instructors, review of course syllabi, suggestions for text and reading selections, and advice on best teaching practices, grading standards, and methods); (2) the expansion of library support to include advice on plagiarism and more discipline-focused research workshops; (3) the enhancement of the Writing Learning Center in Milne Library (including additional tutor training in English as a Second Language, learning disabilities, and stress management); and (4) productive pedagogy discussions sponsored by the Teaching Learning Center. Based on the positive feedback received after summer workshops for Critical Writing and Reading faculty, the return of such workshops is expected to improve student assessment scores still further. Additionally, smaller class sizes were identified by Critical Writing and Reading as necessary to permit the individualized attention that coherent writing and meaningful revision require.

Other general education areas that have demonstrated the usefulness of assessment are Western Humanities, Natural Science, Social Science, and U.S. Histories. Western Humanities reports that assessment results have improved since 2003-04 for at least two reasons: faculty have focused on what the intended learning outcomes imply and have sought, as time and resources have permitted, to make instruction more and more effective; and faculty have become more familiar with assessment. Furthermore, in response to stated 2006-07 goals for improvement, faculty have been encouraged, specifically during end-of-semester meetings, to develop assignments that "[require] students to demonstrate interdisciplinary thinking." In fact, since the program's inception, every new faculty member teaching Western Humanities has been required to team-teach with, and be mentored by, an instructor from a different discipline. Western Humanities recently re-instituted the practice of holding faculty forums that focus on particular authors, works, or intellectual movements. During these gatherings, colleagues can formally share insights from their own disciplinary specialties, which should, at least theoretically, enable instructors from other disciplines to expand the interdisciplinary scope of their classes.

In 2004-05, assessment of U.S. Histories revealed that most students were either exceeding or meeting the stated learning outcomes. Under the learning outcome "Understanding of the distinct, overlapping and shared history of people based on varied identities and experiences," 36 percent of students exceeded expectations, 46 percent met expectations, 12 percent approached expectations, and 3 percent did not meet expectations. The other assessed categories yielded similar results, as is evident in the assessment final report. In spite of students' high levels of success, however, questions were raised about the rubric, the assessment process, and faculty participation in assessment. To address these concerns, the final report called for the U.S. Histories Committee to revisit the rubric and suggested the use of multiple assignments in assessment. In addition, the committee called for hosting meetings and workshops for faculty teaching U.S. Histories courses. Based on these recommendations, by the time of the 2007-08 assessment period the rubric had been revised to better reflect the learning outcomes of the requirement. In addition, the U.S. Histories chair organized one meeting and sent out three emails informing faculty about the upcoming assessment and the importance of participation.

Once again, as during the previous assessment period, the vast majority of students in U.S. Histories either met or exceeded the learning outcomes as stated in the rubric. Under the category "Knowledge of a basic narrative of American history," 51 percent of students exceeded expectations, 38 percent met expectations, 7 percent approached expectations, and 3 percent failed to meet expectations. The results for the other assessed learning outcomes yielded similar percentages. The only minor exception to this trend was in the category "Understanding America's evolving relationships with the rest of the world." In this case, while the vast majority of assessed students still either met or exceeded expectations (29 percent exceeding, 47 percent meeting), 17 percent of students were approaching and 6 percent failed to meet the learning outcome. Unfortunately, while the overwhelming majority of students managed to meet the requirement's expectations, not all faculty participated in assessment, nor did the faculty who participated address each part of the assessment rubric. To "close the loop" in this case, then, will require that more if not all faculty teaching U.S. Histories courses take part in the assessment process. In 2011, the U.S. Histories chair will organize two workshops where faculty can gather to discuss the assessment process, and will ask to visit a department meeting of each department that teaches courses for the requirement in order to stress the importance of participation in the assessment process. In addition to this face-to-face interaction, numerous emails reminding faculty of workshops and deadlines will be sent.

While assessment has produced changes in some general education areas, other areas' results indicate that learning outcomes are being satisfied. For example, the 2007-08 assessment results for Natural Sciences showed that more than 80 percent of the more than 2000 students assessed met or exceeded expectations. Based on these results, the Natural Science general education committee concluded that no changes in the N/ offerings or assessment plan were warranted. Furthermore, several general education areas have identified possible changes or improvements, but many program-related matters remain somewhat on hold until a decision is announced regarding the college's possible shift from offering primarily 3-credit courses to offering mostly 4-credit courses. For example, based on assessment results, the Social Science committee has determined that the basic learning outcomes in Social Science need to be revised, but it is awaiting the decision on whether this shift will take place. Nevertheless, the Social Science committee believes that using the evidence from assessment is vital to any changes made in general education. The Western Humanities faculty, likewise, has discussed on many occasions possible new directions for the courses, as well as what those directions might imply intellectually and require practically. Some of the new directions the course might take could well improve student performance with respect to interdisciplinary thinking, and probably with respect to the other three intended learning outcomes as well. Once an institutional determination about the possible credit transition has been reached, Western Humanities faculty will be able to move forward more decisively to continue closing the loop.

Geneseo is exploring revisions to its general education curriculum. In summer 2011, a group of four faculty and one administrator attended the AAC&U General Education Institute conference in San Jose. Since their return, the general education committee has been working to develop college-wide baccalaureate learning outcomes and considering how the entire general education curriculum might be redesigned to help meet these outcomes and advance transformational learning.

Closing the larger loop: institutional effectiveness

If the immediate purpose of student learning outcomes assessment is to improve pedagogy and curriculum, the larger purpose is to provide data capable of informing budgetary and strategic planning decisions. Together with all the other evidence about an institution, evidence and conclusions regarding student learning enable the institution to prioritize and allocate resources so as to better meet the institution's mission: in Geneseo's case, transformational learning.

As illustrated by the flow chart above, the institutional effectiveness loop at Geneseo is closed when the president, in consultation with the cabinet, takes budget- and planning-related actions informed by recommendations of the SPG and the Budget Committee. Those recommendations are in turn based partly on information provided by CAAC, which itself serves as a filter for the information and recommendations originating in the assessment data and reflection generated by all areas: academic programs, general education, and the college's various administrative divisions.

With CAAC just now compiling its first report to SPG, the college is poised to close the particular effectiveness loop pictured above for the first time. However, it should be noted that Geneseo has closed the loop on institutional effectiveness whenever administrative action has been taken in direct response to systematic research and reporting spurred by some immediate question, concern, or initiative. Recent examples have included actions taken in response to reports of the President's Commission on Diversity and Community (established 1998), the President's Task Force on Faculty Roles and Rewards (2002-04), the Provost's Task Force on Curriculum (2007-09), and the six working groups convened under the president's Six Big Ideas initiative. For instance, in response to the report of the Six Big Ideas working groups on Re-thinking the Course Load and Bringing Theory to Practice, the provost in 2011 invited academic departments to apply for "curriculum innovation" grants of up to $25,000 to conduct activities, purchase books and other materials, and bring in consultants, all towards the end of developing new models of instruction designed to promote transformational learning.

It should also be noted that the loop pictured above shows some information flowing to the SPG and the Budget Committee directly from the Office of Institutional Research. Thus, although CAAC is still compiling its first report, the SPG has been able to develop a set of Strategic Planning Indicators based on mission-related data from all areas of the college — such as six-year graduation rate, community service, and alumni giving — that are reported yearly.

The combination of data from Institutional Research and systematic reporting by issue-specific investigative bodies such as the Six Big Ideas working groups has enabled Geneseo to make strategic planning decisions based on a wealth of information about institutional effectiveness. The addition of a yearly report from CAAC will enrich and diversify this information by incorporating filtered data collected annually from every academic program and administrative division of the college. As a result, Geneseo will continue to close the loop on institutional effectiveness with heightened confidence that its decisions are based on the most solid and comprehensive evidence available.

Support for the institutional effectiveness process

It is not by accident that Geneseo has developed a culture of assessment. Over the ten years that the college has been measuring institutional effectiveness, support for the process has been continuous and strong.

The College Assessment Advisory Council is chaired by an associate provost in Academic Affairs who also directly oversees student learning outcomes assessment. The associate provost is assisted by the faculty chair of the Academic Program Assessment Committee, who also serves on CAAC. The faculty chair receives released time from teaching in order to manage the affairs of APAC; communicate with department chairs, assessment coordinators, general education area chairs, and the faculty as a whole regarding assessment initiatives and timelines; and maintain the various wiki spaces for assessment.

The academic affairs division has paid for the faculty chair of APAC to attend assessment workshops, assessment conferences, and meetings of the Western New York Assessment Leaders group. Interested assessment coordinators, general education chairs, and department chairs have also received financial support to attend assessment workshops.

Geneseo has financed staff attendance at several conferences of the American Association for Higher Education (AAHE) and on a number of occasions has sent teams of faculty to the annual Assessment Institute (focused exclusively on learning outcomes assessment) held by Indiana University-Purdue University Indianapolis.

The college has invited outside consultants to visit the campus and give presentations on institutional effectiveness and learning outcomes assessment. These include Nichols and Nichols in 2001, Dr. Jay Armino in 2003, and Dr. Ephraim Schecter in 2006.

The Office of Institutional Research, whose director serves in an advisory capacity to APAC, supports the effectiveness process by gathering and disseminating information that gives the college a clearer picture of its strengths and challenges.

As noted earlier, the wiki spaces for academic assessment and academic affairs administrative assessment support the institutional effectiveness process by promoting transparency, making information easy to find, and providing feedback. On a recent survey, 80 percent of faculty agreed or strongly agreed with the statement, "I know where to find the assessment results for my program."

Of the 20 departments that received written feedback from APAC over the past two years, seven showed signs of incorporating this feedback into their subsequent assessment efforts, evidence that the committee is effectively supporting these programs. In some cases this was indicated by changes made in a subsequent assessment report (especially revised learning outcomes or updated evaluation rubrics); in other instances, comments added to the wiki indicated a department's intention to incorporate APAC's suggestions into future assessment plans. A recent survey revealed that 56 percent of chairs and assessment coordinators have found APAC's assessment feedback useful. Fifty percent of faculty said that they believe Geneseo provides support for assessment; nearly 75 percent said that their departments provide support for student learning outcomes assessment.

Foregrounding transformational learning

To return to Lynn Swaner's words at the head of this chapter, assessment at Geneseo is helping to "create a context in which engaged learning" can become one of the "dominant objectives of the institution." It is doing this by shifting the campus culture toward an emphasis on educational outcomes rather than inputs, by helping to establish an expectation that the college's success will be judged by direct evidence of what students know and can do, and by modeling (in the assessment loop itself) the process of self-transformation through reflection on experience.

The shift in campus culture is evident in the two-day retreat on Bringing Theory to Practice attended by 20 faculty and professional staff in June 2009. The retreat was an outgrowth of Geneseo's involvement in the national Bringing Theory to Practice project of the Association of American Colleges and Universities. As described in the final report of President Dahl's Six Big Ideas task force on Bringing Theory to Practice (2009), participants in the retreat developed 12 "outcomes" for transformational learning at Geneseo, including such learning outcomes as "Students will be able to see problems from multiple angles" and "Students will develop leadership capabilities and apply them in college and after graduation in order to serve the public good."

The Bringing Theory to Practice Task Force compiled an inventory of transformational learning activities at the college. Under the heading "Assessment" in its final report to the president, however, the task force observed, "We do not ... have a good idea of how effectively these activities promote transformational learning or student psycho-social well-being." Without a culture of assessment, this is not the kind of observation that a campus makes about its own achievements.

The National Survey of Student Engagement (NSSE) provides some evidence of Geneseo's performance on the kinds of outcomes that emerged from the Bringing Theory to Practice retreat. Geneseo performs better than its COPLAC and national peers on many high-impact learning experiences; for example, more Geneseo seniors report performing community service or volunteer work, working on a research project with a faculty member, or studying abroad. However, on other outcomes, such as problem solving or relationships with faculty, Geneseo scores below its COPLAC peers and is on par with national peers; Geneseo seniors do not examine the strengths and weaknesses of arguments or try to understand someone else's viewpoint as frequently as do seniors at other COPLAC institutions. Geneseo seniors report forming very strong relationships with other students, but their relationships with faculty are not as strong as those formed by students at other COPLAC schools. Relationships with administrators and other staff are not as strong for Geneseo students as they are for students at both COPLAC and national peers. These comparatively lower scores suggest that Geneseo students may not be meeting the outcome of "emotional flourishing" characterized by the Bringing Theory to Practice Working Group as the feeling that, "I came here and somebody paid attention to me."

Other measures of flourishing and student well-being have yielded both positive and negative results. According to the American College Health Association's National College Health Assessment, administered in 2011, overall student health is positive, with 94 percent of students rating their health as good to excellent. The majority of students reported feeling safe on and near the campus, indicating confidence that Geneseo provides a protective environment. However, the evidence pointed to room for improvement in areas of student well-being such as anxiety, depression, and feeling overwhelmed.

To address these concerns, the college has run faculty-specific and college-wide workshops on understanding and responding to student distress. Since the goal of transformational learning is for students to develop emotionally and socially as well as academically, it will be essential for the college to measure students' progress on these outcomes.

Conclusion and recommendations

The institutional effectiveness process at Geneseo meets Middle States standards and supports the college's mission of transformational learning. However, it can be improved.

  • Bring all programs and general education areas up to best practice standards for learning outcomes assessment by ensuring that these programs and areas have multiple means of assessing student learning.
  • Ensure participation in assessment by each academic program every year. This must be a priority for the Academic Program Assessment Committee and the Office of the Provost.
  • Ensure that programs that conduct assessment for accreditation (School of Education, School of Business) share some version of their results with the campus community in the academic assessment wiki.
  • Improve communication of learning outcomes to students by publishing them in the Bulletin and on the college website.
  • Highlight the importance of institutional effectiveness. Upper management should emphasize its support for this planning goal across all areas of the college.
  • Increase the visibility and influence of the College Assessment Advisory Council. Although the council is responsible for overseeing and supporting the institutional effectiveness process, and for collating and filtering effectiveness data so as to make possible informed strategic planning and budgeting, its activity and prominence are not yet commensurate with its place in the assessment loop.
