2022
Response to Theatre and Dance Department’s 2021-22 Assessment
1. Assessment Process: 1.5 minimally/needs development
The APAC Committee is sympathetic to Professor Ferrell's comments. Instructions on how to use rubrics and grades in Canvas were lost partway through the process, making it more complicated to understand the results. Ferrell indicates that he does not know how to access the data; we think it would be helpful if someone with data expertise assisted him with this task. Including details on assignments, and selecting one or two assignments most relevant to each learning outcome, would make the results more meaningful.
We think the department is trying to get started with assessment and needs help developing a simple process that will work for it. Whatever was done in Canvas seems only to have complicated things.
There is no clear plan for which outcomes to assess each year. The department should define a cycle that assesses all outcomes on a 3-5 year rotation.
The department does not appear to receive or analyze the assessment data gathered. Theatre/Dance needs an export of the data to inform faculty members' individual summaries, followed by a department-wide discussion of the results. Faculty comments seem disconnected from the data gathered on the outcomes.
2. Learning Outcomes: 2 minimally developed
The Learning Outcomes look fine as articulated on the wiki, and the APAC committee has only minor suggestions. The adjectives are vague, perhaps purposefully so: the discipline may evolve such that "substantial knowledge" means different things at different times. Learning outcomes 1 and 4 could be sharpened with more specific language, and outcomes 2 and 3 restated so as to be measurable.
3. Artifacts and Instruments: 1 needs development
No rubrics or examples were provided. Creating a rubric may help the department reach consensus on what to assess, regardless of the instrument used by each instructor/section. To start developing the Collaboration rubric: how does the department define collaboration? Give some examples. Alternatively, you might start from the AAC&U VALUE rubric for collaboration and make its categories more discipline-specific.
Information on the assessments used to collect the data is missing; neither artifacts nor instruments were included.
Conclusions are presented. The two spreadsheets assign an assessment score to each student, but it is not clear whether a rubric was used to produce the scores. For department discussion and for our committee's review, it is better not to display student names or G numbers alongside assessment data; the spreadsheet can be excerpted to only the relevant columns.
4. Use of Results: 1 needs development
Since the data are already in a spreadsheet, they could be presented and analyzed graphically. We recommend a department discussion on creating rubrics and on using assignments to assess the learning outcomes.
APAC appreciates the effort to use data from Canvas, moving away from the old method and investing time in a different approach for this assessment cycle. Learning is a growing process :)
The committee found that some attempt is made at general statements about the learning outcomes, but these statements do not refer to the data collected.
We would love to see a cycle of feedback that the Department can use to improve its instruction.
We don't see evidence of any discussion of how the department is using assessment results. That said, the department is still trying to set up a process, and we want to encourage that.
5. General Comments:
We think Theatre and Dance would really benefit from assistance in developing, implementing, and archiving assessment-related plans and procedures. The APAC committee discussed how to provide data help to small departments; possibilities include a student intern from the Data Analytics major or a student worker in Matt's IR office.
APAC plans a session on assessment for the arts at ASSESStivus in January/February 2023.
APAC Notes on 2017-18 THEA Assessment
The following notes were made during APAC's 28 March 2019 meeting.
While the committee found the breakdown of the data by course section interesting, there was some confusion as to which outcomes were being assessed. The department has three listed Program Learning Outcomes; these did not seem fully aligned with the "criteria" being used in each course.
It would also follow best assessment practice to break out the results by criterion, particularly when there are as many as seven different criteria.
Missing are details concerning the rubrics (if any) used, the instruments' definitions and specific criteria, and the kinds of student work assessed.
The department provided a PDF with multi-year assessment data, but little in the way of reflection or "loop-closing."