Collect data and analyze for continuous improvement

The Department of Psychology provides an example of how to collect, analyze, and present data, and describes how the data were used for continuous improvement. They incorporated data from a field test, an alumni survey, a critical thinking assessment, and presentations from the Research and Creative Works conference.


A coach or a director might well respond in irritation if a colleague answered a request for feedback on a game or theatrical performance with pabulum such as "Looks good," "Good job," or "That's fine."  So long as neither athletic nor theatrical perfection has been reached, helpful feedback highlights weaknesses, problems, and deficiencies.  Likewise, since we can be confident that we have not reached instructional perfection, assessment results are useful when they help us identify learning and teaching problems.

As you make plans for analyzing, interpreting, and acting on your assessment data, look for ways to involve students and colleagues from departments that your program serves.  What do they have to say about the quality of the learning in your program?  What do the employers of your graduates tell you about strengths and weaknesses in your program?  Remember also that the data won't speak for themselves; they must be interpreted, and you must make meaning from them.

Above all, remember Barbara E. Walvoord's focusing words: the end of assessment is action.

Reporting Assessment Data

Reporting of assessment data can be done in many ways, including Excel charts.  Here is one example of assessment data by semester. Reid Nielsen, department chair for Construction Design and Management, noticed that one of the program outcomes being assessed dipped significantly from one semester to the next. He brought together a team to discuss what might have happened during that semester, identified a root cause, made an adjustment, and then monitored the outcome until it returned to a "normal" level.
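The monitoring step described above can be sketched as a simple semester-over-semester check. The outcome scores, semester labels, and dip threshold below are hypothetical, chosen only to illustrate the idea of flagging a drop worth investigating:

```python
# Hypothetical semester-by-semester scores for one program outcome
# (illustrative values, not actual assessment data).
scores = {
    "Fall 2021": 3.4,
    "Winter 2022": 3.5,
    "Fall 2022": 2.6,   # a noticeable dip
    "Winter 2023": 3.3,
}

def flag_dips(by_semester, threshold=0.5):
    """Return semesters whose score dropped by more than `threshold`
    relative to the previous semester."""
    semesters = list(by_semester)
    flagged = []
    for prev, curr in zip(semesters, semesters[1:]):
        if by_semester[prev] - by_semester[curr] > threshold:
            flagged.append(curr)
    return flagged

print(flag_dips(scores))  # → ['Fall 2022']
```

A flagged semester is only a prompt for the kind of team discussion described above; the numbers alone cannot explain why the dip occurred.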

Discussion questions

Which students are progressing well toward outcomes?

What do we know about why they are doing well?

Which students aren't making good progress?

What do we know about why they aren't doing well?

What have we learned about learning during the past year?

What new questions has this raised?

Planning for improvement

What outcome or outcomes will we explore this year?

What curricular or pedagogical change do we propose?

What information do we have?  What is missing?

What funding and/or faculty leaves will we need?

What role do we anticipate for student researchers?

How will we go public with our findings?