
Academic Assessment

Diving Into the Data


Assessment data can offer evidence of student learning, reveal a program's weaknesses and strengths, and inform actions towards improving or enhancing student learning. To get the most out of your assessment results, it is necessary to dive into the data.

  • Compare data to provide greater meaning. A single point in time does NOT do this on its own. Data can be compared to baseline data, previous results, an existing standard or criterion, or results from different student populations (see the sketch after this list). 
  • Analyze data in the context of stated SLOs and benchmarks.
    • What does the assessment data say about achievement of student learning? Did students demonstrate an acceptable level of proficiency for the stated SLO? Did they meet established benchmarks?
    • Are there weaknesses in any particular skills?
    • Alternatively, are there areas where students excelled?
    • What does the assessment data say about students’ preparation for the next course in the program or next step in their career pathways?
  • Note and describe observations and trends. Include representative excerpts or samples of student work to accompany quantitative displays.
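
As a rough illustration of comparing results against a baseline and a benchmark, the sketch below uses pandas with made-up rubric scores. The term labels, column names, and the 70-point benchmark are assumptions for illustration only, not part of any GTC reporting template.

```python
# Minimal sketch: compare one term's rubric scores to a baseline term.
# The terms, scores, and 70-point benchmark are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "term":  ["Fall 2022"] * 4 + ["Fall 2023"] * 4,   # baseline term vs. current term
    "score": [68, 75, 81, 90, 72, 79, 85, 93],        # rubric scores for one SLO
})

BENCHMARK = 70  # assumed minimum acceptable score for the SLO

summary = scores.groupby("term")["score"].agg(
    mean="mean",
    pct_meeting_benchmark=lambda s: (s >= BENCHMARK).mean() * 100,
)
print(summary)  # a single term alone tells us little; the comparison adds meaning
```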

 

A finding is a concise summary of the results gathered from a given assessment method. Further, findings should provide the information necessary to make informed decisions about improving outcomes.

Analysis of Findings

Analysis is a systematic examination and evaluation of the findings or data obtained through assessment. It is about deriving meaningful and useful information about students' learning from assessment results. Analysis should be a group effort, involving all program faculty as well as faculty from outside the program when appropriate.

Look for trends or patterns of evidence. Common patterns to consider:

  1. Patterns of Consistency: this type of pattern emerges from studying data gathered on the same outcome over a period of time, such as from semester to semester or from year to year.
  2. Patterns of Consensus: this involves disaggregating the data to determine if all populations are achieving the expected level of performance. Aggregate data (e.g., reporting an average score on an outcome measure) may hide the fact that a certain population of students is NOT achieving the expected level of performance. Data may be broken down by gender, first-generation students, non-traditional students, students of various ethnic backgrounds, students enrolled in traditional versus online classes, students enrolled in day versus night classes, etc. A minimal sketch of this kind of disaggregation follows this list.
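
As a sketch of the disaggregation described in item 2, the code below breaks hypothetical outcome scores out by student population. The file name, the column names, and the benchmark are assumptions, not an actual GTC data export.

```python
# Minimal disaggregation sketch: the file name, the "population" and "score"
# columns, and the 70-point benchmark are all hypothetical.
import pandas as pd

results = pd.read_csv("slo_results.csv")   # e.g., one row per student for one outcome
BENCHMARK = 70

by_group = results.groupby("population")["score"].agg(
    n="count",
    mean="mean",
    pct_meeting=lambda s: (s >= BENCHMARK).mean() * 100,
)

# An overall average can hide a group that falls below the benchmark;
# sorting the per-group table surfaces that group first.
print(by_group.sort_values("pct_meeting"))
```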

Other questions to ask or situations to consider as data are analyzed include:

  • Do you need to disaggregate data to analyze results for particular variables such as method of instruction, day/evening section, campus, or adjunct versus full-time faculty?
  • Is the "N" in the data set reasonable? Have proper sampling procedures been used?
  • Do the data represent an acceptable level of achievement? For instance, if the data indicate that 80% of the students performed at the expected level of achievement, what happened to the other 20%? Is it acceptable that 20% of the students did not meet the minimum standard?
  • Whether a target was achieved or not, were there areas defined within the tool in which students consistently demonstrated deficiencies? Likewise, were there areas in which students’ performance exceeded expectations?
  • Did the assessment tool work? Was it appropriate? Did the tool validate student learning of a particular outcome?
  • Did the tool satisfactorily distinguish various levels of achievement? (One quick way to check is sketched below.)
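
One rough way to probe the last two questions is to look at how ratings spread across rubric levels; if nearly every student lands at the same level, the tool may not be distinguishing achievement. The rating labels below are assumed for illustration, not taken from a GTC rubric.

```python
# Hypothetical rubric ratings for one SLO; the level labels are illustrative.
import pandas as pd

ratings = pd.Series([
    "Proficient", "Developing", "Exemplary", "Proficient", "Proficient",
    "Beginning", "Developing", "Proficient", "Exemplary", "Proficient",
])

# Share of students at each level; a single dominant level suggests the
# tool is not distinguishing levels of achievement.
print(ratings.value_counts(normalize=True).round(2))
```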