Various uses of the term "assessment"

Q: I keep on hearing about assessment. Assessment as in a test, assessment as in learning outcomes, assessment as in course assessment, assessment as in program assessment. What’s up?

A: As commonly used at the college, the term "assessment" refers to a midterm or final exam, or to another assignment designed to test students' acquisition of foundational knowledge or their ability to reason clinically. In this sense, "assessment" is simply another word for examination.

Assessment can also refer to a plan or structure established for ongoing course and program improvement: a way to ensure that a college is collectively doing what it says it does, and that students learn what we say they will during their time with us.

Because examinations are designed to test student knowledge, it would seem reasonable to equate exam performance with learning outcomes achievement. This assumes that the questions on an examination are aligned with (that is, directly related to) the stated outcomes of the lecture, course, or program.
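To make that alignment concrete, one approach is to map each exam item to an outcome and roll results up by outcome. Here is a minimal sketch in Python; the question IDs, outcome labels, and scores are hypothetical, not drawn from an actual course.

    # Minimal sketch: rolling exam results up by learning outcome.
    # The item-to-outcome mapping, question IDs, and scores below are
    # hypothetical, for illustration only.
    alignment = {
        "Q1": "clinical reasoning",
        "Q2": "clinical reasoning",
        "Q3": "foundational knowledge",
        "Q4": "foundational knowledge",
    }

    # One student's exam results: 1 = correct, 0 = incorrect.
    results = {"Q1": 1, "Q2": 0, "Q3": 1, "Q4": 1}

    # Group item scores by the outcome each item is aligned with.
    by_outcome = {}
    for item, outcome in alignment.items():
        by_outcome.setdefault(outcome, []).append(results[item])

    for outcome, scores in by_outcome.items():
        print(f"{outcome}: {sum(scores)} of {len(scores)} aligned items correct")

If no such mapping exists, exam scores tell us how students did on the exam, but not which stated outcomes they have or have not achieved; that gap is the source of much of the confusion described below.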

“There is often confusion over the difference between grades and learning assessment, with some believing that they are totally unrelated and others thinking they are one and the same. The truth is, it depends. Grades are often based on more than learning outcomes. Instructors’ grading criteria often include behaviors or activities that are not measures of learning outcomes, such as attendance, participation, improvement, or effort. Although these may be correlated with learning outcomes, and can be valued aspects of the course, typically they are not measures of learning outcomes themselves.

“However, assessment of learning can and should rely on or relate to grades, and so far as they do, grades can be a major source of data for assessment.” (http://www.cmu.edu/teaching/assessment/howto/basics/grading-assessment.html#scoringparticipation)

When deciding on what kind of assessment activities to use, it is helpful to keep in mind the following questions:

  • What will students’ work on the activity (multiple-choice answers, essays, projects, presentations, etc.) say about their level of competence on the targeted learning objectives?
  • How will the instructor’s assessment of their work help guide students’ practice and improve the quality of their work?
  • How will the assessment outcomes for the class guide teaching practice?

This table from Carnegie Mellon University presents examples of the kinds of activities that can be used to assess different types of learning objectives, and the ways that we can analyze or measure performance to produce useful feedback for teaching and learning. The categorization of learning objectives is taken from the revised Bloom’s Taxonomy.

Each entry below gives the type of learning objective, examples of types of assessment, and how to measure.

Remember
Students will be able to: recall, recognize.
Examples of assessment:
  • Objective test items that require students to recall or recognize information, such as fill-in-the-blank items and multiple-choice items with question stems such as “what is a…” or “which of the following is the definition of…”
  • Labeling diagrams
  • Reciting (orally, musically, or in writing)
How to measure:
  • Accuracy (correct answers vs. number of errors)
  • Item analysis (at the class level, are there items that had higher error rates? Did some items result in the same errors?); see the sketch after the table

Understand
Students will be able to: interpret, exemplify, classify, summarize, infer, compare, explain.
Examples of assessment: papers, oral/written exam questions, problems, class discussions, concept maps, and homework assignments that require (orally or in writing):
  • Summarizing readings, films, speeches, etc.
  • Comparing and/or contrasting two or more theories, events, processes, etc.
  • Classifying or categorizing cases, elements, events, etc., using established criteria
  • Paraphrasing documents or speeches
  • Finding or identifying examples or illustrations of a concept or principle
How to measure:
  • Scoring or performance rubrics that identify the critical components of the work and discriminate between differing levels of proficiency in addressing those components

Apply
Students will be able to: execute, implement.
Examples of assessment:
  • Activities that require students to use procedures to solve or complete familiar or unfamiliar tasks; these may also require students to determine which procedure(s) are most appropriate for a given task. Examples: problem sets, performances, labs, prototyping, simulations
How to measure:
  • Accuracy scores
  • Checklists
  • Rubrics
  • Primary Trait Analysis

Analyze
Students will be able to: differentiate, organize, attribute.
Examples of assessment:
  • Activities that require students to discriminate or select relevant from irrelevant parts, determine how elements function together, or determine bias, values, or underlying intent in presented materials. Examples: case studies, critiques, labs, papers, projects, debates, concept maps
How to measure:
  • Differentials
  • Rubrics, scored by the instructor, juries, external clients, employers, internship supervisors, etc.
  • Primary Trait Analysis

Evaluate
Students will be able to: check, critique.
Examples of assessment:
  • A range of activities that require students to test, monitor, judge, or critique readings, performances, or products against established criteria or standards. Examples: journals, diaries, critiques, problem sets, product reviews, case studies
How to measure:
  • Diagnosis
  • Rubrics, scored by the instructor, juries, external clients, employers, internship supervisors, etc.
  • Primary Trait Analysis

Create
Students will be able to: generate, plan, produce.
Examples of assessment:
  • Research projects, musical compositions, performances, essays, business plans, website designs, prototyping, set designs
How to measure:
  • Treatment plans
  • Rubrics, scored by the instructor, juries, external clients, employers, internship supervisors, etc.
  • Primary Trait Analysis

http://www.cmu.edu/teaching/assessment/howto/basics/objectives.html (with some adaptation)
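As a concrete illustration of the item analysis mentioned in the Remember row, here is a minimal sketch in Python. The response matrix is hypothetical (rows are students, columns are exam items; 1 = correct, 0 = incorrect), and the 50% flagging threshold is an arbitrary assumption, not a standard.

    # Minimal sketch: class-level item analysis on a hypothetical
    # response matrix. Data and flagging threshold are illustrative only.
    responses = [
        [1, 0, 1, 1],
        [1, 0, 0, 1],
        [0, 0, 1, 1],
        [1, 1, 1, 0],
    ]

    num_students = len(responses)
    num_items = len(responses[0])

    # Error rate per item: the share of students who missed it.
    for item in range(num_items):
        errors = sum(1 for student in responses if student[item] == 0)
        rate = errors / num_students
        flag = "  <- high error rate; review this item" if rate > 0.5 else ""
        print(f"Item {item + 1}: error rate {rate:.0%}{flag}")

Items that most of the class missed, or that produced the same wrong answer repeatedly, are candidates for rewriting the question or revisiting how that material was taught.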

Assessment of student learning also occurs at the program (DVM) level and the institutional (OSU) level.

Faculty members may be asked to contribute data that helps the college or the university measure student achievement of program/institutional outcomes. For CVM, that might mean submitting the rubrics used in a second-year course to grade an assignment that requires clinical reasoning, or a rubric from a surgical skills rotation that measures a student’s competency in a particular skill. An assignment on research methodology and medical literature review in the first year, as well as one required during a fourth-year rotation, might be used to assess our students’ achievement of the AVMA’s ninth outcome, “critical analysis of new information and research findings relevant to veterinary medicine.”

Program assessment typically follows this pattern:

  • Step 1: Identify a limited number of learning outcomes for the program.
  • Step 2: Create a plan for exploring one or two of those outcomes and what data you will need to do so.
  • Step 3: Collect data — student work from projects or papers, as well as other indirect and direct evidence of learning.
  • Step 4: Analyze the data to see how students are meeting the outcomes (a minimal sketch of this step follows the list).
  • Step 5: Use the evidence collected to confirm the quality of the program or to make changes for improvement. (https://provost.illinois.edu/assessment/learning-outcomes-assessment/what-is-learning-outcomes-assessment/#categories)
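As a minimal sketch of Steps 3 and 4, assume rubric scores on a 1–4 scale have been collected for one program outcome. The scores, the proficiency cutoff of 3, and the 80% target below are all hypothetical assumptions for illustration, not college policy.

    # Minimal sketch of Steps 3-4: analyzing collected rubric scores for
    # one program outcome. Scores, cutoff, and target are hypothetical.
    rubric_scores = [4, 3, 2, 4, 3, 3, 1, 4, 3, 2]  # one score per student

    PROFICIENT = 3   # assumed minimum score that counts as meeting the outcome
    TARGET = 0.80    # assumed program target: 80% of students proficient

    meeting = sum(1 for score in rubric_scores if score >= PROFICIENT)
    share = meeting / len(rubric_scores)

    print(f"{meeting} of {len(rubric_scores)} students ({share:.0%}) met the outcome")
    if share >= TARGET:
        print("Target met: evidence supports program quality (Step 5)")
    else:
        print("Target not met: consider changes for improvement (Step 5)")

The analysis itself is simple; the work of program assessment lies in choosing meaningful outcomes, gathering representative student work, and acting on what the evidence shows.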

If you would like to explore any of these nuances further, please don’t hesitate to contact the Office of Teaching & Learning.

 
