Measurement of Skills – what do we actually want to know?

Raising key questions, a quote published yesterday from HEQCO recommended testing of “soft” / “essential” skills (math, communication, problem-solving, critical thinking, teamwork…) upon entry to and graduation from PSE. (Read the news article at www.thespec.com/news-story/6327523-young-grads-need-to-brush-up-on-3-r-s-employers-say/)

The question of measurement is an important issue in Canadian PSE. So I ask two questions of the individuals within the PSE community whom I greatly respect: What do we really want to know? What would it mean to get measurement right?

First, Why would we measure? 

I can think of two reasons:

  1. Learners, period. Regardless of who sets the goals and processes of their learning, students care, educators care, and society cares. We may define success differently, but generally it involves a continuation of learning, growth in knowing and seeing, and a contribution to society through learners’ gained ability to wrestle with big (and small) issues and tough (and daily) problems.
  2. Contributing to our understanding of teaching concerns and successes is an important role of measurement. Do students really arrive knowing “less math” now than they did 10 years ago? Is there a shift in language and conception that creates a mismatch with expectations? For those who like numbers, quantitative measurements can provide a more comfortable ground for addressing concerns and celebrating successes. For those who prefer narrative but talk with those who like numbers, conversations become more possible.

Second, What are we measuring?

  1. Knowledge or Expertise: I may know a formula or how to do a close reading, and still not be able to apply it to a public debate about water sovereignty or to the creation of an innovation. Knowledge (especially at the recall or near-application level) that might be tested is not a substitute for assessing the thinking process, the reasoning and decision-making based on knowledge, and the nuanced application of knowledge to a complex situation that requires recognition of deep features rather than superficial ones.
  2. Specificity: Are we sufficiently measuring the relevant breadth of knowledge when we measure numeracy, literacy, writing… (information literacy… statistical literacy…)?
  3. Building on the work of others: How might the VALUE rubrics of the AAC&U be adapted? What are the limits and tensions of measurement identified in the UK, Europe, Australia, and the USA? What can we learn from colleagues coast to coast to coast?

Measuring “knowledge” is like measuring foot size for running speed

Fundamentally, I think the question of measuring “knowledge” is misnamed, and in the misnaming there is a risk of mismeasurement. “Application” is insufficient too, as is “job-readiness”, for I don’t think anyone could be prepared for every task or job all of us do. Knowledge is likely related and likely necessary, but it is not sufficient for what we are seeking for our learners.

Instead, I think we are seeking the foundational concept of expertise in its many forms. We want learners to move from superficial, snap trial-and-error that leads to incomplete or inappropriate answers to seeing the real (deep) features of an issue, thinking through the process, making decisions about what matters and what to do, and then creating and applying, in nuanced ways, the ideas, facts, processes, and capabilities of their disciplines. Whether applied to HR decisions, communication rhetoric, bridge spans, server backups, or policy creation and critique, society needs more from our students, and from us, than knowledge alone.