A failure of imagination
A discussion of assessors' impressions on vocational education, with Amy Bolezny, Phillip Rutherford, and Ross Woods
Many assessments fail, not just for technical reasons, but because writers of competency standards and assessment tools forget to imagine what the job actually requires. They fail to see the essential link between the competency standard and the reality of what workers do on the job.
The fatal flaw
Many writers of assessments at higher certificate and diploma levels seem to have been indoctrinated in the belief that they must set academic-style assessment tasks. Admittedly, essays have a place in an academic environment to promote research and critical thinking, and paper-based learning can be useful or even unavoidable in vocational training and assessment. Yet these approaches should not be the default position.
This misplaced trust in academic-style assessment tasks arises from various causes:
- Many writers of competency standards and assessment tools use language that hides the main points: the skills people need to do the job for which they are trained. They have not actually performed the roles about which they write, so they really do not know what people do. Some competency standards are better than others, but many are vague and open to interpretation rather than objective analysis.
- Some writers of assessment tools and assessors feel they need to prepare students to continue to higher education where they will be required to write essays.
- Writers of assessment tools try to replicate their own educational experience by setting up a "school" experience with exams and lots of paper worksheets.
- Writers of assessment tools are accustomed to being overly analytical and prescriptive to get the approval of auditors and validators.
These fatal flaws prevent otherwise good assessors from translating what workers actually do on the job into assessment tasks that reflect best practice.
“Naturally occurring” evidence
Compared to disembodied academic tasks, it is more effective to use “naturally occurring” evidence, that is, the evidence of skills and knowledge that arises naturally in the workplace through actually doing the job. It is neither forced nor manufactured, and might not even require the student to demonstrate skills specifically for the purpose of assessment.
Here’s an example from management:
This week, we had another assessment writer assign a task to management students: “Write an essay about how you would … .”
As a manager, do you write essays at work? What do you actually do? Ah, you write plans and reports, you analyze risks, you keep risk registers, and you monitor compliance of an overall strategy statement with legislation. What else? Ah, you solve any problems along the way, you evaluate the skills of staff, and you develop training to fill the gaps.
Now, what are you going to do to match the assessment with what is done on the job? Instead of an essay, why not get students to write a management plan that incorporates all the “how I will actually do this,” get it approved, and then implement and monitor the plan?
That is what managers do in real life. It only takes a little shift in imagination and focus to develop great and valid assessments that meet the competency criteria and all the other minutiae of the requirements.
The assessor can collect the paperwork of the plan and other workplace records. The assessor can also gather comments from others involved in the process (for example, a supervisor, client, or colleague) who provide testimony that the student did something a certain way; this “certain way” is then assessed to determine whether or not it accords with the standards.
Naturally occurring evidence is the purest form of evidence possible. However, it is the most difficult form of evidence to gather if the college is an offsite, independent body. For example, people often exaggerate when asked to provide testimony about someone's performance, or they fail to describe the full action along with the context within which it was performed. Saying that "Such-and-such gave good customer service" doesn't quite explain what was done and how.
It is much easier to collect naturally occurring evidence if assessment tools are workplace-centered, supervisors become co-assessors, and the assessment system is part of the organization’s performance management system. By doing so, assessors continually assess the progress of individual and group competence toward organizational objectives, as well as help individual staff members pick up qualifications along the way.
When training is directly linked to the achievement of organizational goals, such as profit or change, organizations are more likely to invest in training because they can see direct returns on their investment.
Conclusion
If we train our fledgling assessors and colleagues to use their imaginations, vocational colleges can better exploit their opportunities to have a greater impact on their clients, and fewer will fail audits through inferior assessment practices.