Validation of assessment
Ross Woods, 2018
Validation is a meeting at which two or more assessors review assessments. They need to bring to the meeting the standards (e.g. the units), some student evidence (it may be samples), and the assessment tools. Validation is also called moderation, but if you're a school teacher, the kind of comparative activity normally called "moderation" in schools is something different and probably won't meet VET standards.
The Australian training sector has changed its definition and requirements for validation quite erratically in the last ten years, so this can only be a general guide. To make it more confusing, validation has two separate meanings:
- Writing a document that confirms that an assessment procedure covers all requirements.
- Writing a document that evaluates assessment judgments, which is the meaning used in the remainder of this paper.
The people in validation meetings attempt to answer some particular questions:
- Was the assessment process:
- consistent with the units (or group of units)?
- consistent with the requirements of industry?
- appropriate to the context and what was being assessed?
- consistent between assessments?
- Were the assessment tools/instruments appropriate?
- Was the evidence appropriate?
- Was the evidence interpreted correctly to make a judgment of competence?
- What improvements need to be made?
The validation is not complete until they implement the improvements and write down what they did.
Validation has these elements:
- multiple assessors
- standards (e.g. the units)
- evidence
- assessment tools
- industry requirements
- evaluation of what has been done to identify improvements
- implementation of improvements
- a written record
Why do it?
Its main purpose is to improve the quality and consistency of assessment. Other than that, its purpose may be any of the following:
- as part of organizational quality assurance processes
- to address an identified area of risk in assessment practice
- to ensure assessments meet the requirements of the competency standards
- to provide evidence for an external or internal audit
- to evaluate assessment tools
- to provide professional development and increase assessor confidence
- to determine whether different assessors using the same tools collect the same kinds and amounts of evidence
- to determine whether different assessors interpret the same evidence similarly
- to determine whether assessment decisions reflect the principles of assessment and rules of evidence
When do you do it?
It is still good practice to do validation at least once a year. You can validate:
- before the assessment if you already have evidence that you can use. The main question is "How will we get the assessment right?"
- during assessment, by having more than one person conducting the assessment with you. The main question is: "Are we getting the assessment right?"
- after assessment as part of your review of what you did. The main question is: "How did we go and what could we do better next time?"
Because validation has a review component, this approach places it after the assessment. Besides, you might not already have evidence, so you can't always do it beforehand.
Special cases
You should validate assessment as a routine procedure, and you'll seldom have a problem. But every now and then, somebody will be concerned about assessment. They might:
- think "the standard" is unclear, whatever they mean by that,
- challenge an assessment result
- claim the procedure is wrong, or
- say that the system doesn't work for their students.
When you meet to resolve the matter, go through the problems and get workable answers. Write notes of the meeting, call them validation meeting notes, and file them with your other validation records. This is doing validation because you really, really need to do it, and consequently these kinds of validation results are usually more valuable to everybody.
Do your preparation
You will need to schedule processes and to work within agreed timeframes, and you'll need the interpersonal skills to work with a wide range of people.
First, say why you are doing the validation.
Second, say what you are validating. A whole qualification? Particular units?
Third, get records. These should include samples of:
- evidence that was assessed
- assessment tools
- any other assessment records
- previous validation results
Then, get hold of the feedback from clients, students and other instructors and assessors. This should be recorded, most likely in feedback forms from students and perhaps in assessment forms. Your organization might also have other kinds of feedback systems such as surveys or focus groups.
If you use validation forms such as those at the end of this chapter, you will be better able to organize everything you gather. The format should help you analyze and discuss what you have. You might also want forms or proformas for reporting to management or for observing assessments.
Fourth, get hold of your organization’s system for quality assurance, assessment procedures, and continuous improvement. It might be a written policy, a procedures manual, or a set of forms. It probably uses particular strategies or standards. It may also have particular goals for which the organization is aiming. To use it in your validation, you will have to be sure you understand what it requires of you.
Discuss your organization’s quality assurance system with your supervisor to find out what particular expectations he/she has of you.
It is quite likely that some organizational requirements affect the validation. For example, the time and resources allocated to validation may already be defined, and you may have to follow particular procedural constraints such as:
- how assessment information is maintained and retrieved
- goals, objectives, plans
- legal and organizational policies or guidelines
- business and performance plans
- collaborative or partnership arrangements
You will find other principles that could affect what you do, such as ethical standards, confidentiality, OHS, access, and equity.
Fifth, define the organizations and people that will be involved. The organizations might be:
- internal to your organization, either on the same site or across sites
- external to your organization, e.g. in an industry, region, city, state, or assessor network
- a licensing or professional body
What about the individuals? How will you contact them? These must include other assessors, some of whom might be peers from other training organizations.
If you are the only person in your organization who assesses in your field, you must do validation with assessors in other organizations.
Sixth, analyze the benchmarks against which you are assessing. What do you conclude? (For example, you might not be covering all requirements equally well.)
Seventh, when and where will you meet?
Eighth, get agreement on:
- the evidence that will be validated
- the benchmarks or standards
- any relevant related documentation
- any materials that you will use in validation sessions (assessment reviews, assessment tools, forms, etc.)
Ninth, confirm the arrangements with supervisors and the other assessors.
Doing validation
Back to the simple principle of doing what you planned. If your RTO has a range of assessors who are assessing against the same competency standards, have a look at the evidence that they assessed. These questions will be a good guide:
- How consistent are the assessments? Are there good reasons for any variations?
- Are assessment judgments reliable and consistent?
- Have all reporting and recording requirements been met?
- How do the assessment tools compare against national criteria for assessment materials?
- Are records accurate? Do they meet the requirements of the current training sector standard?
- Does your procedure work?
- Is any information inconsistent, ambiguous, or contradictory?
- Does it work equally well for each student or each environment in which you will conduct assessments?
- Could you achieve better cost-effectiveness?
- Were there lessons you learned or innovations you made that you could use in other situations?
- Suggest any necessary changes. What would you do differently next time?
Specific things to do
It isn't enough for you simply to participate. You'll need to play your part in determining the results. Join in the discussions, analyze the findings, and contribute to the group agreement to improve assessment quality. Take part in making recommendations and writing them down.
Recommendations for improvement may include anything helpful. Assessors might need better professional development. You might change policies and procedures, strategies, assessment plans, methods, records management, resources, or evidence collection. You might want to change tools or develop new ones.
You might change partnership arrangements or form new partnerships. You might recommend giving more or better information to assessors or students, or giving better advice, support, or supervision to assessors. You might instigate a system of exemplars. You might find that liaison with technical experts is most helpful.
Here are some specific things that you'll want to examine when you do validation:
- Specify exactly the difference between acceptable and unacceptable performance wherever it isn’t clear enough in unit statements. It might help to file examples of students' work. Define those differences clearly and concisely.
- Check the principles of assessment and rules of evidence.
- Check all documents for accuracy and version control.
- Make clear interpretations. Many outcomes and assessment criteria need “interpreting” to be useful. How much is “adequate”? How long should a research paper be? You might do interpretation by adding more explicit criteria or further explanations for students. Of course, avoid very long explanations that confuse the point.
- Identify significant assumptions. One lecturer found that all his assessments worked very well using the assigned unit statements. However, he was assuming that the context of assessment was always the same, that is, the work requirements of his particular organization. It was quite valid, but the assumption needed to be made explicit.
- Suggest better procedures. If variations in assessment procedures are the problem, suggest a way of making them more consistent.
- Anticipate problems and do something about them as early as possible. Sounds obvious, but it just goes to show that validation is another form of risk management.
- Take good notes. Why bother with validation if it’s just talk? Notes should be more than personal scribble; they need to be put on record.
Your validation results will almost certainly also result in better unit statements, better assessment procedures, and more useful assessment criteria. Then students will have a better idea of what to expect in assessment, and you will have a better idea of what works and what doesn’t. You might record your validation as changes in workbooks.
Some interpretations are so clearly implied that you won’t include them in the unit statement, but you should still write them down in your validation notes. In other cases, satisfactory requirements were written down all along, but you realize that you’ve been less diligent lately and simply need reminding that they are still important.
Making changes
So far so good. But the point of it all is to make improvements, and you still have to use all this to improve your teaching and assessing.
Decide what changes need to be made. You might be able to make the changes at the validation meeting. For example, you might edit the tools straight away, or send your manager a request for a change of assessment procedures. If you can't make the changes straight away, have a way of reporting back so you can document what you did.
Documentation and reporting
It’s now time to report what you have found. Find out who you need to report to. You should report in writing and include any recommendations you may have to improve the quality and consistency of assessment. You should also give constructive feedback to other assessors.
You may also have the opportunity to report through:
- an oral presentation
- an audio-visual presentation
- a report in the staff area of the website
- a professional development meeting
- a staff meeting, or
- a board meeting
Other things that you might do
As we said, you'll review assessment tools, collected evidence, assessment decisions, and records. Other than that, you have a wide toolkit of activities you might do as part of validation. You probably won't need all these approaches, but some of the main ways of going about it are:
- Working with another assessor, see if you both produce the same conclusions from the same evidence. (Another variation: have someone else re-assess your evidence to see whether they reach the same conclusions.)
- Process feedback:
- Interview managers, clients, trainers/facilitators, and/or students and analyze their feedback
- Observe assessment conduct
- In discussion with colleagues, improve criteria by:
- discussing and comparing what assessors do and why, and agreeing on procedures and interpretations of criteria.
- removing potentially ambiguous language and making criteria clearer (e.g. more explicit or specific, removing stretching)
- re-wording them to become less cumbersome and more workable.
- identifying which factors are most significant and writing them into the assessment tools. (For example, in an oral assessment of foreign language, spontaneity and understandability are often more important than grammatical accuracy.)
- Revising assessment materials by:
- developing and updating detailed training and assessment manuals
- filing or improving exemplars of benchmark materials
- filing examples of satisfactory performance and unacceptable performance, and recording the reasons for making those decisions.
- Moderating with industry, in which case it isn’t so much about ensuring consistency as about meeting industry standards
- Checking standards:
- Reviewing and interpreting Assessment Guidelines
- Re-checking evidence against key assessment criteria
- Examining assessor qualifications
- Examining assessment records and systems
- Researching industry standards
- Analyzing particular assessments that went to appeal and the appeals processes
If you are the only assessor in your field in your RTO, you might:
- do an Internet search of benchmarks and compare your assessments to the benchmarks.
- moderate with other RTOs.
- moderate with industry rather than other assessors, which will probably double up as your industry consultation.
Ongoing quality maintenance
Various strategies are available to you as the person ensuring that assessments are transparent and credible:
- You can provide assessors with written information about common pitfalls and errors that affect judgment.
- You can stimulate open and ongoing communication between assessors. This might be done through the ordinary staff meetings, moderation meetings, providing professional development activities, or linking people into assessor networks.
- You can get assessors to evaluate their own performance.
- You can maintain contact with industry representatives.
- You can use assessment panels or teams. They can work together to plan, critique, or review assessments.
- You can also arrange mentoring and coaching in assessment. This most likely affects new staff, especially those still teaching under supervision.
At organizational level
Assessment reviews are also an organizational responsibility, not just your personal responsibility as an assessor. The present trend is to require assessors to take a more active role in the quality management of programs.
Aspects of quality management that may affect you are:
- Assessment plans
- Assessment policies and procedures
- Assessment instruments
- Information for candidates
- Internal audit
- Stakeholder evaluations
- Record keeping
Accreditors sometimes conduct reviews of assessment in a number of RTOs, and your organization might be required to participate.
An example of how to do validation badly
A very simple kind of validation is review by staff meeting, also known as peer review. While this approach can be done well, it can also be done poorly:
Alex was the only lecturer in literature in the college, so he was always assigned to teach and assess all literature units by himself. That put him in a difficult situation: the standards meant only what he said they meant, so his assessments were biased toward his personal literary opinions. And if he became stuck, he had nobody to ask.
Last semester, some of Alex’s students thought several assessments were quite unfair, and many good students performed very poorly. Although they mentioned this in feedback, Alex thought it was because they hadn’t really grasped several important points. So he suggested no changes.
When review came up, he usually got the final say because other lecturers didn’t know enough about his field to suggest anything different. For the most part, staff meetings were not much more than a time of saying “Yes, we’re happy with what we do,” and validation reports more or less said just that.