Validation of assessment

Ross Woods, 2018

Validation is a meeting in which two or more assessors review assessment. They need to bring to the meeting the standards (e.g. the units), some student evidence (it may be samples), and the assessment tools. Validation is also called moderation, but if you're a school teacher, the kind of comparative activity normally called "moderation" in schools is something different and probably won't meet VET standards.

The Australian training sector has changed its definition and requirements for validation quite erratically in the last ten years, so this can only be a general guide. To make it more confusing, validation has two separate meanings:

  1. Writing a document that confirms that an assessment procedure covers all requirements.
  2. Writing a document that evaluates assessment judgements, which is the meaning used in the remainder of this paper.

The people in validation meetings attempt to answer some particular questions:

  1. Was the assessment process valid, reliable, flexible, and fair?
  2. Were the assessment tools/instruments appropriate?
  3. Was the evidence appropriate?
  4. Was the evidence interpreted correctly to make a judgment of competence?
  5. What improvements need to be made?

The validation is not complete until they implement the improvements and write down what they did.

Validation has these elements:

  1. multiple assessors
  2. standards (e.g. the units)
  3. evidence
  4. assessment tools
  5. industry requirements
  6. evaluation of what has been done to identify improvements
  7. implementation of improvements
  8. a written record

Why do it?

Its main purpose is to improve the quality and consistency of assessment, although it may serve other purposes as well.

When do you do it?

It is still good practice to do validation at least once a year.

Because validation has a review component, it comes after the assessment. Besides, you might not yet have any evidence beforehand, so you couldn't do it earlier.

Special Cases

You should validate assessment as a routine procedure, and you'll seldom have a problem. But every now and then, somebody will raise a concern about an assessment.

When you meet to resolve the matter, go through the problems and get workable answers. Write notes of the meeting, call them validation meeting notes, and file these with your other validation records. This is doing validation because you really, really need to do it. Consequently, these kinds of validation results are usually more valuable to everybody.

 

Do your preparation

You will need to schedule the process and work within agreed timeframes, and you'll need the interpersonal skills to work with a wide range of people.

First, say why you are doing the validation.

Second, say what you are validating. A whole qualification? Particular units?

Third, get records. These should include samples of student evidence and the completed assessment tools.

Then, get hold of the feedback from clients, students and other instructors and assessors. This should be recorded, most likely in feedback forms from students and perhaps in assessment forms. Your organization might also have other kinds of feedback systems such as surveys or focus groups.

If you use validation forms such as those at the end of this chapter, you will be better able to organize everything you gather. The format should help you analyze and discuss what you have. You might also want forms or proformas for reporting to management or observing assessments.

Fourth, get hold of your organization’s system for quality assurance, assessment procedures, and continuous improvement. It might be a written policy, a procedures manual, or a set of forms. It probably uses particular strategies or standards. It may also set out particular goals the organization is aiming for. To use it in your validation, you will have to be sure you understand what it requires of you.

Discuss your organization’s quality assurance system with your supervisor to find out what particular expectations he/she has of you.

It is quite likely that some organizational requirements affect the validation. For example, the time and resources allocated to validation may already be defined, and you may have to follow particular procedural constraints.

You will find other principles that could affect what you do, such as ethical standards, confidentiality, OHS, access, and equity.

Fifth, define the organizations and people that will be involved.

What about the individuals? How will you contact them? These must include other assessors, some of whom might be peers from other training organizations.

If you are the only person in your organization who assesses in your field, you must do validation with assessors in other organizations.

Sixth, analyze the benchmarks against which you are assessing. What do you conclude? (For example, you might not be covering all requirements equally well.)

Seventh, when and where will you meet?

Eighth, get agreement on how the validation will be run.

Ninth, confirm the arrangements with supervisors and the other assessors.

 

Doing validation

Back to the simple principle of doing what you planned. If your RTO has a range of assessors who are assessing against the same competency standards, have a look at the evidence that they assessed. These questions will be a good guide:

  1. How consistent are the assessments? Are there good reasons for any variations?
  2. Are assessment judgments reliable and consistent?
  3. Have all reporting and recording requirements been met?
  4. How do the assessment tools compare against national criteria for assessment materials?
  5. Are records accurate? Do they meet the requirements of the current training sector standard?
  6. Does your procedure work?
  7. Is any information inconsistent, ambiguous, or contradictory?
  8. Does it work equally well for each student or each environment in which you will conduct assessments?
  9. Could you achieve better cost-effectiveness?
  10. Were there lessons you learned or innovations you made that you could use in other situations?
  11. Suggest any necessary changes. What would you do differently next time?

Specific things to do

It isn't enough for you simply to participate. You'll need to play your part in determining the results. Join in the discussions, analyze the findings, and contribute to the group agreement to improve assessment quality. Take part in making recommendations and writing them down.

Recommendations for improvement may include anything helpful. Assessors might need better professional development. You might change policies and procedures, strategies, assessment plans, methods, records management, resources, or evidence collection. You might want to change tools or develop new ones.

You might change partnership arrangements or form new partnerships. You might recommend giving more or better information to assessors or students, or giving better advice, support, or supervision to assessors. You might instigate a system of exemplars. You might find that liaison with technical experts is most helpful.

Here are some specific things that you'll want to examine when you do validation:

  1. Specify exactly the difference between acceptable and unacceptable performance wherever unit statements don’t make it clear enough, and define those differences clearly and concisely. It might help to file examples of students' work.
  2. Check the principles of assessment and rules of evidence.
  3. Check all documents for accuracy and version control.
  4. Make clear interpretations. Many outcomes and assessment criteria need “interpreting” to be useful. How much is “adequate”? How long should a research paper be? You might do the interpretation by adding more explicit criteria or further explanations for students. Of course, avoid very long explanations that confuse the point.
  5. Identify significant assumptions. One lecturer found that all his assessments worked very well using the assigned unit statements. However, he was assuming that the context of assessment was always the same, that is, the work requirements of his particular organization. It was quite valid, but the assumption needed to be made explicit.
  6. Suggest better procedures. If variations in assessment procedures are the problem, suggest a way of making them more consistent.
  7. Anticipate problems and do something about them as early as possible. Sounds obvious, but it just goes to show that validation is another form of risk management.
  8. Take good notes. Why bother with validation if it’s just talk? Notes should be more than personal scribble; they need to be put on record.

Your validation will almost certainly also lead to better unit statements, better assessment procedures, and more useful assessment criteria. Students will then have a better idea of what to expect in assessment, and you will have a better idea of what works and what doesn’t. You might record your validation as changes in workbooks.

Some interpretations are so clearly implied that you won’t include them in the unit statement, but you should still write them down in your validation notes. In other cases, satisfactory requirements were written down all along, but you realize that you’ve been less diligent lately and simply need reminding that they are still important.

Making changes

So far so good. But the point of it all is to make improvements, and you still have to use all this to improve your teaching and assessing.

Decide what changes need to be made. You might be able to make the changes at the validation meeting. For example, you might edit the tools straight away, or send your manager a request for a change of assessment procedures. If you can't make the changes straight away, have a way of reporting back so you can document what you did.

Documentation and reporting

It’s now time to report what you have found. Find out who you need to report to. You should report in writing and include any recommendations you may have to improve the quality and consistency of assessment. You should also give constructive feedback to other assessors.

You may also have the opportunity to report in other ways.

 

Other things that you might do

As we said, you'll review assessment tools, collected evidence, assessment decisions, and records. Beyond that, you have a wide toolkit of activities you might do as part of validation. You probably won't need all these approaches, but some of the main ways of going about it are:

  1. Working with another assessor, see if you both produce the same conclusions from the same evidence. (Another variation: have someone else re-assess your evidence to see if they reach the same conclusions.)
  2. Process feedback from clients, students, and other assessors.
  3. In discussion with colleagues, improve the assessment criteria.
  4. Revise the assessment materials.
  5. Moderate with industry, in which case it isn’t so much about ensuring consistency as about meeting industry standards.
  6. Check the standards themselves.
  7. Analyze particular assessments that went to appeal, and the appeals processes.

If you are the only assessor in your field in your RTO, arrange to do validation with assessors from other organizations.

 

Ongoing quality maintenance

Various strategies are available to you as the person ensuring that assessments are transparent and credible.

At organizational level

Assessment reviews are also organizational, not just your personal responsibility as an assessor. The present trend is to require assessors to take a more active role in the quality management of programs.

Several aspects of quality management may affect you.

Accreditors sometimes conduct reviews of assessment in a number of RTOs, and your organization might be required to participate.


An example of how to do validation badly

A very simple kind of validation is review by staff meeting, also known as peer review. While this approach can be done well, it can also be done poorly:

Alex was the only lecturer in literature in the college, so he was always assigned to teach and assess all literature units by himself. That put him in a difficult situation: the standards meant only what he said they meant, so his assessments were biased toward his personal literary opinions. And if he became stuck, he had nobody to ask.

Last semester, some of Alex’s students thought several assessments were quite unfair, and many good students performed very poorly. Although they mentioned this in feedback, Alex thought it was because they hadn’t really grasped several important points. So he suggested no changes.

When review came up, he usually got the final say because other lecturers didn’t know enough about his field to suggest anything different. For the most part, staff meetings were not much more than a time of saying “Yes, we’re happy with what we do,” and the validation reports more or less said just that.