What is validation?

Validation is a meeting at which two or more assessors review assessment. (Validation is also called moderation.) They need to bring to the meeting the standards (e.g. the units), some student evidence (samples may be enough), and the assessment tools. RTOs must normally do validation at least once a year.

They need to answer some particular questions:

  1. Was the assessment process appropriate?
  2. Were the assessment tools/instruments appropriate?
  3. Was the evidence appropriate?
  4. Was the evidence interpreted correctly to make a judgment of competence?
  5. What improvements need to be made?

The validation is not complete until they implement the improvements and write down what they did.

Validation has these elements:

  1. multiple assessors
  2. standards (e.g. the units)
  3. evidence
  4. assessment tools
  5. industry requirements
  6. evaluation of what has been done to identify improvements
  7. implementation of improvements
  8. a written record

Why do it?

Its main purpose is to improve the quality and consistency of assessment. Other than that, its purpose may be any of the following:

When do you do it?

You can validate:

Because there is a review component, this Cert IV TAA approach places validation after the assessment. Besides, you might not yet have evidence, so you can't do it beforehand.

 

Special Cases

You should validate assessment as a routine procedure, and you'll seldom have a problem. But every now and then, somebody will be concerned about assessment. They might:

When you meet to resolve the matter, go through the problems and get workable answers. Write notes of the meeting, call them validation meeting notes, and file these with your other validation records.

This is validation you do because you really, really need to do it. As a result, the results of this kind of validation are usually more valuable to everybody.

Do your preparation

You will need to schedule processes and to work within agreed timeframes, and you'll need the interpersonal skills to work with a wide range of people.

First, say why you are doing the validation.

Second, say what you are validating. A whole qualification? Particular units?

Third, get records. These should include samples of:

Then, get hold of the feedback from clients, students and other instructors and assessors. This should be recorded, most likely in feedback forms from students and perhaps in assessment forms. Your organization might also have other kinds of feedback systems such as surveys or focus groups.

If you use validation forms such as those at the end of this chapter, you will be better able to organize everything you gather. The format should help you analyze and discuss what you have. You might also want forms or proformas for reporting to management or for observing assessments.

Fourth, get hold of your organization’s system for quality assurance, assessment procedures, and continuous improvement.

It might be a written policy, a procedures manual, or a set of forms. It probably uses particular strategies or standards. It may also set particular goals that the organization is aiming for. To use it in your validation, you will have to be sure you understand what it requires of you.

Discuss your organization’s quality assurance system with your supervisor to find out what particular expectations he/she has of you.

It is quite likely that some organizational requirements affect the validation. For example, the time and resources allocated to validation may already be defined, and you may have to follow particular procedural constraints such as:

You will find other principles that could affect what you do, such as ethical standards, confidentiality, OHS, access, and equity.

Fifth, define the organizations and people that will be involved.

The organizations might be:

What about the individuals? How will you contact them? These must include other assessors, some of whom might be peers from other training organizations.

If you are the only person in your organization who assesses in your field, then for the Certificate IV you must do validation with assessors from other organizations.

Sixth, analyze the benchmarks against which you are assessing. What do you conclude? (For example, you might not be covering all requirements equally well.)

Seventh, when and where could you meet?

Eighth, get agreement on:

Ninth, confirm the arrangements with supervisors and the other assessors.

Doing validation

Back to the simple principle of doing what you planned.

If your RTO has a range of assessors who are assessing against the same competency standards, have a look at the evidence that they assessed.

Specific things to do

It isn't enough for you simply to participate. You'll need to play your part in determining the results. Join in the discussions, analyze the findings, and contribute to the group agreement to improve assessment quality. Take part in making recommendations and write them down.

Recommendations for improvement may include anything helpful. Assessors might need better professional development. You might change policies and procedures, strategies, assessment plans, methods, records management, resources, or evidence collection. You might want to change tools or develop new ones.

You might change partnership arrangements or form new partnerships. You might recommend giving more or better information to assessors or students, or providing better advice, support, or supervision for assessors. You might instigate a system of exemplars. You might decide that liaison with technical experts would be most helpful.

Here are some specific things that you'll want to examine when you do validation:

  1. Specify exactly the difference between acceptable and unacceptable performance wherever it isn’t clear enough in unit statements. It might help to file examples of students' work. Define those differences clearly and concisely.
  2. Check the principles of assessment and rules of evidence.
  3. Check all documents for accuracy and version control.
  4. Make clear interpretations. Many outcomes and assessment criteria need “interpreting” to be useful. How much is “adequate”? How long should a research paper be? Interpreting may mean adding more explicit criteria or further explanations for students. Of course, avoid very long explanations that confuse the point.
  5. Identify significant assumptions. One lecturer found that all his assessments worked very well using the assigned unit statements. However, he was assuming that the context of assessment was always the same, that is, the work requirements of his particular organization. The assumption was quite valid, but it needed to be made explicit.
  6. Suggest better procedures. If variations in assessment procedures are the problem, suggest a way of making them more consistent.
  7. Anticipate problems and do something about them as early as possible. It sounds obvious, but it goes to show that validation is another form of risk management.
  8. Take good notes. Why bother with validation if it’s just talk? Notes should be more than personal scribbles; they need to be put on record.

Your validation will almost certainly also lead to better unit statements, better assessment procedures, and more useful assessment criteria. Students will then have a better idea of what to expect in assessment, and you will have a better idea of what works and what doesn’t. You might record your validation as changes in workbooks.

Some interpretations are so clearly implied that you won’t include them in the unit statement, but you should still write them down in your validation notes. In other cases, satisfactory requirements were written down all along, but you realize that you’ve been less diligent lately and simply need reminding that they are still important.

 

Making changes

All very well. But you still have to use all this to improve your teaching and assessing. The point of it all is to make improvements.

Decide what changes need to be made. You might be able to make the changes at the validation meeting. For example, you might edit the tools straight away, or send your manager a request for a change of assessment procedures. If you can't make the changes straight away, have a way of reporting back so you can document what you did.

Documentation and reporting

It’s now time to report what you have found. Find out who you need to report to. You should report in writing and include any recommendations you may have to improve the quality and consistency of assessment. You should also give constructive feedback to other assessors.

You may also have the opportunity to report by:

Tip 1: If you're a school teacher, the kind of comparative activity normally called "moderation" probably won't meet VET standards.

Tip 2: Lots of instructors want to compare ways of teaching particular topics. It's a good thing to do, but it's not validation of assessment.

 

Other things that you might do

As we said, you'll review assessment tools, collected evidence, assessment decisions, and records. Beyond that, you have a wide toolkit of activities you might use as part of validation. You probably won't need all these approaches, but some of the main ways of going about it are:

  1. Working with another assessor, see if you both produce the same conclusions from the same evidence. (Another variation: have someone else re-assess your evidence to see whether they reach the same conclusions you did.)
  2. Process feedback:
  3. In discussion with colleagues, improve criteria by:
  4. Revise assessment materials by:
  5. Moderate with industry; in this case it isn’t so much about ensuring consistency as about meeting industry standards
  6. Check standards:
  7. Analyze particular assessments that went to appeal and the appeals processes

If you are the only assessor in your field in your RTO, you might:

Ongoing quality maintenance

Various strategies are available to you as the person ensuring that assessments are transparent and credible:

 

An example of how to do it badly

About organizational review