Assessment can mean "appraisal" or "judgment of worth." It implies that there are relevant criteria or benchmarks. Assessment that is based on explicit criteria, standards or benchmarks is often known as "criterion-referenced."
Assessment also implies that there is some evidence that can be assessed. "Evidence" is information, materials, or products that show whether or not a student has the skills. For example, you could:
The assessment procedures in this workbook apply to all kinds of learning. In training, assessment is used to determine whether or not a student is competent in a particular skill. On the other hand, many campus programs teach through lectures and tutorials, give essays for assignments, and use written examinations. If you teach and assess this way, you will need to see how these principles apply to you.
Note: The term evaluation is used to refer to determining the quality of a whole program, so it now usually means something different from assessment.
The two main kinds of assessment are formative and summative. They have quite different purposes, but in practice formative leads into summative.
People take assessments for many purposes, for example:
The most interesting point is that you could use the same process for more than one purpose:
What about people who have never studied? Can they just apply for the assessment?
Yes. RTOs allow for the recognition of current competencies regardless of how they were acquired. This is called Recognition of Prior Learning (RPL). RTOs once had to offer it for all qualifications, and it is still encouraged.
People can learn skills through:
If someone has learnt the skill through any of these ways, or any other informal way, and meets the requirements to get into the program, then they must be allowed to be assessed. They can also be assessed for RPL to gain admission into the program.
Training organizations may not refuse assessment because an applicant did not attend the classes or training sessions. A student who passes the assessment is not differentiated in any way from those who sat through all the classes.
RPL affects assessors because assessors now have greater responsibility: the assessment is the only way to ascertain whether the student has the skills. In the past, the classroom teacher could monitor students' development through assignments, class activities and formative assessment, and so had a good indication of whether the assessment result was sound.
Without classroom monitoring, the assessor is responsible for ensuring the quality of assessment, and assessment becomes mainly summative.
RPL has another effect: RPL students and regular classroom students must sit equivalent assessments. You cannot make one more difficult than the other. For example, you cannot argue that because an RPL student's skills were not monitored in class, their assessment may be made much more difficult.
In the past, some Registered Training Organisations used unethical tactics to make sure that students could not pass an RPL assessment. They did this by:
RPL is not the same as transfer credit. The confusion can arise because one way to do an assessment is by a portfolio of documentary evidence (discussed later). Moreover, RPL makes it possible to recognize unaccredited study, as long as there is sufficient evidence of learning and an assessment is made.
RPL, transfer credit, and national recognition are quite different:
Portfolios of documents are still quite useful in some circumstances, but the trend is generally away from them. Current RPL procedures are more hands-on and on-site, including focused interviews, observations and walk-throughs.
RPL: like a job application
Here's an interesting thought on assessment. A Recognition of Prior Learning assessment can be much like applying for a job:
Applying for a job:
• Your CV outlines your experience and achievements with enough detail to convince the prospective employer that you are competent for the position. You list your qualifications and any other training, and you enclose references and a portfolio of work samples.
• In the interview, the employer sights the originals or certified true copies of your qualifications, and asks questions to test whether you really have the skills for the position.

Assessment:
• Your CV outlines your experience and achievements with enough detail to convince the assessor that you have the competencies of the qualification. You list your qualifications and any other training, and you enclose references and a portfolio of work samples.
• In the interview, the assessor sights the originals or certified true copies of your qualifications, and asks questions to test whether you really have the competencies.
Here are two different step-by-step procedures for RPL, one by portfolio and one by practicum, set out so they are easy to compare:

Portfolio:
1. Prospective student approaches the RTO.
2. RTO informs the prospective student about RPL for admission, units, qualifications, etc.
3. RTO assists the prospective student to write a full Curriculum Vitae and/or do a self-assessment.
4. RTO gives advice on what the prospective student should apply for (qualification, statement of attainment, admission). A prospective student seeking a qualification may need to take some units through instruction (non-RPL).
5. The prospective student applies to become a student.

Practicum:
1. Prospective student approaches the RTO.
2. RTO informs the prospective student about RPL for admission, units, qualifications, etc.
3. RTO does a preliminary estimation of the student's abilities.
4. RTO gives advice on what the prospective student should apply for (qualification, statement of attainment, admission). A prospective student seeking a qualification may need to take some units through instruction (non-RPL).
5. The prospective student applies to become a student.
6. RTO places the prospective student in a practicum placement. Depending on the kinds of skills to be demonstrated, it might range from several hours to several weeks.
7. Assessor assesses the student in the workplace.
8. Assessor gives results and feedback to the student.
9. RTO provides the qualification or statement of attainment, or confirms admission to the desired program.
10. The RTO reviews and monitors the process.
Training policies perceive a gap between:
• endorsed and accredited outcomes and assessment criteria, and
• making specific assessment decisions with real students.
They are concerned about assessment decisions being judgment calls made by assessors. For example, if a student appeals, can you prove your assessment was correct? Or was it just your best guess?
A great deal has been done to close the gap and take any guesswork out of the assessment decision. Nevertheless, solutions to the guesswork problem come up as a constant theme throughout VET sector assessment practice:
Training authorities tend to prefer holistic assessment. (Compare clustering: it's similar.)
Could you have one assessment activity that gathers evidence for many related elements? Holistic assessment means that one assessment of one complex activity might cover more than one element, or even more than one unit. Over-assessing is costly for you and frustrating for the student.
You might be able to save yourself and your students a lot of work by clustering assessments. The point is that students must be able to put all the different skills together to be able to do the job:
Example: A receptionist might be doing one job at the desk and be assessed for separate skills in OHS, communication, information management, and computing.
Example: A factory worker could use a piece of equipment correctly while complying with OHS rules.
Holistic evidence is good. A good holistic assessment is based on actual practice in a real workplace or a realistic simulation. Being part of one work role, the elements have common circumstances and probably overlap; students often perform skills from different units simultaneously.
In fact, a single assessment activity can combine knowledge, understanding, problem-solving skills, technical skills, OHS, language, literacy, numeracy, people skills, attitudes and ethics.
But you can't make a "holistic judgment." It wouldn't be responsible to make a gut-feeling assessment judgment ("Um, yeah, he/she can do the job") based on a whole-of-life report consisting of qualifications, training, and experience.
The problem is that the specific written requirements and the person’s performance don't clearly correlate. In other words, the assessment gap is too wide and the assessment is basically guesswork. To solve this problem, you should assess according to the actual requirements of the units.
The result for each unit will be that the students are either "Competent" or "Not yet competent". Many VET sector units are well suited to these assessment outcomes, and most assessors appreciate the simplicity.
About graded assessment
Many colleges accredited in the training sector give grades such as A, B, C, D, or E. Students, schools, employers, and universities all want them. Grades are necessary for students transferring to the higher education sector or internationally. Consider this scenario:
Q. Which applicants does the university accept? It has fifteen places in a course, and has twenty suitable applicants with VET sector qualifications. All students' units were assessed only as either Competent or Not yet competent.
A. None. It rejects all twenty students because it has no way of preferring one student over another. The places probably get taken by others further down the list.
You can make the system work in your students' favour. Make the standard for a distinction as clear as possible, and encourage students who have the ability to aim for the distinction.
Perhaps the biggest trap in designing a system is that students who might do well academically (and appear to be entitled to a high grade) might also be assessed as Not yet competent on the job. Consequently, for the student to pass the unit, your system must ensure that students are also assessed as competent on the job.
The story of graded assessment
Some training accreditors also allow grading, although the procedures vary between states. In WA, the Training and Accreditation Council no longer supports it as part of RTO activities, but there is nothing wrong with doing it as well. The Australian Skills Quality Authority (ASQA) has not yet published an opinion.
Unfortunately, for a long time, training sector accreditors did not recognize graded assessment at all, and some auditors would not let RTOs keep graded results, especially as a percentage. If they were allowed to keep them, they could only keep them for their own formative purposes, for references, and for transcripts that do not bear the logos of training accreditors. Some RTOs then had to run two sets of records because they needed graded results. When the procedure was introduced in WA, TAFE lecturers (and their union) frowned upon it because they thought it would take more time than they had. This was probably due to the very cumbersome way it was expressed.
The evidence is usually a direct correlate of the assessment mode. There are three kinds of evidence, although assessment authorities seldom agree on definitions:
• Direct evidence. Key feature: direct observation in a real situation. As assessor, you directly observe the student performing the skill in a real or simulated situation. (To be called a simulation, an assessment must be realistic enough to qualify as direct evidence.)
• Indirect evidence. Key feature: the assessor infers competence. As assessor, you infer competence from what the student has done (such as samples or written work). This may also include classroom-based assessments, and performance of conceptual skills in an interview or a test.
• Supplementary evidence. Key feature: you rely on someone else to inform your decision. This includes third party reports, references, and professional licenses. However, the assessor is still responsible for the final assessment decision.
Supplementary evidence is excellent and in no way inferior. Original signed documents with detailed statements of competence from responsible, competent persons are very appropriate. Those documents need to be either issued independently or authenticated, and need to be free of conflict of interest. They usually take the form of references or professional licenses from credible bodies with established standards. It is good practice to follow up references with a phone call, because people will often tell you things in person that they wouldn't put in writing.
Of course, some third party evidence lacks detail or credibility, and is best used simply to corroborate other evidence.
Some authorities count the student's claims to competence (CV, self-assessments, etc.) as supplementary evidence. These claims are not real evidence and should carry little weight in assessment results. They are most useful for establishing an appropriate assessment: in a taught program, they monitor the students' readiness for assessment; in an RPL situation, they indicate areas and level of ability, and possible sources of third-party evidence.
Most training packages leave it up to you to determine which kinds of evidence you will gather. Some, however, require you to gather a combination of direct, indirect and supplementary evidence, or specify that direct evidence must be collected.
Below are the main points of a code of practice for assessors, most of which reflect ethics in some way.
*Careful, confidential use of assessment results is imperative; an unwisely released "Not yet competent" result can have severe career ramifications for the student. There is no special exemption for the student's parents or employers, even if they have paid for the course.
Some of these are AQTF requirements and some only apply to specific packages.
Conflict of interest issues are a whole can of worms:
In some cases, you might have been so involved in helping a student assemble a portfolio or in supervising a project that you can no longer be the sole assessor. You can either get someone else to assess, or get their input to check your assessment.
Other legal and ethical issues can affect assessment. For example:
It is also unethical to:
According to the courts, training sector personnel can be liable for their assessments, unlike those in the Higher Education sector. The apparent reason is that the training sector certifies competence.
As a VET assessor, you can be liable if you certify students as competent for something that they cannot do.
Example 1
Assessment guidelines in the package required farm students to be able to use both motorcycles and quad bikes. However, the RTO only taught students to use quad bikes because motorcycle insurance was too expensive.
Imagine that a graduate is told to use a motorcycle on a farm. The boss is within his/her rights, because the graduate has a certificate that says he can. The graduate is afraid of losing his job, so he gets on the motorcycle and crashes, resulting in major injury.
The RTO assessor is personally liable for the injuries because he passed the student for that unit. He ignored the fact that he was certifying students as competent to ride motorcycles.
Example 2
In one state, aged care workers are not legally permitted to administer medication; nurses must do it. Here's a hypothetical case.
You certify a student in aged care work as competent to administer medication because it's written in the package. However, he does not actually have those skills. Then he moves to another state where their laws allow aged care workers to give medicines. He gets a patient's medication wrong and the patient dies.
You are personally liable because you certified the student as competent to do something that he could not do.
You are required to follow industry standard professional practice.
Teaching staff can be sued for damages arising from their failure to comply with current industry standards. In fact, you may still be vulnerable unless you follow world best practice. This means that presently you must comply with the Certificate IV in Training and Assessment procedures whether you hold that qualification or not.
Insurance won’t necessarily cover you.
Your liability insurance might not help. Insurers will invest substantial resources to avoid paying out a large claim if they think they shouldn't pay.
They probably won't cover you for liability if they can show you were negligent, didn't follow due process, or didn't try to identify and minimize risks. Consequently, you would be personally liable.
Detailed records are a good defense against litigation.
Keep records of evidence, not just for AQTF compliance, but for as long as there is an identified risk of litigation. You will need to be able to demonstrate "due process." The AQTF doesn't require students to sign the assessment, but a signature may be helpful in case of litigation.
I understand that this is an actual case ...
An RTO assessed a student as competent to use a large piece of expensive equipment. Soon after the assessment, the student was horsing around and wrote the equipment off. The employer took the RTO to court, claiming that it had incorrectly certified the student as competent.
The RTO brought out its records showing:
The RTO was found not liable, and the student was found personally liable. Although competent, the student was blamed for his own irresponsible behaviour.
Your surgeon might be licensed to give you brain surgery, but would you want him to do it if he knew that he couldn't?
Don’t assess outside your area of ability even if you have appropriate qualifications. If you know that a topic is outside your current abilities, then it is normally unethical for you to assess it alone. You might get advice or have a co-assessor.
Find out your own limits. For example, you may have limitations related to your ability in assessment procedures, quality processes or your own competency level. There can also be related legal responsibilities that you cannot meet.
Your RTO might also put some constraints on you: