To cite this article: Anthony Watson (1994) Strategies for the Assessment of Competence, The
Vocational Aspect of Education, 46:2, 155-165, DOI: 10.1080/0305787940460205
ANTHONY WATSON
University of Technology, Sydney, Australia
ABSTRACT The paper examines three different strategies for the assessment
of competence. The strategies are different because they base their
judgements about competence on different forms and amounts of evidence
and they adopt different procedures for obtaining this evidence. Each
strategy is illustrated by means of an analysis of an assessment programme
in operation. The programmes are drawn from an investigation of
competency-based training programmes in TAFE colleges, industry and
various training institutions carried out in Australia and Great Britain during
1992. The analysis brings out the strengths and weaknesses of each strategy
as well as the particular problems and difficulties associated with the
assessment of competence and the procedures that must be followed to
ensure that principles of sound assessment are adhered to.
Introduction
The agenda for the reform of the Australian Vocational Education and
Training System is now well established. The National Training Board has
been established to foster the development of competency-based
standards for all occupations and classifications covered by industrial
awards and agreements and to endorse them within a national framework.
National working parties have been set up to develop competency-based
training guidelines. In 1991, agreement was reached on a National
Framework for the Recognition of Training (VEETAC Working Party on the
Recognition of Training, 1991). Australian TAFE authorities are committed
to the introduction of competency-based training systems for all
recognised courses (Kinsman, 1992, p. 25).
There are, however, a number of issues which remain largely
unresolved about the nature and implementation of competency-based
vocational education and training. One of the most important of these
based. Assessments based on these criteria are generally made on a
pass/fail basis. Students are generally assessed individually when they are
ready and judged as 'competent' or 'not competent'. Complex records of
progressive achievement are required, as individual retries may be
allowed on assessment events until competence is achieved. This
approach is generally found in formal college or off-job training settings
and is often carried out on behalf of industry. It was employed in some
form in all of the TAFE and further education (FE) colleges in the
investigation that had competency-based training programmes in operation.
The assessment strategy employed in the Hairdressing Certificate
programme at Box Hill College of TAFE, Victoria, exemplifies this
approach. Assessment in this programme is based on the observation of
samples of job performance carried out on specially-designed practical
tests or tasks which involve the basic operations of hairdressing (cutting,
styling, waving, colouring and basin service). These are supported by
supplementary evidence from theory tests which assess underlying
knowledge. The assessments are generally based on individual elements
of competence such as 'shampoo and condition hair'. Theory tests are
generated and marked by a computer program (a pass mark of 80% is
required). Students are allowed three tries before being 'locked out' for
further instruction. Practical tests employ checklists of performance
criteria which are also generated by the computer program. They usually
follow the theory test and are checked off by the teacher. When all
requirements of an element are satisfactorily completed, students are
recorded as 'competent' for that element. In addition, there are so-called
'retention tests' or assignments at regular intervals and a final assessment,
which attempt to draw together and integrate a number of elements.
Records are computer-managed and records of progress are available to
students. These are based on a management chart which outlines the
requirements for the course.
This approach to the assessment of competence is typical of most
college-based or off-job programmes. Validity depends on the extent to
which assessment events do in fact assess the relevant elements of
competence to the standards prescribed. It also depends on the quality
and range of facilities at the college and the capacity of the college to
simulate real workplace conditions and events. Reliability depends on the
usual factors which affect criterion-referenced assessment, such as the
quality of the assessment items, objectivity in marking, and the adequacy
of marking guides and checklists.
There are, however, certain problems peculiar to this approach.
These must be overcome if effective judgements about competence and
effective records of competence achieved are to be provided.
The first is a tendency to fragment or atomise the assessment of
competence and to assess many separate elements at the expense of more
holistic assessment. At Box Hill this problem is tackled through the
Conclusion
It is worth restating that the assessment of competence is all about
obtaining evidence on which valid and reliable judgements about
competence might be made. Mitchell reminds us that "competence is the
ability to perform to the standards expected in employment" and that,
therefore, "performance evidence must be the prime candidate for
consideration, with assessment in the ongoing course of work the one that
is most likely to offer the highest validity" (1989, p. 61). When this
evidence is insufficient or unavailable, however, supplementary evidence
must be found through samples of performance observed elsewhere and
the assessment of an individual's knowledge or related cognitive
performance. It may well be, as Jack et al. (1993, p. 22) conclude, that
"evidence of workbased performance and of relevant, underpinning
knowledge and understanding are both necessary in order to demonstrate
full competence capable of being sustained across a range of contexts".
The strategies examined in this paper provide alternative
approaches to obtaining this evidence, given different types and levels of
competence being assessed and different practical circumstances. They
also serve to highlight some of the difficulties and problems associated
with the assessment of competence and bring into focus some of the
procedures that must be followed to ensure that the principles of sound
assessment are adhered to.
Acknowledgements
For the purposes of this paper, the author made use of materials and
unpublished papers generously made available by the following people:
Mr Ralph Dutneal and Mr Roger Said of ICI, NSW, Australia; Mr Ray Seary
and Mr Terry Walker, TAFE consultants to ICI; Mr Leon Hockaday and Ms
Kerri Ferguson of Box Hill College, Victoria, Australia; Ms Polly Carter, of
Thames Training, University of Greenwich, United Kingdom.
Correspondence
Professor Anthony Watson, School of Adult Vocational Education,
University of Technology Sydney, PO Box 123, Broadway, NSW 2007,
Australia.
References
Burke, J.W. (Ed.) (1989) Competency Based Education and Training. London: Falmer
Press.
Dutneal, R. & Said, R. (1992) Competency based training: an ICI case study,
unpublished paper, ICI Botany, NSW.
Gealy, N., Johnson, C., Miller, C. & Mitchell, L. (1991) Designing assessment
systems for national certification, in E. Fennell (Ed.) Development of
Assessable Standards for National Certification, pp. 35-51. Great Britain:
Employment Department.
Jack, M., Goodman, H. & Newbery, G. (1993) Assessment of learning outcomes: the
BTEC experience, Competence and Assessment, Issue 21,
pp. 18-22.
Kinsman, M. (1992) Competency based education and TAFE, in Higher Education
and the Competency Movement: implications for tertiary education and the
professions. Canberra: Australian National University.
Mansfield, R. (1991) Deriving standards of competence, in E. Fennell (Ed.)
Development of Assessable Standards for National Certification, pp. 12-24.
Great Britain: Employment Department.
Masters, G.N. & McCurry, D. (1990) Competency-based Assessment in the
Professions, National Office of Overseas Skills Recognition, Research Paper
No. 2. Canberra: AGPS.
Mitchell, L. (1989) The definition of standards and their assessment, in J. W. Burke
(Ed.) Competency Based Education and Training, pp. 54-64. London: Falmer
Press.
National Assessment Research Forum Report (1993) Competency-based Assessment.
Canberra: Department of Employment, Education & Training.
National Council for Vocational Qualifications (1991a) Criteria for National
Vocational Qualifications. Great Britain: Employment Department.
National Council for Vocational Qualifications (1991b) Guide to National Vocational
Qualifications. Great Britain: Employment Department.
National Training Board (1992) National Competency Standards Policy and
Guidelines, 2nd edn. Canberra: National Training Board.
VEETAC Working Party on the Recognition of Training (1991) National
Framework for the Recognition of Training. Canberra: AGPS.
VEETAC Working Party on the Implementation of Competency-based Training
(1992) Assessment of Performance under Competency-based Training. Canberra:
AGPS.
VEETAC (1993) Framework for the Implementation of a Competency-based
Vocational Education and Training System. Canberra: AGPS.
Whiteley, J. (1991) Assessment Models appropriate for Competency-based Training
and their Relationship to Teaching and Learning Approaches. Queensland:
Research and Development Division, Office of Vocational Education, Training
and Employment Commission.