The use of detailed assessment criteria for individual assignments is an important part of establishing the incremental attainment of graduate attributes. If graduate attributes have been embedded into the program curriculum, then, taken together, satisfactory performance by a student in all of the formal assessment activities should represent satisfactory attainment of the required program graduate attributes. Of course, it is often possible for a student to complete a unit of study by attaining the minimum pass mark without actually covering a particular attribute. A 'pass student' may progress through their entire program and successfully graduate having avoided a range of graduate attributes that were designed into the curriculum and dutifully assessed (Ferguson, 2001). It is important to distinguish between processes which ensure that a program will contain opportunities for students to learn and practise desired attributes, and processes which seek to certify actual student attainment of graduate attributes. The bottom-up, incremental assessment approach can be supplemented by other approaches that seek to measure student attainment of desired/required graduate attributes.
Student portfolios are another means by which individual attainment of graduate attributes can be assessed. Many professional accrediting bodies identify student portfolios as one possible strategy for demonstrating program outcomes and student attainment of graduate attributes. The benefits of portfolios are summarised as:
Importantly, for the task of assessing outcomes of an entire program of study, a portfolio can act as an integrator, bringing together and assessing the whole program (Manson, Pegler & Weller, 2004), including allowing students to demonstrate attainment of particular attributes that may not have been explicitly summatively assessed at any point during their prior studies (EPC Assessment Working Group, 2002).
It has been found that the portfolio requirements, and the structure/format in which portfolio items must be submitted, need to be designed around the intended use of the portfolio and made clear to students who will be using the portfolio (Heinricher et al., 2002). The additional effort of compiling the portfolio can be minimised by basing it around assessment items/artefacts already produced by students (Falk et al., 2002). Of course, this approach can only be employed if the assessment tasks undertaken by students clearly relate to the assessment of attainment of the required graduate attributes. It is well known that students take a strategic approach to study, and the learning activities they engage most fully with are those most clearly associated with what will be assessed (James, McInnis & Devlin, 2002). Not surprisingly, it has been observed that attaching assessment credit (marks) to the completion of portfolio tasks is an effective motivator for student engagement (Toohey, 2002).
While it is possible to employ a paper-based or hardcopy student portfolio, the increasing use of online technology by students and educators alike, including in assessment, means that many of the reported applications of student portfolios are online portfolios (or e-portfolios) (Love & Cooper, 2004; University of Sydney Faculty of Science, 2004; Williams & Sher, 2004). The suggested benefits of online portfolios include:
In addition to the direct assessment of individual student performance, there are other, less direct and longer-term means by which student attainment of graduate attributes, at least at the program level, can be measured or inferred. Program graduates/alumni can be surveyed for their perceptions of how effectively their studies equipped them with the required attributes, and the employers of graduates can be surveyed for their assessment of how well graduates exhibit the required attributes (Department of Chemical & Biomolecular Engineering, North Carolina State University, 2002; Faculty of Electrical Engineering, Universiti Teknologi Malaysia, 2007). Where student evaluation of teaching (SET) surveys include items relating to the development of graduate attributes, this data can provide a measure of the contribution of individual units to the development of program graduate attributes (Johnson, Gerstenfeld & Zeng, 2002).
Whole-of-program data can be obtained if the institution administers a course experience-style questionnaire that includes items relating to the development of graduate attributes (Bath, Smith, Stein & Swann, 2004). In Australia, a version of the Course Experience Questionnaire (CEQ) has been included in the Graduate Careers Council of Australia (GCCA) national survey of graduates from 1993 onward. The Generic Skills scale of the CEQ contains the following question items:
GS06 - The course helped me develop my ability to work as a team member
GS14 - The course sharpened my analytic skills
GS23 - The course developed my problem solving skills
GS32 - The course improved my skills in written communication
GS42 - As a result of my course, I feel confident about tackling unfamiliar problems
GS43 - The course helped me to develop the ability to plan my own work
The results from the Generic Skills scale of the CEQ can provide some measure of the effectiveness of individual programs in developing a range of generic graduate attributes in students (Lister & Nouwens, 2004).
In addition to individual assessment of set assignment tasks, does your School use any other method(s) for measuring student attainment of graduate attributes in the program(s) that you contribute to? If you are unsure, what methods could your School use?