Wednesday, 15 February 2012

Staff-student ratios: whose doubts?

The headline in this week's Australian Higher Education Supplement reads 'Doubts Raised over staff-student ratios'.

The story refers to the recent release by DEEWR of equity statistics in higher education in 2010. Skipping over the bit where the department was apparently late (is mid-Feb late for 2010 data?), the HES is more than a little vague about the 'doubts' about staff-student ratios.
Firstly, they hint that the data itself is doubtful, by highlighting the Minister's recent castigation of the 'quality and timeliness' of DEEWR data.

Then they quote a couple of professors to cast doubt on staff-student ratios. Richard James from Melbourne, a Professor of Higher Education, apparently told the HES that 'staff-student ratios were notoriously misleading'. Was he talking about this data? The HES says that the reason for their notoriety (according to the useful Professor James, they say) is that sessional staff are not included in the data. But they are.

Marcia Devlin, chair of higher education research at Deakin (who also keeps a great blog, by the way), the HES claims, suggests that staff-student ratios are not the best measure of good teaching.

The purpose of all this appears, on the surface, to be a jibe at DEEWR for their data. But it also seems intended to undermine the value of staff-student ratios in the first place.

Staff-student ratios seem an old-fashioned measure. The higher education sector generally seems to prefer softer methods of evaluating good teaching - which it then converts into numbers - over hard facts that cost hard money. Nevertheless, every student knows that the more teachers you have, the better your learning experience is likely to be.

But for universities and governments, staff-student ratios highlight the uncomfortable fact that quality and financial input are actually related.

For The Australian, I suspect that staff-student ratios - hard, comparative data that academic staff can take to their institutions - seem a bit too Bolshie.

But let's get to the facts. The DEEWR data does include casual teaching: so when one university has a ratio of one-to-47.62, there are no sessional teachers hiding to provide additional expertise. What it doesn't easily show is the proportion of research to teaching - but even if your teachers are doing a lot of research, that may well be a good thing for students.
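The arithmetic behind these figures is simple: student load divided by teaching staff, with sessionals counted in. A minimal sketch, using illustrative numbers (not actual DEEWR counts), shows why including casual staff matters:

```python
# Staff-student ratio: equivalent full-time student load divided by
# full-time-equivalent (FTE) teaching staff, sessionals included.
# All figures below are illustrative, not actual DEEWR data.

def staff_student_ratio(student_load: float, permanent_fte: float,
                        sessional_fte: float = 0.0) -> float:
    """Return students per one FTE staff member, counting sessionals."""
    total_staff = permanent_fte + sessional_fte
    return student_load / total_staff

# Leaving sessional staff out makes a university look worse:
print(round(staff_student_ratio(10000, 350), 2))       # 28.57
# Counting them, as the DEEWR data does, lowers the ratio:
print(round(staff_student_ratio(10000, 350, 150), 2))  # 20.0
```

The point is that the published ratios already have this correction built in: a one-to-47.62 figure is not hiding an army of casual tutors.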

Students benefit not only from the connection between research and teaching and regular contact with researchers, but also from the reputation of their institution. Being able to say you studied at the place where Professor X discovered (insert famous discovery) enhances the value of your degree. This might seem spurious, but we all know (and Simon Marginson has told us) that the economic value of the degree is not only in what you have learned: it is also a 'positional good', something that symbolises your value relative to others.

In an era where governments and institutions insist that 'quality' and 'excellence' be converted to numbers, staff-student ratios are exactly the numbers that potential students, selecting their preferred institution, should use. The government should too, but they might not like the funding implications. Higher education researchers could usefully analyse the relationship between student performance (not just satisfaction) and staff-student ratios.

So, let's give the data The Australian didn't.

The top ten performers on staff-student ratio in 2010 were:

Batchelor Institute of Indigenous Tertiary Education
Melbourne College of Divinity
Bond University 
The University of New South Wales
University of Tasmania
The University of Sydney
The Flinders University of South Australia
Monash University
The Australian National University
The University of Melbourne

The range is wide: 5.62-18.39. The top two are special institutions that perhaps have particular reasons for a lot of staff compared to their students. Bond University has a ratio of one-to-15.8, so the 'real' range for top performers is 15.8-18.39. Good on them. We might note that they include our wealthiest universities.

Let's take a look at the bottom ten. From best to worst:

Deakin University
Victoria University
University of Canberra
University of Ballarat
Southern Cross University
Edith Cowan University
Charles Sturt University
Macquarie University
University of Western Sydney
Central Queensland University

There are some good universities in there. Their staff must be working very hard. CQU is at one-to-47.62, but the rest range between 23.67 and 29.23. Perhaps there was something wrong with the CQU data. Or a lot of people are doing a lot of self-directed learning in central Queensland.

The middle section of the data is more numerous and more clustered than both ends, ranging from one-to-18.5 to one-to-22.68.

If it were up to me, this would be the measure universities start with when they advertise, when they showcase their quality and make claims about the resources they offer potential students. It would also be the one governments use to evaluate whether the sector is sufficiently funded. It is not the only measure, of course: a lot of bad teachers don't trump a smaller number of good ones. But the chances of finding a good one are better when there are more.

It represents a simple, measurable and useful question: are there enough teachers?
