Inside Story

My School and your school

My School promises to compare like with like, but a close look at thirty-six “average” schools reveals the limitations of this way of measuring achievement, writes Chris Bonnor

Chris Bonnor 24 February 2010 4870 words




THE LAUNCH of the My School website at the end of January was accompanied by fanfare and media hype over what it said about schools, and an equal amount of hype about what it didn’t say. Now that the dust has settled it’s time to have a clear look at what My School actually can and can’t tell us about any school.

In one sense the website doesn’t say anything new. Just about everything it contains, including results of the National Assessment Program – Literacy and Numeracy, or NAPLAN, could already be found in school annual reports. But the information is now available at the click of a mouse, rather than at the end of a sometimes vain search of school websites for copies of annual reports. It is now easy to find statistics about such things as the enrolment, attendance and staffing of each school.

It is this easy access to each school’s NAPLAN results and the claims that My School lets you fairly compare schools that have raised the greatest interest and controversy. NAPLAN tests are conducted in May each year and much has been written about whether these tests say anything meaningful about a school. Secondary school principals are still trying to work out how the Year 7 NAPLAN tests, conducted just three months after students start high school in most states, can be used to attribute success or failure to their new school.

The chair of the board of ACARA, the Australian Curriculum, Assessment and Reporting Authority, Professor Barry McGaw, clearly states the challenge My School faces. He has repeatedly said that 70 per cent of the differences between schools are explained by whom they enrol rather than by what the school does. This means that anyone wanting to compare what schools actually achieve has to be very careful to discount this enrolment factor. If ACARA hasn’t been able to get this right then the school comparisons derived from My School have little validity.

To enable schools to be compared in any meaningful way ACARA has created an Index of Community Socio-Educational Advantage. Professor McGaw has boldly claimed that this index enables fair and accurate school comparisons, and the My School website says: “You can quickly locate statistical and contextual information about schools in your community and compare them with statistically similar schools across the country.” But Professor McGaw was more cautious when giving evidence before a Senate estimates hearing earlier this month. “We’re not comparing schools; we’re not saying these schools are like one another in size, locality or physical resources. We’re saying the families from which these students come are equivalent,” he said.

Even that claim is hard to justify, however. The index is not made up of the family characteristics of the students who attend particular schools. It is an index of the average characteristics of local census collection districts, which are made up of a couple of hundred houses. ACARA’s use of this data assumes that if children from a specific collection district go to different schools then we can compare the schools because the children are assumed to be all the same – and we can safely compare them with similar neighbourhoods and schools across Australia. (A fuller account of the index appears at the bottom of this article.)

But families in any given local census district don’t all have the same income levels, number of parents, employment, level of education and so on. And families have a host of unmeasurable qualities that certainly affect the schools their children attend. Even the most similar looking families can range from the highly engaged to the highly dysfunctional. The children from these different families often go to different schools, taking with them the attributes of their family and not the average attributes of the neighbourhood.

The problems with the Index of Community Socio-Educational Advantage don’t end there. Differences between schools are not only derived from the children who enrol; they are also derived from a host of school-level factors. Some of these – such as the quality of teaching and learning – are substantially within the control of each school. Others – including how selective they can be in enrolling students, the ethnic mix and origins of students, the boy–girl ratio and fluctuations in characteristics from year to year – are outside the control of most schools.

All of these are important factors – none is included. While the index might include “measures that are highly correlated with student performance,” it simply doesn’t include all measures of school performance. ACARA doesn’t know the extent to which census collection district data accurately describes the enrolment of any particular school, and it doesn’t know enough of the specifics about the students who actually achieve the marks being so freely attributed, by My School and subsequent league tables, to each school.

It would be easy to attack the My School website and the resulting very questionable school comparisons by drawing attention to the obvious problems in the coupling of some schools, which have received media attention since the site was launched. For weeks now principals and teachers have been dining out on stories about the bizarre matching of selective schools with even nearby non-selective schools, of some of our biggest schools with some of our smallest, of schools of the air with schools on the ground.

Everyone knows that similarities between such coupled schools, mechanically bolted together by ACARA, defy belief. But it is probably more revealing to look at the similarity, or otherwise, of the apparently mainstream schools that have been given the same index rating. Does any random selection of “statistically similar” schools stand up to scrutiny?

To find out we examined the thirty-six secondary schools in Australia that have been given the mean Index of Community Socio-Educational Advantage of 1000 – our “ICSEA 1000” group. This should help eliminate some of the unusual schools and comparisons at either end of the index’s range. Despite stories about schools not being transparent, most of the information we used to compare the ICSEA 1000 schools was found on school websites. We also invited the principals of the schools to complete a brief online survey and some provided additional information. (For the purpose of this study a central or area school with secondary enrolments is considered to be a secondary school. Not all schools that are considered to be statistically similar have exactly the same index rating, hence some of the thirty-six in this study have an index rating of 999 or 1001.)

We combined this information with the Year 9 NAPLAN scores and other school-specific data available on My School. Year 7 test scores weren’t used, as they say little or nothing about secondary schools. We considered the school test scores and subsequent rank in the context of a number of school-level factors, which may impact on a school’s apparent success. These include the role played by school fees, discrimination in enrolments, gender balance, ethnic balance and origin, and fluctuating results.

It could be argued that the sample of thirty-six secondary schools is too small or the use of single tests or averages too crude a measure. Both criticisms could equally be levelled at ACARA and the league tables it has spawned. The difference is that this article raises questions while My School purports to come up with definitive answers.

The problem of fees

IN ANY COMMUNITY the charging of fees by some schools serves to divide enrolments along a number of fault lines, especially family income. Australian Bureau of Statistics data, for example, shows that 26 per cent of students at independent schools were from high-income households, compared with 16 per cent at Catholic schools and only 8 per cent at government schools. Even Cardinal Pell has lamented the fact that Catholic schools enrol a disproportionate number of middle-class students. Precisely to avoid this happening, countries such as New Zealand prohibited the charging of fees by church schools when they became government-funded in the 1970s.

So comparing schools across this divided landscape is odd, to say the least. Why would ACARA assume that the national breakdown of school enrolments by family income isn’t replicated, even to some extent, within census collection districts? The charging of fees also distorts the enrolment of identifiable groups of students. While the enrolment of Indigenous students is factored into ACARA’s index, fee-charging schools inevitably and disproportionately enrol Indigenous students from higher-income families.

Is it possible that the fact that some schools charge fees distorts any comparisons among our ICSEA 1000 schools? A majority of the non-government schools on our list are ranked in the top half in the Year 9 NAPLAN tests, for example, but is this because they do something special as schools or because the charging of fees helps deliver an advantaged enrolment? My School can’t tell us; ACARA won’t even try.

What are the fees charged by our schools? While specific information is sometimes hard to find, we do know that Year 9 students at Trinity College in Evanston, South Australia, are charged around $4000 per year. Year 9 fees at the Islamic College of Brisbane are just under $2000, at Chisholm Catholic College just over $3000 and at Orange Christian School $4700. (These amounts reduce if more than one student from a family is enrolled.) Websites and school annual reports tell us that fees at government schools are around 10 per cent of these levels – $315 at Varsity College, for example, and around $250 at Keith Area School.

What ACARA needs to show is that fee-charging schools such as Trinity College and Chisholm Catholic College and minimal-cost schools such as Varsity College and Keith Area School are all enrolling a completely representative cross-section of the students who apparently live in identical census collection districts. It isn’t very likely.

Selective enrolments

THE CHARGING of fees is one of a number of barriers placed in the path of students who might wish to enrol in a particular school. Many schools apply additional discriminators or incentives in the form of scholarships, entry tests and other restrictions including religious criteria. Does ACARA believe, for example, that the almost four dozen selective government schools in New South Wales can be compared with others deemed to be “statistically similar”?

How does selectivity work on the ground among our ICSEA 1000 schools? We do know that government schools are usually required to enrol all local students. Some variations exist among specialist schools, and in some states government schools can enrol students from out of area. As was found in a detailed study of Melbourne, this does create differences in the enrolment profile of high-demand and low-demand government schools. The existence of school zones is supposed to minimise such differences but few enrolment rules manage to get between a school principal and a desirable enrollee.

It is also true that, notwithstanding the rules, individual government schools appear to set certain conditions on enrolment. While it must enrol local students, Sydney’s Bossley Park High School asks for original copies of previous school reports and evidence of residential status, and applicants must have answered all questions on the enrolment form. Crusoe 7–10 Secondary College in Victoria wants to know, in addition to the usual information, whether there are “any issues arising from known relationships” (amongst peers). Even low-level mechanisms such as these can have the effect of screening out some students.

But the school websites show that the non-government schools among our ICSEA 1000 include many that are very selective indeed. John Calvin School in Albany in WA exists specifically “to cater for the needs of the Free Reformed Church in Albany.” Trinity College discourages enrolments between Years 8 and 10, while the School of Total Education in Queensland won’t enrol any new students at secondary level. If it can’t get its students as early as possible it doesn’t want them (which certainly gives a new meaning to “total education”). Coolum Beach Christian College does not restrict enrolment to Christian families, but the enrolment application form runs to ten pages. The school genuinely wants to know how it can help new students adjust, but it also wants to know much more, even down to the name of the minister of the church that the child presumably attends. Then, if all is in order, the parent may be called in for an interview.

It is hardly surprising that three out of these four schools rank nicely above most other ICSEA 1000 schools. At the other end of the selectivity ladder, Bossley Park and Marsden High Schools are just two that have several competing selective schools within very easy reach. But as far as ACARA and My School are concerned, all these schools are considered to be enrolling “statistically similar” students.

Some non-government schools work hard to be inclusive rather than exclusive, as indicated by the enrolment at Orange Christian School, a school halfway down the ICSEA 1000 rank. At 6.5 per cent (the figure on My School is already out of date), its enrolment of Indigenous students is comparable with one of the nearby government schools and well ahead of the other non-government schools in the area – and ahead of almost all the non-government ICSEA 1000 schools. In our survey the principal reported that the fact that some schools may have entrance tests had an impact on his school. He also noted that in regional centres census groups are a poor indicator of socio-economics.

Gender balance

ACARA appears not to have considered that the gender balance in a school might influence NAPLAN results. This is despite the fact that the achievement profile for boys and girls can show significant differences at various age levels, reflecting different interests and rates of maturity. Even a brief scan of My School shows that co-located boys’ and girls’ schools often have different results. In the Hornsby district of Sydney, for example, three sets of co-located single-sex schools display significant NAPLAN differences.

The two Catholic schools in Liverpool, New South Wales, that fall into our ICSEA 1000 group illustrate the problem. They are close to each other and draw from a similar area so it’s not surprising that the index has paired them. But the average of the Year 9 NAPLAN scores for the boys’ school is 573 while the average for the girls’ school is 598. The real enrolment of the two schools is not the same – so why are they considered to be “statistically similar”?

Apart from single-sex schools, does gender balance affect our other ICSEA 1000 schools? Marsden High School in Ryde, Sydney, has the greatest gender imbalance and might provide a clue. It has 283 girls and 470 boys, which is not an unusual ratio in parts of Sydney where a number of boys’ schools have been progressively closed. Its average Year 9 NAPLAN score is 546, which could well reflect the preponderance of boys.

About thirteen of the schools in our ICSEA 1000 club have a noticeably uneven mix of boys and girls, including Bossley Park High, Comet Bay College in Western Australia and Murwillumbah High in New South Wales. When all thirty-six are ranked according to their average NAPLAN reading scores (itself a limited measure), nine of the thirteen fall in the bottom half. When it comes to numeracy, the boy-dominant schools are spread more evenly between the top and bottom halves. While this is just a scan of a sample of schools, it certainly suggests that ACARA erred in not including the gender balance data it has for each and every school.
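To show what this scan involves, here is a minimal sketch in Python. The school names and figures are invented for illustration only; the actual exercise used the published reading averages and enrolment numbers for the thirty-six ICSEA 1000 schools.

```python
# Illustrative sketch only: rank a handful of hypothetical schools by average
# reading score and count how many boy-dominant schools fall in the bottom half.

schools = [
    # (name, average Year 9 reading score, boy-dominant enrolment?)
    ("School A", 571, False),
    ("School B", 546, True),
    ("School C", 583, False),
    ("School D", 552, True),
    ("School E", 561, False),
    ("School F", 539, True),
]

ranked = sorted(schools, key=lambda s: s[1], reverse=True)  # best score first
bottom_half = ranked[len(ranked) // 2:]

boy_dominant_in_bottom = sum(1 for _, _, boys in bottom_half if boys)
boy_dominant_total = sum(1 for _, _, boys in schools if boys)
print(f"Boy-dominant schools in the bottom half: "
      f"{boy_dominant_in_bottom} of {boy_dominant_total}")
```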

Ethnic balance and origin

THE CASE of Marsden High School highlights the issue of the ethnic balance for our ICSEA 1000. In bolting together the statistics to create its index ACARA simply considered the percentage of people who don’t speak English and found that this did not align significantly with NAPLAN scores. But the character and impact of an ethnically mixed school enrolment is about much more than proficiency in English.

Any teacher or school principal knows that the origin and ethnic balance of students can make a significant contribution to the achievement profile of a school. Newly arrived students might range from skilled and professional migrants to refugees adjusting after years of personal trauma. Teachers know that the profile of a school in which many families bear the scars of civil war is going to be very different from that of a school in which families represent several generations of successful aspirants. There is nothing especially good or bad about being Islander, Lebanese, Chinese, Sri Lankan or Filipino – but the profile of schools will certainly vary according to the mix.

So what mixtures are represented among our ICSEA 1000 schools? A brief scan reveals only four schools with significant non-Anglo enrolments. One is a government school, Marsden High, mentioned above. The others are non-government schools: East Preston Islamic College in Melbourne, the Australian International Academy in Strathfield, Sydney, and the Islamic College of Brisbane.

Marsden High School is an interesting case. As a comprehensive school on the southern fringe of Sydney’s middle-class northern suburbs, and with selective and private schools all within very easy reach, it’s hard to see how it resembles any other of our ICSEA 1000 schools. An Intensive English Centre is attached to the school, which means that new arrivals from overseas probably stay at Marsden after they complete their English course. Over 70 per cent of Marsden’s students have a language background other than English and this includes a significant number of international fee-paying students.

The international students, temporarily at school in Australia, make up 12 per cent of the enrolment of Marsden High. ACARA’s index would struggle to describe these ninety international students; they won’t necessarily resemble the profile of any census collection district or even that of their host family. Their own families and neighbourhoods are 8000 or more kilometres away in Korea or China. There are many schools in Australia with significant numbers of international students.

Less is known about the ethnic mix of the other schools but they appear to be quite different, even from each other. East Preston Islamic College is a smaller P–12 school which claims to be “unique in that it is predominantly an ESL school… with a population situated in a low to very low socio-economic community.” Its index number (1000) does not support the claimed “very low” SES, but in the light of the deficiencies of the index the school’s statement may be correct.

Even a casual scan of the school’s website reveals the unique features of this school and the difficulty in finding any other that is statistically similar. The school is ranked against another ICSEA 1000 Islamic school in Sydney, a school which promotes the International Baccalaureate and has a branch in Abu Dhabi. Another apparently similar school, the Islamic College of Brisbane, has many more girls than boys and encourages the enrolment of overseas students. All seem to be interesting schools, but are they similar?

Fluctuating results

IN USING NAPLAN results to encourage comparisons of schools, is ACARA really assuming that student test results for just one year are a valid measure? It may talk about caveats and qualifications but it doesn’t matter: the subsequent league tables based on My School data are sold, and accepted, as an authoritative ranking of schools. But to what extent do test results fluctuate from year to year, and are these fluctuations more evident in some schools than in others? Again our ICSEA 1000 schools might provide some answers.

To find out, we ranked the schools by numeracy scores in 2008 and again in 2009. To use the media parlance, the school that showed the most dramatic improvement between the two years was the Australian International Academy in Strathfield, Sydney, up from twenty-fifth to sixteenth place. Chisholm Catholic College rose eleven places, Mooroolbark College rose ten places and Boonah State High School rose eight places. One government and one non-government school fell about a dozen places.
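For anyone who wants to repeat this kind of check, a minimal sketch of the rank comparison follows. The school names and scores are invented; the real exercise used the published 2008 and 2009 average numeracy scores for all thirty-six schools.

```python
# Illustrative sketch only: compare a school's ranking in two successive years
# using invented numeracy averages for four hypothetical schools.

scores_2008 = {"School A": 560, "School B": 548, "School C": 571, "School D": 555}
scores_2009 = {"School A": 549, "School B": 566, "School C": 570, "School D": 558}

def rank(scores):
    # Rank schools from highest to lowest score: 1 = best.
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {school: position + 1 for position, school in enumerate(ordered)}

r08, r09 = rank(scores_2008), rank(scores_2009)

for school in scores_2008:
    movement = r08[school] - r09[school]  # positive = rose up the table
    print(f"{school}: {r08[school]} -> {r09[school]} ({movement:+d})")
```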

So did some schools do something stunning between the two years? Did others relax their standards? In the absence of a thorough appraisal of the schools we simply don’t know. What we do know is that any school’s ranking against statistically similar schools can change dramatically from year to year. Just choose any school, bring up the list of “statistically similar schools” and watch the colours (comparing the schools) change as you switch back and forth between 2008 and 2009.

School principals and teachers know only too well that test scores often fluctuate from year to year as different year cohorts move through the school. Such changes can be attributed to any one of a number of quite ephemeral factors: collective trauma, turnover of enrolments, loss of key teachers, significant peers. Teachers and principals are well aware that you get “good” or “bad” (or any shade in-between) cohorts through schools. Anecdotal comments about this might reflect some excuse-making, but we just don’t know. Neither does ACARA.

At the very least ACARA needs to factor in the churn in enrolments at some schools. According to the principal, over half of the current Year 10 students at Marsden High were enrolled somewhere else when they were in Year 7. A third of the students who sat for the Year 9 NAPLAN tests in 2009 had been at the school for only twelve months. Why is their new school being credited or blamed for their results?

Is close enough good enough?

THE MY SCHOOL website shows all the signs of having been thrown together in a hurry to meet the timeline created by a political agenda. It was always going to be a public relations coup for the Rudd government. For a couple of decades education has been a hot button political issue and there is considerable advantage for any side of politics that can neutralise its impact, let alone turn it into a winner. My School has not only done that but has enabled the government to wedge its critics, both inside and outside parliament.

There is no doubt that My School, and the comparisons between schools that it encourages, won’t go away. The site will be considerably modified and improved as more data about schools becomes available – almost to the point where its current content, and the school comparisons it excites, will look quite deficient. When the dust really settles ACARA will consider the criticisms and it might even be given time to consult people who know their way around schools, including parents as well as teachers and principals.

At the moment there are far too many invalid comparisons created by this half-baked website, and too many schools and their communities are being unfairly labelled and harmed as a consequence. There is a real ethical issue in the question of whether the site should have been launched when it was quite knowingly built around flawed and insufficient data and assumptions. Unfortunately it is highly unlikely that anyone, least of all the deputy prime minister, will be called to account in any meaningful way. The alternative government is not interested in raising issues of fairness and validity and opponents in the education sector have, to date, been unable to gain traction on such issues.

While My School will improve it almost certainly won’t address all the concerns about its validity. With some exceptions, such as gender balance and possibly academic selectivity, most of the school-level factors can’t be reduced to numbers and hence can’t be included in any improved index. While some may argue that this alone should be enough to consign My School to the scrap heap, that is not going to happen. But at the very least, considerable pressure should be placed on the federal government and ACARA to deal with the complexities that exist at the school level and stop pretending that they don’t make a sufficient difference.

What are the alternatives?

IT IS POSSIBLE that a different approach might be used to try to deal with, or get around, these complexities. Future changes will see My School come closer to presenting data on how much “value” a school adds to its students, in a revised attempt to say something meaningful about schools. The problem is that value-added data isn’t immune from any home and neighbourhood effect on student achievement; nor does it account for changes in the socio-educational composition of a school’s enrolment or cope with schools that have a high enrolment turnover.

In any search for alternatives we really have to revisit why the site was launched in the first place. Is it about freedom of information? Is it needed to allow families – or at least those who can afford it – to choose among schools? Is it the best way to improve the accountability and performance of schools? What we do know is that in the development and debate surrounding My School very little evidence has been produced to show that publishing raw student test scores, and attributing students’ successes or failures solely to their schools, can achieve much of this.

It always will be important for classroom teaching to be the very best it can be. And it is important for schools to be as accountable as possible to parents and the wider community. But there is ample evidence to show that far better school performance and improvements are gained through independent appraisal of school progress to identify and improve areas of weakness. Far better system performance is achieved not by ranking schools in some failed quasi-market experiment, but by cooperative development and commitment to quality teaching and learning.

This requires professional appraisal conducted independently of schools and government – as happens in New Zealand, for example. The data can guide a process that should be conducted in every school, not just those which appear, by some flawed comparisons, to be “underperforming.”

Of course such an approach is complex and expensive – and it doesn’t resonate as well as the current rhetoric about “transparency” and “openness.” But the Rudd government came to power extolling the virtue of evidence-based policies. It must take heed of the evidence and properly resource a thorough and continuing appraisal of all our schools.

As for My School, at the very least it should be stripped of any references to fair and meaningful comparisons – and the wording “under construction” given prominence on every screen. •

The Index of Community Socio-Educational Advantage

According to the technical paper available on the ACARA website, this index was developed for My School as a means of identifying socio-educationally similar schools across Australia. ACARA identified Australian Bureau of Statistics data that has a bearing on student performance and combined this data into its Index of Community Socio-Educational Advantage. ACARA acknowledges that it isn’t possible to find groups of schools with students of similar abilities, so it relies on a proxy community measure.

There can be little doubt that variables such as family income and occupation, chosen for the index, play a major role in explaining educational outcomes. This data is derived by obtaining the addresses of students and linking them to the average ABS data on census collection districts, or CCDs, which are made up of around 220 households. Variables associated with individual families are not known, so they cannot be considered.

Some variables, including the percentage of people who do not speak English well, were found not to be significant and were excluded. Measures of remoteness and indigeneity were found to affect school achievement and were included. The index was then scaled to a mean of 1000, in line with socio-economic status measures used by the ABS.
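For readers who want to see the shape of this construction, here is a simplified sketch in Python. The CCD scores, school rolls and the multiplier used in the rescaling are invented for illustration; ACARA’s actual procedure uses more variables, different weightings and the expert-panel adjustment described below.

```python
# Illustrative sketch only: a simplified, hypothetical reconstruction of how a
# CCD-based proxy index might be assembled. All figures are invented.

from statistics import mean

# Hypothetical socio-educational scores for three census collection districts
# (in reality derived from ABS variables such as income and occupation).
ccd_score = {"CCD_A": 0.2, "CCD_B": -0.5, "CCD_C": 1.1}

# Each school's enrolled students, identified only by the CCD of their home
# address; nothing about the individual families is used.
school_students = {
    "School X": ["CCD_A", "CCD_A", "CCD_B"],
    "School Y": ["CCD_B", "CCD_C", "CCD_C"],
}

# Step 1: a school's raw index is the average of its students' CCD scores.
raw = {s: mean(ccd_score[c] for c in ccds) for s, ccds in school_students.items()}

# Step 2: rescale so the mean across schools is 1000. The multiplier of 100 is
# arbitrary here; the real index also standardises the spread of scores.
overall = mean(raw.values())
index = {s: round(1000 + (v - overall) * 100) for s, v in raw.items()}

print(index)  # {'School X': 970, 'School Y': 1030}
```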

The technical paper states that the index is “a measure of the socio-educational character of the students within a school.” This is not correct: it is an aggregated measure of the socio-educational character of the census collection district of each enrolled student, not the character of the students or their families.

ACARA does concede that the index “may provide an inappropriate measure of the socio-educational level of the school… where there is a mismatch between students’ actual levels and that of the CCD values associated with their addresses.” To address this limitation the index values have been adjusted in some way by an “expert panel.”

ACARA goes on to say, “There will continue to be a need for a formal review process to make ongoing adjustments where there is evidence that ICSEA does not properly reflect the actual circumstances of students in a given school.”

The expert panel will be very busy. •