Inside Story

School equity: from bad to worse

Gonski got it right, and in the years since he reported, his findings have become more relevant than ever, write Chris Bonnor and Bernie Shepherd

Chris Bonnor & Bernie Shepherd | 22 October 2014



Almost three years have passed since the Review of Funding for Schooling, otherwise known as the Gonski report, was handed to the Gillard government – years that have seen its recommendations fall victim to timidity, inaction, distortion, self-interest and partisan politics.

But something else happened in the year the government commissioned Gonski: My School was launched with great fanfare, considerable hyperbole and the promise of useful information about schools. Although My School version 1 had significant flaws, it was quickly and substantially improved. The data underpinning the site, now in its fifth iteration, tells a compelling story, not so much about individual schools as, collectively, about our framework of schools – what it delivers and, more importantly, what it doesn’t.

In effect, the most substantial review of schooling ever conducted in Australia was accompanied by a goldmine of information which, over time, would tell us if the reviewers had got it right. My School has become Our Schools: it tells of the consequences for all schools if we fail to act on what Gonski found and recommended.

Have the problems revealed by Gonski diminished over those three years? With the most notable exception of New South Wales, most state and territory governments, and certainly the federal government, don’t seem to think that what were agreed to be serious issues in 2011 deserve any urgent attention. Education didn’t even make the agenda of two successive COAG meetings in 2014. But the problems didn’t go away. Many just got worse.

So let’s look at seven key findings of the twenty-six in the Gonski review, grouped under three headings: schooling performance and outcomes, current funding arrangements, and equity and disadvantage.

Particular reference is made to whether the original findings remain relevant and the extent to which the circumstances that led to the findings may have changed in recent years. The information is derived from My School data and little reference is made to other sources, including specific research. This analysis is a brief and preliminary report, issued without graphics. The work is ongoing and will eventually be published.

Schooling performance and outcomes

Finding 1: Australian schooling needs to lift the performance of students at all levels of achievement, particularly the lowest performers. Australia must also improve its international standing by arresting the decline that has been witnessed over the past decade. For Australian students to take their rightful place in a globalised world, socially, culturally and economically, they will need to have levels of education that equip them for this opportunity and challenge.

Data from My School can be analysed to track changes in student performance, in particular to find out whether performance, especially that of the lowest performers, has been lifted.

My School reports student performance in the four NAPLAN aspects: reading, writing, language conventions and numeracy. Test scores in these domains need to be interpreted cautiously; the nature of the NAPLAN writing test, for example, has changed over this time. Even on a national level, changes in test scores from year to year can have many explanations. In the search for possible trends we have considered test scores for two groups representing two distinct stages of schooling: Year 5 and Year 9.

Over five years of NAPLAN testing, 2009 to 2013:

  • Year 5 reading scores rose by about 1.7 per cent (from 494 to 502.2)
  • Year 5 writing scores are not all directly comparable owing to changes in the skills tested, but the published numbers suggest that performance is more likely to have declined than improved
  • Year 5 numeracy scores were essentially static (from 486.8 to 485.9)
  • Year 9 reading scores were static (at 580.4)
  • Year 9 writing scores are not directly comparable either but, again, the published numbers suggest that performance is more likely to have declined than improved
  • Year 9 numeracy scores fell by 0.8 per cent (from 588.5 to 583.7)

Most of these changes are not significant in statistical terms, but the sense we have on examining them is that performance overall has been more inclined to stagnate or fall than to improve, with trends in Year 9 of particular concern. What is certain is that there is no substantial evidence of any performance lift: the Gonski review panel had every right to be concerned.
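
For readers who want to check the arithmetic, the percentage movements quoted above are simple relative changes between the 2009 and 2013 mean scale scores. A minimal sketch in Python (the figures are those cited above; the function name is ours):

    # Relative change between two mean NAPLAN scale scores, expressed
    # as a percentage of the starting score.
    def percent_change(start: float, end: float) -> float:
        return (end - start) / start * 100

    # National means cited in the list above (2009 and 2013).
    print(f"Year 5 reading:  {percent_change(494.0, 502.2):+.1f}%")   # about +1.7%
    print(f"Year 9 numeracy: {percent_change(588.5, 583.7):+.1f}%")   # about -0.8%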

What about the lowest-performing students? We grouped schools according to their Index of Community Socio-Educational Advantage, or ICSEA, and tracked the performance of schools in high, medium and low ICSEA ranges. In some cases, for example in Year 5 reading, test scores rose for schools in all three ICSEA ranges. But the more noticeable trend was for achievement scores to diverge between high ICSEA schools, where scores tended to increase, and lower ICSEA schools, where scores most commonly fell. This divergence was most pronounced in Year 9, and was smallest in reading and larger in writing and numeracy.
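
For readers who work with the published data, a minimal sketch of this banding approach follows. We assume a per-school extract with hypothetical columns (year, icsea, naplan_mean) rather than any particular file, and the band cut-offs shown are illustrative only, not those used in our analysis.

    import pandas as pd

    # Hypothetical extract: one row per school per NAPLAN year, with the
    # school's ICSEA value and its aggregated mean NAPLAN score.
    schools = pd.read_csv("my_school_extract.csv")  # columns: year, icsea, naplan_mean

    # Assign each school to a low / medium / high ICSEA band (illustrative cut-offs).
    schools["band"] = pd.cut(schools["icsea"],
                             bins=[0, 950, 1050, 2000],
                             labels=["low", "medium", "high"])

    # Mean score per band per year: divergence appears as the "high" column
    # drifting up while the "low" column drifts down between 2009 and 2013.
    trend = schools.pivot_table(index="year", columns="band",
                                values="naplan_mean", aggfunc="mean", observed=True)
    print(trend)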

We can only conclude that student performance didn’t lift over this time and, if anything, the achievement levels of our lowest performers declined.

Finding 2: The challenge for the review is to design a funding model that adequately reflects the different needs of students to enable resources to be directed to where they are needed most. All Australian students should be allowed to achieve their very best regardless of their background or circumstances.

Data from My School can be analysed to track how resources to schools have been allocated in recent years and whether resources have been directed to where they are needed most.

It is important to identify where the greatest need exists. While many factors affect student performance, My School data for each year since 2010 shows a strong and continuing association between the socio-educational status of school enrolments and the level of student achievement as measured by aggregated NAPLAN scores. This strong association is found across all sectors and in all localities: higher or lower school ICSEA values are routinely accompanied by higher or lower NAPLAN scores.

The association between socio-educational status and student achievement is often illustrated – for example in reports from the Programme for International Student Assessment, or PISA – by “social gradients.” The slope of the gradient shows the strength of this association. In our comments on Finding 17 (below), we show that the social gradients constructed from data on school ICSEA and student NAPLAN achievement have become steeper over four years. To use Gonski’s wording, it seems that “background or circumstances” have had an increasing impact on student achievement.

This means that the need for resources to be “directed to where they are needed most” has increased in recent years. Has this happened to any significant extent? How have resources been directed?

We can investigate the allocation of resources for three groups of schools with demonstrably different needs.

The NAPLAN scores of students in schools around the 900 ICSEA level (at the more disadvantaged end of the scale) are quite low, averaging 460 on an aggregated measure. On average, $13,870 was spent on each student in these schools in 2013. Almost all this money, regardless of school sector, came from governments.

The aggregated NAPLAN scores of students in schools around ICSEA 1000 were higher, at 495. On average, $11,265 was spent on each student in these schools in 2013, less than that available to the more disadvantaged students. Most of this funding – between 84 per cent and 100 per cent – is also provided by governments.

Considering just these two examples, we can say that resources are being directed to where they are needed most. The resources may or may not be sufficient to substantially improve student outcomes, but for the moment that’s another matter.

The situation changes at around ICSEA 1162. At this level, total expenditure per student ($14,263) is higher than for the disadvantaged students in the ICSEA 900 schools. Between 65 per cent (Independent schools) and 82 per cent (Catholic schools) of this funding still comes from government sources. Above ICSEA 1162 the funding increasingly comes from school fees in addition to that provided by governments – and the total expenditure per student continues to increase as the ICSEA level rises.

Gonski accepted the arguments for some public funding for students regardless of their levels of advantage. But if all sources of funding are considered – as was Gonski’s brief – we have made no progress towards directing resources towards the greatest need.

Current funding arrangements

Finding 6: Australia lacks a logical, consistent and publicly transparent approach to funding schooling.

There are a number of ways to test the current school funding regime on the basis of logic, consistency and transparency. My School data makes a useful contribution by enabling a closer look at where the funding ends up. There are many reasons why schools are funded at different levels and why some schools are funded at higher levels (per student) than more needy schools. Costs per student are always higher in smaller schools, for example, and funding will reflect this. School location will also explain differences in funding levels.

But other differences are harder to explain, as we discovered in our preliminary analysis of government funding of larger schools with secondary enrolments, located in a random sample of fifty Australian federal electorates. Large numbers of schools seem to be funded by governments out of proportion to the educational challenges the schools face.

In the electorates surveyed, a number of government schools receive amounts per student in excess of the funding received by lower ICSEA government schools in the same electorate, which suggests anomalies in the way government funding is directed to some government schools.

In those same electorates, fifty non-government schools are more generously funded by governments than are less advantaged non-government schools in the same electorate. These differences between non-government schools are likely to reflect the assigned status of those schools under the opaque, decade-old federal government funding arrangements.

But the differences in government funding of government and non-government schools have proven to be the greater surprise, given that one of these sectors is a fully funded public system and the other is, at least in a legal and technical sense, privately owned and operated. There have long been examples of non-government schools receiving higher government funding than similar government schools. But there are sixty-two non-government schools in the fifty sample electorates receiving government funding in excess of what needy government schools in the same electorate receive.

Across the fifty electorates there are 131 less-advantaged government schools that somehow receive less government funding than at least one more-advantaged nearby non-government school. If scaled up to all Australian electorates, around 25 per cent of government schools with secondary enrolments face greater educational challenges, but receive less government funding, than a non-government school in the same electorate. This is more than an anomaly – it is an absurdity.

It should be stressed that this is a preliminary analysis, but it seems that Gonski’s conclusion about the lack of logic and consistency in how we fund schools remains as accurate today as it was when the review was conducted. It is hardly surprising that the lack of transparency in the existing, complex system is a big part of the problem.

Finding 7: There is an imbalance in the provision of funding to government and non-government schools by the Australian and state and territory governments. In particular, the Australian Government could play a greater role in supporting state and territory governments to meet the needs of disadvantaged students in both government and non-government schools.

Data from My School can be analysed to see whether government funding in recent years has continued to lack balance, and whether the Australian government has assumed a greater role in funding disadvantaged students.

The pattern of expenditure on schools by various governments between 2009 and 2012 (the most recent year for which financial data is available) might reflect a number of priorities, but it appears that creating greater balance isn’t one of them.

My School provides information about both federal and state/territory government funding. Much has been written in various places about federal government funding. This analysis focuses on the states and territories and analyses changes in state/territory government recurrent and capital funding. Unless otherwise stated, the funding amounts cited are dollars per student.

State and territory recurrent funding goes mainly to government schools, but there are considerable differences between jurisdictions. On average across Australia, state and territory funding to government schools increased between 2009 and 2012, but with large variations. The largest increases (around 5 per cent) went to students in Queensland, the Australian Capital Territory, Tasmania and South Australia. Funding increases in New South Wales, Victoria and Western Australia were far more modest at around 2 per cent, and recurrent funding per Northern Territory student actually went down.

The differences among the states and territories are even more evident when it comes to funding non-government schools. Funding for Catholic schools tended to rise by around 5–6 per cent across Australia, but rose by 11.4 per cent per student in Victoria and by 2.4 per cent in New South Wales. Independent schools received more consistently high increases, with the greatest increases again in Victoria. In all states except Queensland funding per student in non-government schools increased at a higher rate than for government schools.

State and territory governments also direct most capital expenditure to government schools, but again there is little evidence of any balance. In round figures, capital expenditure was about $700 per Australian student in both 2009 and 2010, and closer to $900 in 2011 and 2012 – but there were great variations among the states.

Annual capital expenditure per government school student in New South Wales and Victoria averaged around $500, but generally declined over the four years. In contrast, capital expenditure per student in Queensland government schools almost trebled, to around $1700 per student in 2012. Capital expenditure increased in South Australia. It also increased in Tasmania to 2011, yet all but disappeared in 2012. Western Australia showed the reverse pattern: a three-year decline followed by a substantial boost in 2012.

States and territories did not provide significant capital funding to non-government schools, with the exception of the Northern Territory and Queensland. On average, Queensland provided almost as much capital funding to non-government school students as South Australia provided to government school students.

The pattern of funding by state governments shows that Gonski was certainly justified in expressing concern. It seems that nothing has changed.

On the second aspect of the finding: do figures from My School indicate that the Australian federal government is assuming a greater role in funding disadvantaged students? To find the answer, we conducted a brief analysis of Australian government recurrent funding in 2010 and 2012 for all schools with complete data in the ICSEA range 900 to 999, and all schools with complete data in the ICSEA range 1100 to 1199.

In total, Australian government expenditure on the first group of schools amounted to $2.39 billion in 2010 and $2.81 billion in 2012 – a 17.5 per cent increase. Australian government expenditure on the second group of schools amounted to $1.77 billion in 2010 and $2.00 billion in 2012 – a 12.9 per cent increase. In terms of dollars per student the increase for the lower ICSEA group of schools was around 16.3 per cent and for the higher ICSEA group of schools 7.7 per cent.

These figures indicate that the Australian government did, between 2010 and 2012, assume a greater role in funding more disadvantaged students. This may reflect National Partnership funding in this period. Even so, funding per student remains higher in the higher ICSEA range ($3157, against $2498 in the lower range), reflecting the fact that Australian government funding continues to flow disproportionately to non-government schools.

Finding 10: Public funding arrangements need to reflect the nature of the educational challenges faced by a system or school given its characteristics and student population, regardless of whether it is in the government or non-government sector.

Data from My School reveals the extent to which funding reflects the nature of the educational challenges faced by systems and schools. The Gonski review stressed the need to fund schools on an equal per student basis if they serve similar populations with similar levels of need. In this way schools would be resourced appropriately regardless of sector.

Our response to Finding 6 provides many examples of how this does not sufficiently happen for particular schools. In this analysis, we refer to differences by school location and by sector.

It is well known that schools in non-metropolitan areas incur higher costs, if only because of their location. Regardless of a school’s enrolment, remoteness creates a considerable educational challenge. To meet this challenge, combined government funding in remote and very remote schools, at $21,400 per student, is more than double that available to students in metropolitan schools. Regardless of whether such an investment is sufficient, it appears that the distribution of funding does reflect the different educational challenges and costs created by school location.

By way of contrast, a sector-based analysis of recurrent funding reveals an increasing disconnect between the educational challenges faced by schools and the funding they receive from governments. We have long known, and My School readily shows, that government schools enrol students with greater needs. While averages don’t tell the whole story, the median government school student is in the upper third of Q2 (the second-lowest ICSEA quarter). The median Catholic school student is in the lower third of Q3 and the median Independent school student is around the top of Q3 (almost into Q4). In metropolitan schools, attended by three-quarters of students, average ICSEA values are 1019 for government schools, 1061 for Catholic schools and 1090 for Independent schools.

Such figures illustrate a hierarchy of needs that should, in no small measure, determine the ongoing distribution of public funding. But in the years 2009–12 increases in funding certainly did not reflect relative needs. The combined per student recurrent funding from all governments increased by just 10.9 per cent for students attending government schools. The increase for students attending Catholic schools was 19.8 per cent, and it was 20 per cent for students in Independent schools. It is possible that within each sector the funding is directed towards the greatest need, but the overall pattern is the inverse of what would reasonably be expected – with almost no difference between Independent and Catholic schools despite the demonstrably greater level of student need in the latter.

In summary, government funding continues to favour remote and very remote schools. In general it also favours lower ICSEA schools as illustrated in our response to Finding 2. But changes in government recurrent funding over the last few years heavily favour non-government over government schools, in contrast to the relative challenges faced by the two sectors.

Equity and disadvantage

Finding 17: New funding arrangements for schooling should aim to ensure that:

  • differences in educational outcomes are not the result of differences in wealth, income, power or possessions
  • all students have access to a high standard of education regardless of their background or circumstances.

This finding focuses on the Gonski and OECD (Organisation for Economic Co-operation and Development) definitions of equity. In its landmark report, the Gonski panel explored the influence of student background on educational outcomes, best illustrated by what are known as “social gradient” measures. These can be derived by measuring the slope of a graph of educational outcomes against some social or socio-economic indicator. The report described how Australia has a steeper social gradient compared with many higher-achieving countries. A steeper slope indicates a greater impact of social factors – as distinct from school factors – on student achievement.

The Gonski panel concluded that greater equity and improvements in student outcomes could be achieved through efforts to reduce the influence of student background on achievement – in effect to reduce the social gradient.

My School data provides the opportunity to examine a kind of social gradient if we plot schools’ average NAPLAN scores against their ICSEA values. Since ICSEA is a socio-educational advantage measure, we might call it a socio-educational gradient.

Typical values for the slope of these NAPLAN/ICSEA plots are around 0.35, or 35 per cent. By graphing socio-educational gradients for various groups of schools we are able to compare the equity of schooling in different places and for different levels of schooling. My School data shows that gradients are very much steeper (40–44 per cent) among secondary schools than among either combined or primary schools. In addition, gradients are higher among metropolitan schools generally – and the change over time is greater – than among non-metropolitan schools.
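
The gradient itself is just the slope of a least-squares line fitted to school NAPLAN means against school ICSEA values. A minimal sketch of that calculation, reusing the hypothetical per-school extract described earlier:

    import numpy as np
    import pandas as pd

    schools = pd.read_csv("my_school_extract.csv")  # hypothetical columns: year, icsea, naplan_mean

    def socio_educational_gradient(df: pd.DataFrame) -> float:
        # Slope of the least-squares fit of NAPLAN mean on ICSEA: the number
        # of NAPLAN points gained per ICSEA point, so 0.35 reads as "35 per cent".
        slope, _intercept = np.polyfit(df["icsea"], df["naplan_mean"], deg=1)
        return float(slope)

    # A steepening gradient year on year signals worsening equity.
    for year, group in schools.groupby("year"):
        print(year, f"gradient = {socio_educational_gradient(group):.2f}")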

Most significantly, My School shows that Australia’s socio-educational gradient has progressively steepened from 32 per cent in 2010 to 37 per cent in 2013. Socio-educational advantage has had an increasing impact on student achievement in just three years. More than before, differences in educational outcomes are the result of differences in wealth, income, power or possessions.

Finding 21: Increased concentration of disadvantaged students in certain schools is having a significant impact on educational outcomes, particularly, but not only, in the government sector. Concentrations of students from low socioeconomic backgrounds and Indigenous students have the most significant impact on educational outcomes.

This finding confirms what has been increasingly researched and reported in recent years. Even the most cursory examination of data from My School confirms that a concentration of disadvantaged students exists in many schools and is strongly associated with low educational outcomes.

My School not only includes an ICSEA value for each school but also shows the percentage of each school’s enrolment in each of the four ICSEA quarters. The proportion of each school’s enrolment in the lowest quarter, Q1, indicates the concentration of disadvantaged students. Analysis of student outcomes, represented on My School by NAPLAN results, shows a strong association between a school’s outcomes and the proportion of its enrolments in that lowest quarter.
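
One way to quantify that association is a simple correlation between each school’s Q1 share and its mean NAPLAN score. A sketch, again using hypothetical column names (pct_q1 is the percentage of enrolments in the bottom quarter):

    import pandas as pd

    schools = pd.read_csv("my_school_extract.csv")  # hypothetical columns: pct_q1, naplan_mean

    # Pearson correlation between the share of a school's enrolments in the
    # bottom ICSEA quarter and its mean NAPLAN score; a strongly negative
    # value is consistent with the association described above.
    r = schools["pct_q1"].corr(schools["naplan_mean"])
    print(f"correlation(Q1 share, NAPLAN mean) = {r:+.2f}")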

The more important question is whether concentration of disadvantage is increasing – in effect, whether what Gonski found to be bad is getting worse.

Our comments on the first Gonski finding indicate that NAPLAN achievement scores have tended to diverge over time between high ICSEA schools, where scores tended to increase, and lower ICSEA schools, where scores more commonly fell. Has anything else happened in these schools over the last few years which might explain this divergence?

Declining NAPLAN scores in lower ICSEA schools could have a number of explanations. One is an enrolment shift of more able students out of these schools. School enrolment data certainly shows that enrolments have fallen in lower ICSEA schools, in the order of 1–2 per cent in schools below ICSEA 900. This change in school size may also reflect demographic changes, particularly in provincial and remote areas. However, even low ICSEA metropolitan schools more commonly experienced declining or static enrolments.

Other data suggests that the social profile of enrolments is changing in low ICSEA schools. Gonski clearly referred to Indigenous students as a disadvantaged group. Between 2011 and 2013 there was a small but noticeable increase in the percentage of Indigenous students in lower ICSEA schools, with almost no change in higher ICSEA schools – illustrating an increasing concentration of a key disadvantaged group in lower ICSEA schools. Again, the reasons are a matter of hypothesis: the change might be due to student movement or to changing local demographics.

Another possible explanation of concentrating disadvantage and the associated decline in student performance might be found in changes in teacher qualifications and experience in lower ICSEA schools over three years. My School data can’t provide any indication of whether these changes have occurred.

What is surprising and disturbing is that this concentration of disadvantage, together with its impact, is measurable over such a short period.


From our appraisal of these seven key Gonski findings we can conclude that Gonski got it right and that the years since the review have seen its findings become more relevant than ever.

The review found that Australian schooling needs to lift the performance of students, particularly the lowest performers. Information available in My School shows that we have not achieved this and that the gap between our higher and lower performers, especially in the secondary years, shows every sign of widening.

The review found that a funding model should enable resources to be directed to where they are needed most. My School shows that government funding, whether sufficient or not, is generally directed in this way – but the combination of public and private funding increasingly goes to students who already achieve at quite high levels.

The review found that Australia lacks a logical, consistent and publicly transparent approach to funding schooling. My School reveals hundreds of examples of schools which are funded in ways that seem to defy logic and consistency, especially in light of the educational challenges they face.

The review referred to imbalance in the provision of funding to government and non-government schools. Analysis of state and territory recurrent and capital funding provides continuing and inexplicable – at times absurd – examples of this imbalance.

The review found that public funding arrangements need to reflect the nature of the educational challenges faced by a system or school. My School shows that the distribution of funding does reflect educational challenges created by school location; but it also shows that increases in recurrent funding across the different sectors in recent years do not reflect any notion that funding should pay attention to need.

The review found that funding should aim to ensure that differences in educational outcomes are not the result of non-school factors. By constructing socio-educational gradients from My School data we show that our “equity slope” is worsening in most locations and for different levels of schooling. We are simply heading in the wrong direction.

The review found that increased concentration of disadvantaged students in certain schools is having a significant impact on educational outcomes. The evidence from My School suggests that this concentration has increased in recent years and may explain the increasing gap between high and low ICSEA schools.

The changes we have been able to illustrate using My School data have not taken place over decades. They have occurred across the very same years during which the Gonski review proceeded and reported, then was variously ignored, cherry-picked, partially implemented, and in relative terms largely abandoned.

What the Gonski review panel found to be bad about our framework of schools, we find to be worse. •

A second paper – available here – provides more background on what My School shows about Australia's steepening equity gradient.