Are schools gaming NAPLAN to manipulate academic performance?


Schools may be strategically manipulating participation in the National Assessment Program – Literacy and Numeracy (NAPLAN) to improve their public performance results, according to new research from UNSW Business School.

The research found that schools with lower initial test scores relative to other schools teaching similar student bodies show higher rates of non-participation in NAPLAN tests following the public release of performance data on the My School website.

This website, which is maintained by the Australian Curriculum, Assessment and Reporting Authority, is designed to provide transparency and accountability of Australia’s school education system through the publication of nationally consistent school-level data.

However, the research paper, Unintended consequences of school accountability reforms: Public versus private schools, published in the Economics of Education Review, found that certain schools are more likely to exclude lower-performing students – a move which would enhance a school’s overall test results.

The study said this increase in non-participation was largely driven by formal parental withdrawal, a process that parents can initiate by applying directly to the school principal.

Students may be withdrawn from the testing program by their parent/carer for reasons such as “religious beliefs or philosophical objections to testing”, according to the NAPLAN website, which states that this is decided by parents/carers in consultation with their child’s school.

“We found that the fraction of students withdrawn from testing in years after the 2010 launch of My School went up, while the fraction who were absent or exempt from testing remained roughly steady – with poorly-performing students far more likely than other students to be withdrawn from testing,” said Gigi Foster, a Professor in the School of Economics at UNSW Business School, who co-authored the research paper together with University of Melbourne Associate Professor Mick Coelli.

“We further find that this increase in withdrawal rates occurred in schools that were initially reported on My School to be poor performers relative to peer schools,” she said.

This behaviour, commonly referred to as “gaming the system,” undermines the objective of NAPLAN to provide a fair assessment of student performance across Australia, Prof. Foster asserted.

Private schools are more likely to game the system

The research identified a sector-specific response, with private schools more likely to adjust their testing pools compared with public or Catholic schools. This indicates a higher tendency among private schools to manipulate participation rates to maintain their reputations.

More specifically, the research found that students at private schools are more than twice as likely as their peers at state schools to be withdrawn from NAPLAN tests in subsequent years if they received low scores in previous years.

“We provide some suggestive evidence that these higher rates of withdrawal in lower-performing schools were more prominent in independent private schools – which are famous for charging parents a pretty penny for the privilege of enrolling their kids – than in public schools,” said Prof. Foster.

“These findings are consistent with a situation in which the increased withdrawal is used as a tactic to manipulate the image of a school’s quality: excluding more poor performers from testing makes the school look better than it otherwise would look on My School.”

How parents play a role in NAPLAN testing participation

The research paper noted that parents can formally apply to the school principal for the withdrawal of their child from NAPLAN testing “in the manner specified by the local testing authority” (usually the state’s department of education).

NAPLAN testing protocols indicate that withdrawals are intended to address issues such as religious beliefs and philosophical objections to testing, but the research paper observed that providing a considered explanation for withdrawal appears to be unnecessary.

Furthermore, the process of applying for withdrawal was not actively promoted by governments or the testing authority. “We believe that knowledge of the process was provided to parents directly by schools,” said the researchers, who noted that parents apply for withdrawal directly to the school, not to the testing authority or government.

Parents have reported pressure from schools to withdraw children from NAPLAN testing in surveys conducted by state education authorities, while the popular press have also reported on claims by parents that schools have instructed children not to sit the NAPLAN tests “in order to boost their chances of obtaining higher overall scores”, the research paper stated.

“The volume of such reports led the NSW Minister for Education to warn that principals or teachers found encouraging children not to sit the tests may face disciplinary action.”

Unintended consequences of manipulating NAPLAN

Prof. Foster explained that she and her co-author, A/Prof. Coelli, were familiar with overseas research evaluating government programs designed to make school performance more transparent to parents, and thereby to raise the accountability of schools.

“The idea of such programs is to help improve educational outcomes, since with more information, parents would be expected to select higher-performing schools for their children to attend, thereby exerting competitive pressure on low-performing schools to either lift their game or close,” she said.

However, she said, these types of programs could have unintended consequences if not designed with careful thought. Because low-performing pupils at poorly performing schools were more likely not to sit the NAPLAN tests, Prof. Foster said the My School program may have had the unintended consequence of hiding from public view the weak literacy and numeracy skills of some of Australia’s lowest-achieving students.

“This is somewhat ironic since the whole point of a school accountability program is generally to improve the education that students – and particularly disadvantaged students, whose parents may have few sources of information about school quality apart from the program – receive,” she said.

Data analysis and policy implications

In conducting their analysis, the researchers used data from the My School website spanning 2008 to 2015. This data set included standardised test scores and participation rates for a balanced panel of 6981 schools over eight years.

While the intention behind publicising school performance data was to drive improvements and transparency, Prof. Foster said, the results indicated that it could also incentivise undesirable behaviours.

“The government could fix a maximum percentage of students who may be excluded from testing on any given testing day, although the monitoring costs of this may be prohibitive and it may result in further unintended consequences, such as schools reducing the number of allowed exclusions amongst average performers – perhaps then creating pressure on sick or injured students to sit the tests – in order to create more ‘slots’ for weak performers to sit out the tests,” she said.

“Another option would be for the My School site to report the percentages of students excluded from testing at each school, while adding a note informing parents that a comparatively high fraction of students excluded from testing may signal that a school’s true average performance is lower than what it appears to be on the My School tables.”

 


