
QCA board members Beasley, Kirkham and Gould conducted the review, with a DfES representative.
The investigation into this year's English tests for 624,000 teenagers in England concludes that "the whole test operations process is not robust in any sense". The review team members said it had become clear to them "that teachers and markers have lacked confidence in the Key Stage 3 English test for several years".
 | REVIEW TEAM: Mike Beasley, business consultant and former MD of Jaguar; Sue Kirkham, head of Walton High School; Ed Gould, master of Marlborough College (all QCA board members); Hilary Emery, Children's Services Improvement Adviser, DfES |
They make recommendations about next year's tests, which they say are now "high stakes" and should be treated as "a high-intensity project". Among other things, their report reveals the scale of the problem was far worse than previously acknowledged.
The report states: "Lack of programme management and leadership were the primary root causes of the service delivery failure".
It says there was "a lack of effective programme and project management and an absence of overall leadership of the test development and delivery process from start to finish".
Poor communication resulted in poor decisions and actions.
There was "no evidence of any sense of collective responsibility to achieve a positive outcome until failure was both obvious and irreversible". Evidence suggested the Department for Education and Skills "might usefully have adopted a more 'hands-on' role".
The review recommends changing the process of checking pupils' scripts to make results more reliable before they are sent to schools.
But its main recommendation is that there needs to be a new management structure for the 2005 tests.
Tests
Many schools were sent the test papers late. Some still had not received them on the actual test days, 6 and 7 May.
They should have had associated administrative materials - a guide, mark sheets and stationery - by 23 April but it appeared deliveries did not begin until 4 May. Again, some did not get them before the tests.
The marking process changed this year, with separate markers for the reading and writing papers.
Results
Schools had been warned in advance that results would be sent to them a week later than usual.
But as the end of term arrived, many schools were saying they had not received the results.
The review reveals that only 77% of schools had received their results by the revised deadline of 13 July. In other words, some 1,035 had not.
Pearson - running the Data Collection Agency - decided late on to deliver results via a website.
But schools were sent wrong access codes and passwords. And they were not told the website - when it was not swamped - was being updated "live" so results might be incomplete and unreliable.
Results were also incomplete because some scripts had not been "borderlined", partly because faulty software wrongly identified too many as borderline.
This involves checking whether scripts up to three marks below the expected standard have in fact met the required level.
So some schools had to return scripts and did not get the results until well into September.
The review team said schools should not have been required to return scripts in this way. It also wonders why scripts three marks above the standard are not similarly checked.
Schools
The report says: "Anger, frustration and disillusionment underpinned virtually all of the submissions received from schools, and LEAs [local education authorities]".
This reflected the "increasingly high-stakes role" of the tests, results of which have "the potential to affect teachers' careers".
Teachers and subject leaders "feel that the delivery failure undermines their professionalism in the eyes of parents".
Although the assessment agency apologised in July, "there is a feeling among teachers that there has not been an appropriate recognition of the hours they have had to spend on checking papers, writing appeal letters, making phone calls chasing papers, downloading results and remarking papers".
Marking
Schools had "a perception that the marking and data collection processes had not been carried out accurately by the external marking agency (AQA) or the Data Collection Agency (Pearson) in the first place".
The vast majority of the markers, employed by the AQA exam board, "are extremely professional", the report said. The review team believed teachers' confidence was eroded by data problems, not by the quality of the marking.
But "a perception spread that the marking process was flawed and some markers joined the public debate to defend their professional integrity".
Some had been recruited late and some did not get training materials - partly because they were delivered late and partly because AQA sent some to the wrong addresses.
The QCA sent out test packs with the wrong mark scheme.
The "historic inability to recruit enough markers" was a key issue.
"It is possible that this annual difficulty in recruiting markers has implications for marking quality although the review team found it difficult to identify any evidence that clearly establishes marking quality as an issue."
Marking quality is judged by the number of appeals from schools, but the review says this is "unreliable and unsatisfactory".
The government view is said to be that there has been "significant reputational damage" to the tests.
Publication of national results has been delayed and the Key Stage 3 league tables have been postponed until next March.