This article by Steve Powers from Birmingham University provides a useful summary of information that is already available, primarily from government sources, and of issues the data raise in relation to deaf education. In his conclusion Steve highlights the need for more accurate and up-to-date information to be used at all levels (within schools, by Local Authorities and nationally) so that we can identify good practice, set appropriate targets, raise attainment and measure more effectively the outcomes for the deaf children and young people with whom we work.
The aim of this short discussion paper is to promote discussion about government data on pupil attainment now available to schools and local authorities. Therefore it is limited in scope. It follows discussions with a number of people struggling to make sense of the data on deaf pupils.
I have no doubt that the profession needs to fully engage with this issue. In the context of rising costs there appears to be increasing criticism of schools and local authorities over their monitoring and target setting for pupils with special educational needs; for example, in a recent survey of provision Ofsted (2006) found ‘no agreement about what constituted good progress for pupils with learning difficulties and disabilities (LDD)’; and that ‘schools rarely questioned themselves as rigorously about the rate of progress for LDD pupils as they did for pupils who did not have LDD’ (p2). Similarly, in a recent report on out of authority placements the Audit Commission (2007) calls for better systems for measuring the cost-effectiveness of provision, to link resources with progress of individual pupils against outcome-based targets (p4).
We know there is a lot of good work going on in our profession around monitoring progress and target setting, and it is possible that what Ofsted reports is less relevant to educators of deaf children than to others. However, I do feel we need to develop a clear position on how we use the government data now available (for example through the Pupil Achievement Tracker and RAISEonline). If the profession does not do this there is the danger that something will be imposed on us.
The information in this paper has been gathered mainly from government websites and relevant literature. Also, opinions have been sought from a small number of heads of services and schools (through convenience sampling). Much of the wording on CVA, PAT and RAISEonline is taken directly from government websites.
See Figure 1.
The Pupil Level Annual School Census (PLASC) is completed each January and provides information on a range of pupil characteristics. It has enabled the development of Contextual Value Added measures of pupil progress.
| | Source | Data and coverage | Limitations |
|---|---|---|---|
| 1 | School performance tables (DfES), now known as Achievement and Attainment Tables | SATs (all key stages); value added scores; by school and local authority | Data available on special schools, but no data at individual pupil level |
| 2 | PLASC, PANDA, PAT, RAISEonline, Contextual Value Added measures | By individual pupil; many factors measured | In PLASC deaf children are identified through 'hearing impairment' as primary or secondary type of SEN; expectations of progress are based on the hearing school population |
| 3 | Research reports: GCSE results of deaf pupils in mainstream schools in England | By individual pupil; reasonably good return rate | Scotland only; incomplete data; project now ended; incomplete data on attainment |
| 4 | PIPS, MidYIS, YELLIS monitoring systems, Curriculum, Evaluation and Management Centre (CEM), University of Durham | Value added data for school self-evaluation and monitoring pupil progress | Reported to be expensive; no information on how widely used by Teachers of the Deaf |
| 5 | BATOD Survey | Has collected data on attainment | Never reported because of low return rates |
The government’s approach to value added measures was determined after a government-sponsored project on the topic which reported in 1997. The main points made in ‘The Value Added National Project. Report to the Secretary of State’ (SCAA, 1997) included the following.
The report recommended that, ‘SCAA should consider establishing a central database on pupil performance, possibly using unique reference numbers for individual pupils’ (p6). This has been done through PLASC.
The report states that in the case of cohorts of fewer than 30 pupils no statistically significant conclusions can be drawn from one year’s data, and in these cases a three-year average should be reported. This point is clearly very relevant to educational programmes for deaf children, including most special schools.
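To make the rule concrete, here is a minimal sketch in Python (the scores and the `report_value_added` helper are invented for illustration; the 30-pupil threshold and the three-year averaging are as the SCAA report recommends):

```python
# Sketch of the SCAA recommendation: for cohorts of fewer than 30
# pupils, report a three-year average rather than a single year's score.

def report_value_added(yearly_scores, cohort_sizes, threshold=30):
    """yearly_scores: VA scores for the last three years (oldest first);
    cohort_sizes: pupil numbers for the same years."""
    latest_score, latest_cohort = yearly_scores[-1], cohort_sizes[-1]
    if latest_cohort >= threshold:
        # Large cohort: a single year's score is statistically usable.
        return latest_score
    # Small cohort: average over the three years to damp year-on-year noise.
    return sum(yearly_scores) / len(yearly_scores)

# Hypothetical figures for a small school for deaf pupils:
print(report_value_added([+3.0, -1.5, +9.0], [14, 12, 16]))  # three-year average
print(report_value_added([+3.0, -1.5, +9.0], [34, 31, 33]))  # single-year score
```

With a cohort of 16 the sketch reports the average of the three years rather than the (much higher) latest score, which is exactly why a single year's figure for a small school for the deaf can be atypical.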
The report acknowledges that the present system of National Curriculum assessment is not appropriate to some pupils with SEN and recommends separate research into how to provide value added information on these pupils. It is not clear whether or not this was done.
Following the Value Added National Project, which it had commissioned, in 1998 the government conducted a national pilot of value added measures in 200 secondary schools, including 10 special schools. The performance indicators provided on each school in the ‘Value Added Pilot’ (DfEE, 1998) included the value added score (‘school progress measure’) for the whole school. This showed the value added between National Curriculum Key Stage 3 tests at 14 years and GCSEs/GNVQs scores at the end of Key Stage 4 at 16 years.
Out of the 10 special schools in the pilot two were schools for the deaf. The value added scores for these schools (A and B) are shown in Table 1 compared with the average for all schools in the pilot.
Table 1 shows that both schools for the deaf scored better than the national average. The value added scores for the other eight special schools were -4.4, -3.6, -13.5, -8.6, -11.7, +2.9, -10.3, and ‘NS’ (presumably meaning not significant or not appropriate). That is, the two schools for the deaf were the best performers of all the special schools on this measure. School (B) also appeared to be the best of all 200 schools in the pilot (the second highest value added score was much lower at +7.1). This result is striking.
Table 1 The key findings of the value added pilot on the two special schools for deaf children compared to the national average
| | Value added score |
|---|---|
| England average for all schools | -0.5 |
The high value added scores for the two special schools for deaf pupils might not represent their performance over a period of years. The recommendation in the Value Added National Project discussed above was that where cohorts are of fewer than 30 pupils, school results should not be reported on a yearly basis. The results for School (A) were for a cohort of 16 and so might be atypical of the school, but the results for School (B) were based on a cohort of 34, which, by implication, should be representative of the school.
One other possible explanation of the high value added scores of the two schools for the deaf is that the scores reflect different rates of progress across different phases of pupils’ school careers. Deaf pupils might be relatively slow starters because of delayed language, but relatively fast finishers. Therefore progress between Key Stage 3 and GCSE might be greater for deaf pupils than others. This effect has been found with pupils who do not have English as their first language (Thomas and Mortimore, 1995).
In the report of the Value Added Pilot the DfEE suggested that ‘value added comparisons between mainstream and special schools are questionable because of the different character of the special schools’ intake’ (DfEE, 1998). In fact, according to the points just raised it seems that not only should special schools not be compared with mainstream schools on value added scores but also that the right comparison to make might be with special schools of the same type. That is, that special schools for deaf pupils should only be compared with each other. However, given the small number of such schools and their very different characters (for example, there are ‘oral’ schools, and schools where the dominant language is British Sign Language) it is questionable whether such an approach would be useful. Furthermore, such considerations exclude the majority of deaf pupils, those in mainstream programmes.
Value added measures take into account prior attainment, the biggest single predictor of pupil results. However, a number of other factors which are outside a school's control, such as gender, mobility and levels of deprivation, have also been observed to impact on pupil results, even after allowing for prior attainment. CVA goes a step further than VA measures by taking these factors into account.
Characteristics for which adjustments are made: 2006 model
| Characteristic | Adjustment |
|---|---|
| Gender | Allows for the different rates of progress made by boys and girls by adjusting predictions for females. |
| Special Educational Needs | Adjustments for pupils at School Action, and for those at School Action Plus or with a statement. |
| Ethnicity | Adjustments for each of the 19 ethnic groups recorded in PLASC. |
| Eligible for Free School Meals | Pupils who are eligible for free school meals. The size of this adjustment depends on the pupil’s ethnic group, because the data show that the size of the FSM effect varies between ethnic groups. |
| First Language | Adjustment for pupils whose first language is other than English. The size of this adjustment depends on the pupil’s prior attainment, because the effect of this factor tends to taper, with the greatest effect for pupils starting below expected levels and lesser effects for pupils already working at higher levels. |
| Mobility | Pupils who have moved between schools at non-standard transfer times. |
| Age | Looks at a pupil’s age within year based on their date of birth. |
| In Care | Pupils who have been ‘In Care’ at any time whilst at this school. |
| IDACI | A measure of deprivation based on pupil postcode. |
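As a rough illustration of the principle behind these adjustments, the following toy sketch predicts an outcome from prior attainment plus contextual adjustments and takes the difference from the actual result. All coefficients, factor names and pupil figures here are invented; the real DfES model is a far more sophisticated statistical model.

```python
# Toy illustration of a CVA-style calculation: predict a pupil's outcome
# from prior attainment plus adjustments for contextual factors, then take
# the difference between actual and predicted as the value added.
# All coefficients and pupil data below are invented for illustration.

ADJUSTMENTS = {          # hypothetical point adjustments to the prediction
    "female": 1.0,       # girls tend to make faster progress
    "fsm": -2.0,         # eligible for free school meals
    "sen": -3.0,         # on School Action Plus or with a statement
    "eal": -1.5,         # first language other than English
}

def predict_outcome(prior_attainment, characteristics):
    # Prior attainment is the biggest single predictor; contextual
    # factors shift the prediction up or down.
    prediction = 2.0 * prior_attainment          # invented slope
    for factor in characteristics:
        prediction += ADJUSTMENTS.get(factor, 0.0)
    return prediction

def contextual_value_added(actual, prior_attainment, characteristics):
    # Positive CVA: the pupil did better than similar pupils nationally.
    return actual - predict_outcome(prior_attainment, characteristics)

print(contextual_value_added(actual=58.0, prior_attainment=30.0,
                             characteristics=["fsm", "sen"]))
```

The point of the sketch is that a pupil is compared not with the national average outright but with the predicted outcome for pupils with the same prior attainment and characteristics; factors specific to deafness are, as noted below, not among those adjusted for.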
Some external factors which are commonly thought to have some impact cannot be included because there is no reliable national data available, e.g. parental education status/occupation. For deaf pupils, there are further significant factors omitted, e.g. onset of deafness, additional difficulties, hearing status of parents.
CVA measures past performance over a given period of time and allows comparisons to be made given what is known about the progress made by pupils during that time and with the same characteristics.
CVA has been developed for use across Government, wherever there is a need to assess school effectiveness. CVA is used for:
As this list shows, it is clear that CVA is intended to support both the accountability framework, and self-evaluation and improvement at a school and pupil level.
PAT has ceased to be upgraded following its merger with the PANDA into RAISEonline from autumn 2006.
PAT software allows schools and LEAs to import and analyse their own pupil performance data against national performance data. There are four main areas of analysis possible in the PAT:
School level analyses, comparing the school’s results in the key stages and optional tests against national comparatives.
Pupil level value added, comparing the progress of individual pupils or groups of pupils between key stages with progress nationally taking account of prior attainment and other contextual factors.
Target setting, assisting the school to set targets for individual pupils in the light of projections based on progress by similar pupils in the best performing schools with a similar baseline.
Question level analysis, allowing schools to analyse, by question, attainment target and topic, how their pupils performed in the national curriculum tests and optional tests compared to performance nationally.
PAT produces a range of reports from the four areas of analysis outlined above, including the new Schools Improvement Summary Report.
The PAT contains national data, previously known as the Autumn Package. This national data is also available on this website.
RAISEonline (reporting and analysis for improvement through school evaluation) is a web-based system for distributing school performance data, replacing PANDA (performance and assessment) and PAT (pupil achievement tracker) data in 2007. Schools will be able to drill down to individual pupil level as well as compare with local information. The system currently holds data for all key stages from 2005 and 2006 which has been validated (checked) by schools.
RAISEonline will provide the opportunity to look at contextual value added (CVA) progress to Key Stage 2, 3 and 4. There will also be a target-setting section of the system which will enable the setting and moderating of pupil targets.
My information is that although RAISEonline is planned for special schools it is recognised that the small number of pupils in many of these schools will make comparisons difficult.
They use government SAT scores and national examination data, and supplement this with their own annual/bi-annual assessments of language and literacy. P levels are used with children and young people with deafness/hearing impairment and additional difficulties.
They collect SAT/GCSE data every year on every child/young person at each key stage, and have this going back several years. Therefore, they are able to identify levels in their own area for deaf/hearing impaired children and young people (with no additional needs) at different key stages; and to identify patterns and trends. They compare individual pupil results with the school’s average, the local authority’s average, and the national average.
They can relate findings to the effect of individual schools – for example some pupils who do not achieve expected outcomes might be in a school with a ‘hostile environment’; but in these cases the deaf/hearing impaired children and young people often do better than their hearing peers and siblings.
They can also relate the findings to the effect of different teachers of the deaf who are supporting – the head of service acknowledges this can be seen as a threat.
Overall they find that outcomes at key stage 1 are very good, there is a dip at key stage 2, but by key stage 4 results rise again to at least national and local averages.
The ‘acid test’ is whether deaf/hearing impaired children and young people (with no additional difficulties) are functioning at national target levels or above.
She finds the government data wrong and simplistic.
Their service uses P levels for value added; PIPS for a good baseline compared with other children’s levels of deafness and for predictions of SATs; and MidYIS for a good Y7 profile including non-verbal scores and learned skills, good predictions for SATs and GCSE, and excellent final value added. But this is not cheap.
At the moment their service is not involved with the government data. Two officers in the LA are working on the data across all SEN to see how it can be used, before passing it on to the heads of service.
The head was delighted when the school was reported to be in the top 5% of schools in the country on VA measures. But previously it was in the bottom 40%!
A number of factors influence the VA score for the school, essentially related to the nature and size of each KS cohort.
In summary this headteacher thinks the government’s VA scores are of limited or perhaps no use to the school. Rather they use their own assessments (e.g. reading, BPVS, TROG) and plot progress for each individual pupil across the years. The head wonders whether it would be useful for schools and services to pool their data to create norms for the deaf child population.
He thinks the government data is of no use at all. It is impossible for the school to benchmark with other special schools because the populations are so different. Also, the school’s population varies so much from year to year.
He says the government’s P-level data is of some use but the scales have been devised mainly for children with learning difficulties and therefore are not so relevant to deaf pupils.
The school ‘ploughs its own furrow’ and for the last several years has set its own individual pupil targets for every subject.
His school has a large number of pupils with special needs, including learning difficulties, visual impairment and hearing impairment.
The head is struggling to make sense of measured progress from government data on CVA. One problem is that CVA appears to include a ‘very crude measure’ of SEN.
‘Expectations of growth’ are based on the whole population of school pupils – but he thinks this is unreasonable for some children with special needs. For example, deaf children struggle increasingly as the curriculum becomes more abstract and the vocabulary becomes more technical.
He thinks the government’s VA measures are ‘inadequate’.
A colleague in the SEBD field has been working with others in looking at the ECM outcomes framework to see to what extent it can be used to ‘measure’ progression and whether it could be used in addition to CVA measures to aid school improvement. They feel they have made significant strides in this, and HMI is looking at some of their work to see what, if any, of it can be used across SEN areas.
Audit Commission (2007) Out of Authority Placements for Special Educational Needs. www.audit-commission.gov.uk

DfEE (Department for Education and Employment) (1998) Value Added Project. www.dfee.gov.uk/performance/vap-98.htm

Ofsted (2006) Inclusion: Does It Matter Where Pupils Are Taught? www.ofsted.gov.uk

SCAA (School Curriculum and Assessment Authority) (1997) The Value Added National Project: Report to the Secretary of State. London: SCAA

Thomas, S. and Mortimore, P. (1995) Comparison of value added models for secondary school effectiveness. Paper presented at the annual conference of the British Educational Research Association, Bath, 14-17 September 1995