BATOD
The British Association of Teachers of the Deaf
Promoting Excellence in Deaf Education

As you will be aware, BATOD and a range of partners are working together to establish an effective means of collecting national data about educational provision for deaf children and young people and about the outcomes for deaf children.  

This article by Steve Powers from Birmingham University provides a useful summary of information that is already available, primarily from government sources, and of issues raised by the data in relation to deaf education. In his conclusion Steve highlights the need for more accurate and up to date information to be used at all levels within schools, Local Authorities and nationally so that we can identify good practice, set appropriate targets, raise attainment and measure more effectively the outcomes for the deaf children and young people with whom we work.

Alison Weaver
BATOD President
September 2007  

The educational attainments of deaf pupils:
a discussion paper on data currently available

Stephen Powers

University of Birmingham
 
  1. AIM

    The aim of this short discussion paper is to promote discussion about government data on pupil attainment now available to schools and local authorities. Therefore it is limited in scope. It follows discussions with a number of people struggling to make sense of the data on deaf pupils.  

    I have no doubt that the profession needs to fully engage with this issue. In the context of rising costs there appears to be increasing criticism of schools and local authorities over their monitoring and target setting for pupils with special educational needs; for example, in a recent survey of provision Ofsted (2006) found ‘no agreement about what constituted good progress for pupils with learning difficulties and disabilities (LDD)’; and that ‘schools rarely questioned themselves as rigorously about the rate of progress for LDD pupils as they did for pupils who did not have LDD’ (p2). Similarly, in a recent report on out of authority placements the Audit Commission (2007) calls for better systems for measuring the cost-effectiveness of provision, to link resources with progress of individual pupils against outcome-based targets (p4).  

    We know there are lots of good things going on in our profession around monitoring progress and target setting - and it’s possible that what Ofsted reports is less relevant to educators of deaf children than it is to others. However, I do feel we need to develop a clear position on how we use the government data now available (for example through the Pupil Achievement Tracker and RAISEonline). If the profession does not do this there is a danger that something will be imposed on us.

  2. SOURCES

    The information in this paper has been gathered mainly from government websites and relevant literature. Also, opinions have been sought from a small number of heads of services and schools (through convenience sampling). Much of the wording on CVA, PAT and RAISEonline is taken directly from government websites.

  3. ATTAINMENT DATA CURRENTLY AVAILABLE

    See Figure 1.

  4. PLASC

    The Pupil Level Annual School Census (PLASC) is completed each January and provides information on a range of pupil characteristics. It has enabled the development of Contextual Value Added measures of pupil progress.

  5. VALUE ADDED MEASURES

    Value added (VA) measures were introduced into the secondary Achievement and Attainment Tables in 2002, to give a better and fairer measure of school effectiveness than absolute results alone. VA allows meaningful comparisons to be made between schools with different intakes, by taking into account prior attainment, the biggest single predictor of pupil results.
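    To make the basic idea concrete, the following sketch (invented data; not the DfES calculation itself) treats a pupil-level value added score as the difference between a pupil's actual result and the result predicted from prior attainment, averaged across a school's cohort:

    ```python
    # Illustrative sketch only: predict each pupil's outcome from prior
    # attainment using national (prior, outcome) pairs, then average the
    # (actual - predicted) differences across the school's cohort.
    from statistics import mean

    def predict_from_prior(prior_score, national_pairs):
        """Predict an outcome as the mean outcome of pupils nationally
        with the same prior attainment score (hypothetical data)."""
        matches = [out for prior, out in national_pairs if prior == prior_score]
        return mean(matches)

    def school_value_added(pupils, national_pairs):
        """Mean of (actual - predicted) across the school's cohort."""
        return mean(actual - predict_from_prior(prior, national_pairs)
                    for prior, actual in pupils)

    # Hypothetical national pairs and one school's (prior, actual) cohort:
    national = [(3, 40), (3, 44), (4, 50), (4, 54), (5, 60), (5, 64)]
    school = [(3, 45), (4, 55), (5, 60)]
    print(round(school_value_added(school, national), 1))
    ```

    A positive score means the cohort did better, on average, than similar pupils nationally; a negative score means worse. The real VA models are considerably more sophisticated, but this is the principle being applied.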

    Figure 1 Attainment data currently available

    1 School performance tables (DfES), now known as Achievement and Attainment Tables
      Data: SATs (all key stages), GCSEs, A Levels, NVQs, value added scores
      Notes: by school and local authority; data available on special schools but no data at individual pupil level
    2 Government data (PLASC, PANDA, PAT, RAISEonline, Contextual Value Added measures)
      Data: by individual pupil; many factors measured
      Notes: in PLASC deaf children are identified through ‘hearing impairment’ as primary or secondary type of SEN; expectations of progress are based on the hearing school population
    3 Research reports
      Powers, 1995, 1996 – GCSE results of deaf pupils in mainstream schools in England; by individual pupil; reasonably good return rate
      Achievements of Deaf Pupils in Scotland, 2003 – Scotland only; incomplete data; project now ended
      University of Durham, 2004 – incomplete data on attainment
    4 PIPS, MidYIS, YELLIS monitoring systems, Curriculum, Evaluation and Management Centre (CEM), University of Durham
      Data: value added data for school self evaluation and monitoring pupil progress
      Notes: reported to be expensive; no information on how widely used by Teachers of the Deaf
    5 BATOD Survey
      Notes: has collected data on attainment but this has never been reported because of low return rates

    The government’s approach to value added measures was determined after a government sponsored project on the topic which reported in 1997. The main points made in ‘The Value Added National Project. Report to the Secretary of State’ (SCAA, 1997) included that:

    1. prior attainment is the best predictor of achievement
    2. a register of individual pupils is recommended

      The report recommended that, ‘SCAA should consider establishing a central database on pupil performance, possibly using unique reference numbers for individual pupils’ (p6). This has been done through PLASC.

    3. value added measures should be reported for each major subject at all key stages
    4. attention should be given to pupil cohort size when interpreting the data

      The report states that in the case of cohorts of less than 30 pupils no statistically significant conclusions can be drawn from one year’s data and in these cases a three year average should be reported. This point is clearly very relevant to educational programmes for deaf children, including most special schools.

    5. pupils with special educational needs might need special arrangements

    The report acknowledges that the present system of National Curriculum assessment is not appropriate to some pupils with SEN and recommends separate research into how to provide value added information on these pupils. It is not clear whether or not this was done.  
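    The cohort-size recommendation can be sketched in code. The report does not specify how the three-year average should be formed; weighting each year's score by its cohort size is one plausible reading, assumed here:

    ```python
    # Sketch of the SCAA recommendation: report a single-year value added
    # score only when the cohort has 30 or more pupils; otherwise pool up
    # to three years. The cohort-size weighting is an assumption.
    def reportable_score(yearly, min_cohort=30):
        """yearly: list of (cohort_size, va_score) tuples for up to three
        years, most recent last. Returns the score to report."""
        size, score = yearly[-1]
        if size >= min_cohort:
            return score  # one year's data is statistically sufficient
        # Small cohort: average across the years, weighted by cohort size.
        total = sum(s for s, _ in yearly)
        return sum(s * v for s, v in yearly) / total

    # A small school: cohorts of 12, 15 and 10 over three years.
    print(reportable_score([(12, -2.0), (15, 1.0), (10, 4.0)]))
    ```

    For schools and services for deaf pupils, where key stage cohorts are almost always below 30, this rule means that single-year scores should rarely, if ever, be quoted on their own.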

    Following the Value Added National Project, which it had commissioned, in 1998 the government conducted a national pilot of value added measures in 200 secondary schools, including 10 special schools. The performance indicators provided on each school in the ‘Value Added Pilot’ (DfEE, 1998) included the value added score (‘school progress measure’) for the whole school. This showed the value added between National Curriculum Key Stage 3 tests at 14 years and GCSEs/GNVQs scores at the end of Key Stage 4 at 16 years.  

    Out of the 10 special schools in the pilot two were schools for the deaf. The value added scores for these schools (A and B) are shown in Table 1 compared with the average for all schools in the pilot.  

    Table 1 shows that both schools for the deaf scored better than the national average. The value added scores for the other 8 special schools were: -4.4, -3.6, -13.5, -8.6, -11.7, +2.9, -10.3, and ‘NS’ (presumably meaning not significant or not appropriate). That is, the two schools for the deaf were the best performers of all the special schools on this measure. School (B) also appeared to be the best of all 200 schools in the pilot (the second highest value added score was much lower, at +7.1). This result is striking.  

    Table 1 The key findings of the value added pilot on the two special schools for deaf children compared to the national average

    Value added score
    School (A) +4.1
    School (B) +21.6
    England average for all schools -0.5
     

    The high value added scores for the two special schools for deaf pupils might not represent their performance over a period of years. The recommendation in the Value Added National Project discussed above was that where there are cohorts of less than 30 pupils school results should not be reported on a yearly basis. The results for School (A) were for a cohort of 16 and might therefore be atypical of the school, but the results for School (B) were based on a cohort of 34, which by that criterion should be representative of the school.  

    One other possible explanation of the high value added scores of the two schools for the deaf is that the scores reflect different rates of progress across different phases of pupils’ school careers. Deaf pupils might be relatively slow starters because of delayed language, but relatively fast finishers. Therefore progress between Key Stage 3 and GCSE might be greater for deaf pupils than others. This effect has been found with pupils who do not have English as their first language (Thomas and Mortimore, 1995).  

    In the report of the Value Added Pilot the DfEE suggested that ‘value added comparisons between mainstream and special schools are questionable because of the different character of the special schools’ intake’ (DfEE, 1998). In fact, according to the points just raised it seems that not only should special schools not be compared with mainstream schools on value added scores but also that the right comparison to make might be with special schools of the same type. That is, that special schools for deaf pupils should only be compared with each other. However, given the small number of such schools and their very different characters (for example, there are ‘oral’ schools, and schools where the dominant language is British Sign Language) it is questionable whether such an approach would be useful. Furthermore, such considerations exclude the majority of deaf pupils, those in mainstream programmes.  

  6. CONTEXTUAL VALUE ADDED (CVA)

    Value added measures take into account prior attainment, the biggest single predictor of pupil results. However, a number of other factors which are outside a school's control, such as gender, mobility and levels of deprivation, have also been observed to impact on pupil results, even after allowing for prior attainment. CVA goes a step further than VA measures by taking these factors into account.
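    The logic of CVA can be sketched as follows. This is a minimal illustration of the idea, not the DfES model: the prediction from prior attainment is adjusted by an amount for each contextual factor, and CVA is the actual result minus this fuller prediction. All the adjustment values here are invented for illustration.

    ```python
    # Hypothetical adjustments (in score points) for contextual factors.
    # The real 2006 model uses many more factors and fitted coefficients.
    ADJUSTMENTS = {
        "female": 1.5,        # girls progress faster on average
        "fsm": -3.0,          # eligible for free school meals
        "summer_born": -1.0,  # youngest in the year group
    }

    def cva_score(actual, prior_prediction, characteristics):
        """CVA = actual - (prediction from prior attainment
                           + adjustments for contextual factors)."""
        expected = prior_prediction + sum(ADJUSTMENTS[c] for c in characteristics)
        return actual - expected

    # A pupil predicted 50 from prior attainment alone, who is eligible
    # for free school meals and summer-born:
    print(cva_score(49, 50, ["fsm", "summer_born"]))
    ```

    Under plain VA this pupil would score -1 (49 against a prediction of 50); once the contextual factors lower the expectation to 46, the same result shows positive progress. That shift in interpretation is exactly what CVA is designed to achieve.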

    Characteristics for which adjustments are made: 2006 model

    Gender – allows for the different rates of progress made by boys and girls by adjusting predictions for females.
    Special Educational Needs – pupils who are at School Action, and those who are at School Action Plus or have a statement.
    Ethnicity – adjustments for each of the 19 ethnic groups recorded in PLASC.
    Eligible for Free School Meals – pupils who are eligible for free school meals. The size of this adjustment depends on the pupil’s ethnic group, because the data demonstrate that the size of the FSM effect varies between ethnic groups.
    First Language – adjustment for pupils whose first language is other than English. The size of this adjustment depends on the pupil’s prior attainment, because the effect of this factor tends to taper, with the greatest effect for pupils starting below expected levels and lesser effects for pupils already working at higher levels.
    Mobility – pupils who have moved between schools at non-standard transfer times.
    Age – looks at a pupil’s age within year based on their date of birth.
    In Care – pupils who have been ‘In Care’ at any time whilst at this school.
    IDACI – a measure of deprivation based on pupil postcode.

    Some external factors which are commonly thought to have some impact cannot be included because there is no reliable national data available, e.g. parental education status/occupation. For deaf pupils, further significant factors are omitted, e.g. onset of deafness, additional difficulties, hearing status of parents.  

    CVA measures past performance over a given period of time and allows comparisons to be made given what is known about the progress made by pupils during that time and with the same characteristics.  

    CVA has been developed for use across Government, wherever there is a need to assess school effectiveness. CVA is used for:

    • Achievement and Attainment Tables
    • Ofsted Inspections
    • The Pupil Achievement Tracker (PAT)
    • PANDA
    • RAISEonline (which is replacing the PAT and PANDA)
     

    As this list shows, it is clear that CVA is intended to support both the accountability framework, and self-evaluation and improvement at a school and pupil level.

  7. THE PUPIL ACHIEVEMENT TRACKER (PAT)

    The PAT has ceased to be upgraded following its merger with the PANDA into RAISEonline from autumn 2006.   

    PAT software allows schools and LEAs to import and analyse their own pupil performance data against national performance data.  There are four main areas of analysis possible in the PAT:  

    School level analyses, comparing the school’s results in the key stages and optional tests against national comparatives.    

    Pupil level value added, comparing the progress of individual pupils or groups of pupils between key stages with progress nationally taking account of prior attainment and other contextual factors.  

    Target setting, assisting the school to set targets for individual pupils in the light of projections based on progress by similar pupils in the best performing schools with a similar baseline.  

    Question level analysis, allowing schools to analyse, by question, attainment target and topic, how their pupils performed in the national curriculum tests and optional tests compared to performance nationally.   

    PAT produces a range of reports from the four areas of analysis outlined above, including the new Schools Improvement Summary Report.  

    The PAT contains national data, previously known as the Autumn Package. This national data is also available on the DfES website.  

  8. RAISEonline

    RAISEonline (reporting and analysis for improvement through school evaluation) is a web based system to distribute school performance data and is replacing PANDA (performance and assessment) and PAT (pupil achievement tracker) data in 2007. Schools will be able to drill down to individual pupil level as well as compare with local information. The system currently holds data for all key stages from 2005 and 2006 which has been validated (checked) by schools.  

    RAISEonline will provide the opportunity to look at contextual value added (CVA) progress to Key Stage 2, 3 and 4. There will also be a target-setting section of the system which will enable the setting and moderating of pupil targets.  

    My information is that although RAISEonline is planned for special schools it is recognised that the small number of pupils in many of these schools will make comparisons difficult.  

  9. COMMENTS FROM HEADS OF SCHOOLS AND SERVICES

     

    1. Head of Service

      They use government SAT scores and national examination data, and supplement this with their own annual/bi-annual assessments of language and literacy. P levels are used with children and young people with deafness/hearing impairment and additional difficulties.   

      They collect SAT/GCSE data every year on every child/young person at each key stage, and have this going back several years. Therefore, they are able to identify levels in their own area for deaf/hearing impaired children and young people (with no additional needs) at different key stages; and to identify patterns and trends. They compare individual pupil results with the school’s average, the local authority’s average, and the national average.   

      They can relate findings to the effect of individual schools – for example some pupils who do not achieve expected outcomes might be in a school with a ‘hostile environment’; but in these cases the deaf/hearing impaired children and young people often do better than their hearing peers and siblings.   

      They can also relate the findings to the effect of different teachers of the deaf who are supporting – the head of service acknowledges this can be seen as a threat.   

      Overall they find that outcomes at key stage 1 are very good, there is a dip at key stage 2, but by key stage 4 results rise again to at least national and local averages.   

      The ‘acid test’ is whether deaf/hearing impaired children and young people (with no additional difficulties) are functioning at national target levels or above.  

    2. Head of Service

      She finds the government data wrong and simplistic.  

      Their service uses P levels for value added; PIPS for a good baseline compared with other children’s levels of deafness and for predictions of SATs; and MidYIS for a good Y7 profile including non-verbal scores and learned skills, good predictions for SATs and GCSE, and excellent final value added. But this is not cheap.  

    3. Head of Service

      At the moment their service is not involved with the government data. Two officers in the LA are working on the data across all SEN to see how it can be used, before passing it on to the heads of service.

    4. Head of Special School for Deaf Pupils

      The head was delighted when the school was reported to be in the top 5% of schools in the country on VA measures. But previously it was in the bottom 40%!  

      A number of factors influence the VA score for the school – essentially related to the nature and size of each KS cohort:

      1. A significant number of pupils come to the school having struggled in mainstream schools and with no KS1 SAT scores – therefore the progress they make by the end of KS2 is relatively rapid.
      2. Signing children might have a fairly good KS1 assessment, but with the greater English language demands in the KS2 assessment they score poorly in the KS2 tests – they cannot demonstrate what they are able to do. The linguistic demands of the tests are a problem.
      3. Cohort sizes vary year by year. This year there are only five pupils leaving at the end of year 6.
       

      In summary this headteacher thinks the government’s VA scores are of limited or perhaps no use to the school. Rather, they use their own assessments (e.g. reading, BPVS, TROG) and plot progress for each individual pupil across the years. The head wonders whether it would be useful for schools and services to pool their data to create norms for the deaf child population.

    5. Head of Special School for Deaf Pupils

      He thinks the government data is of no use at all. It is impossible for the school to benchmark with other special schools because the populations are so different. Also, the school’s population varies so much from year to year.  

      He says the government’s P-level data is of some use but the scales have been devised mainly for children with learning difficulties and therefore are not so relevant to deaf pupils.  

      The school ‘ploughs its own furrow’ and for the last several years has set its own individual pupil targets for every subject.

    6. Head of Primary School with a resource base

      His school has a large number of pupils with special needs, including learning difficulties, visual impairment and hearing impairment.  

      The head is struggling to make sense of measured progress from government data on CVA. One problem is that CVA appears to include a ‘very crude measure’ of SEN.  

      ‘Expectations of growth’ are based on the whole population of school pupils – but he thinks this is unreasonable for some children with special needs. For example, deaf children struggle increasingly as the curriculum becomes more abstract and the vocabulary becomes more technical.  

      He thinks the government’s VA measures are ‘inadequate’.

  10. OTHER

    A colleague in the SEBD field has been working with others in looking at the ECM outcomes framework to see to what extent it can be used to 'measure' progression and whether it could be used in addition to CVA measures to aid school improvement. They feel they have made significant strides in this, and HMI is looking at some of their work to see what, if any, of it can be used across SEN areas.

  11. SUMMARY

  1. There continues to be very limited useful data on the educational attainments of deaf pupils.

  2. Government Achievement and Attainment Tables report school level data on special schools for deaf pupils, which provides aggregate data for this population. However, these tables tell us nothing about the attainments of deaf pupils in mainstream schools.

  3. There is research data on the GCSE results of deaf pupils in mainstream schools in 1995 and 1996.

  4. The BATOD Survey has collected data on attainment but has not reported this because of low return rates. However, BATOD is reviewing its approach and hopes to collect attainment data in the future against a range of well known influencing factors for deaf pupils.

  5. Government PLASC data, collected since 2002, on the Key Stage 2 and Key Stage attainments of deaf pupils has been reported through Hansard, but unofficial reports question the accuracy of this information.

  6. PLASC data has not been reported elsewhere. In theory, it should be able to provide aggregate data on end of Key Stage attainments of pupils with ‘hearing impairment’ as their primary type of special educational need – although the data seems difficult to obtain. However, one problem is that this data does not differentiate pupils with different levels of hearing loss (or by other important factors).

  7. Government data based on Contextual Value Added measures is now provided to schools and local authorities through RAISEonline (having replaced PANDA and PAT). However, there is no evidence that educators of deaf children are finding this data useful in self evaluation or in setting individual pupil targets. Key problems concern small cohort sizes and deaf pupils not matching the wider school population in relative rates of progress at different key stages.

  8. Schools and services use their own data for self evaluation and to set pupil targets.

  9. CVA data omits the effect of key factors for deaf pupils, e.g. age at onset, additional needs, hearing status of the parents. Also, a key problem is the uncertainty over which pupils are included in the category of ‘hearing impairment as the primary special need’. Nevertheless, the CVA data might still be useful to educators of deaf children, and this should be explored. The data will probably be more useful at pupil level rather than school level given that (i) most deaf pupils are in mainstream schools and (ii) even special schools have small cohorts at each key stage.

  10. Government VA measures refer only to a narrow range of academic subjects. Some of the key outcomes for deaf pupils are unreported – for example, outcomes concerning communication and language competences, and personal and social aspects of development.

References

Audit Commission (2007) Out of Authority Placements for Special Educational Needs. www.audit-commission.gov.uk
DfEE (Department for Education and Employment) (1998) Value Added Pilot. www.dfee.gov.uk/performance/vap-98.htm
Ofsted (2006) Inclusion: Does It Matter Where Pupils Are Taught? www.ofsted.gov.uk
SCAA (School Curriculum and Assessment Authority) (1997) The Value Added National Project: Report to the Secretary of State. London: SCAA
Thomas, S. and Mortimore, P. (1995) Comparison of Value Added Models for Secondary School Effectiveness. Paper presented at the annual conference of the British Educational Research Association, Bath, 14-17 September 1995