The Document Summary Charts!

The Document Summary Service, School of Education

Blog post by Helen Aberdeen, Director of the Document Summary Service & Bristol Guide, School of Education, University of Bristol.


Now that we are well into 2020 and spring is in our sights, it is a good time not just to look forward, but also to take stock of the year just gone.

One of the things which I like to do as Director of the Document Summary Service here at the University of Bristol is to take a look at the ‘Top Ten’ – i.e. the summaries which had the most downloads from our subscribers over the last year. If you are a subscriber, it may be interesting to see if your interests align with those of the subscriber community as a whole. If you are not yet a subscriber, I hope that this overview will give you a flavour of the wide-ranging scope of the summaries and tempt you to join us in 2020.

Ofsted tops the charts with 3 reports hitting the Top Ten. In January 2019, the new education inspection framework was published, heralding a bright(?) new era of inspections. The underlying principle of the new framework was that inspection should reflect how schools operate on a daily basis, not just on inspection day. Key changes included new judgements on personal development and on behaviour and attitudes. The major change, however, was a broadening of the judgement on quality of teaching, learning and assessment to specifically include curriculum quality.

Inspectors will now look at 3 distinct aspects of curriculum: intent (what schools want for their children); implementation (what is taught and assessed to fulfil the intent); and impact (what are the outcomes for children, and where do they progress to?). I am sure that the intent-implementation-impact mantra will grace many staffrooms this year!

The theme of curriculum was further explored in an Ofsted report ‘Assessing the quality of education through curriculum intent, implementation and impact’, which was published in December 2018 and made it into the January 2019 summaries. This report detailed Ofsted’s research into the best ways of assessing curriculum quality – a fascinating read. A further Ofsted Top Ten report, ‘Inspecting education quality: lesson observation and workbook scrutiny’, was published in June 2019 and included in the July summaries. This report looked at work undertaken by Ofsted to establish the best ways of observing lessons and scrutinising pupils’ work.

Two reports related to pupil disadvantage made it into the Top Ten. The first of these was an insightful DfE report ‘Research to understand successful approaches to supporting the most academically able disadvantaged pupils’, which did what it said on the tin. The second was a Sutton Trust report ‘School funding and Pupil Premium’, which reported on research into how schools use Pupil Premium funding and on what they base their decisions.

Another key area of subscriber interest is wellbeing and mental health. There has been an increasing number of reports covering this area, 3 of which made it into the Top Ten. The ‘Children’s Mental Health Briefing’ was published by the Children’s Commissioner at the end of 2018. It gave a comprehensive overview of the state of community Child and Adolescent Mental Health Services (CAMHS) across England – somewhat depressing albeit highly informative. The ‘Impact of Social Media and Screen Use on Young People’s Health’, published in January by the House of Commons Science and Technology Committee, was another Top Ten read for subscribers. I wonder if any subscribers downloaded the accompanying teachers’ pack which was mentioned in the report.

A third report in the Top Ten which I include under the mental health theme was ‘Understanding Mathematics Anxiety’, published by the Nuffield Foundation and the University of Cambridge. An in-depth look at the anxiety, apprehension, tension or discomfort felt by some individuals when confronted with maths – a must-read for maths teachers!

Teacher workload is, unsurprisingly, an area of considerable interest for those of you at the coal face. The DfE’s ‘Teacher Workload Survey 2019’, published in September, was in 7th place in the summary charts. There were some good tidings in this report, with teachers reporting spending less time on planning, marking and inputting data!

Another area of considerable relevance to those at the coal face is pupil behaviour. The EEF’s Guidance report on ‘Improving Behaviour in Schools’, with its 6 key recommendations, was a popular read, coming in 4th place in the download charts.

As another year begins, I am currently engaged in tracking down summaries for the first batch of the new decade! We do aim to be responsive and, as I sit alone in an office writing these summaries, feedback from subscribers is always very welcome, as are suggestions about reports which have come out and which you think may be of interest.

Do follow us on Twitter in 2020 @BristolUniDocs – and look out for exciting changes to the service coming soon!

Living by the evidence

Professor Harvey Goldstein, School of Education, University of Bristol

In public discourse it has become common to claim that a programme or policy is “evidence informed”. Indeed, it is often felt sufficient merely to state that a particular decision is “evidence informed”, rather than describing the nature and quality of the underlying evidence.

The move to base public policy decisions on the best scientific evidence is certainly welcome and has been inspired and developed by initiatives such as the Cochrane Collaboration in medicine and the Campbell Collaboration in the social sciences, which rely upon systematic reviews of research. In this brief article I would like to explore the contemporary scene and how evidence is used or misused.

I will start by looking at what counts as evidence, followed by a discussion of how evidence should be presented, and examples of attempts to moderate ways in which evidence is used in public debate. Finally, I will look at how evidence about school performance has been used, and what lessons we might take from this.

But before considering the nature of evidence, it is worth saying that public policy decisions are, and should be, influenced by considerations beyond the research evidence, such as priorities, feasibility, acceptability, ethics and so forth: all of these will involve judgements, typically subjective ones.

Types, quality and uses of evidence

Evidence that can reasonably be termed “objective” can usefully be divided into two kinds. First, and most importantly, are inferences about causal or predictive relationships, usually across time, relating actions or social and other circumstances to later outcomes.

A second kind of evidence is useful, although secondary, and is concerned with the provenance of evidence: who has provided it, who has sponsored it, and what vested interests might be involved. Thus, there is legitimate concern about the funding of climate change research by oil companies, and for some time medical researchers studying the effects of smoking have refused to accept funding from the tobacco industry.

In addition to provenance, the general quality of research is typically supported by peer review. The best journals will obtain at least two independent judgements on any paper submitted and, while not foolproof, this is perhaps the most satisfactory method currently available for weeding out unsatisfactory work.

When evaluating evidence quality, uncertainty should be taken into account and well communicated. A quantitative analysis of any dataset will always reflect uncertainty, due to sampling variability, choice of technique and so forth. Those presenting evidence should do so in such a way as to allow an informed debate about how it can be used. Publicly accountable bodies in particular should be required to be transparent about the uncertainties and alternative explanations that might be involved.

Finally, those people further disseminating evidence – such as journalists and policymakers – need to resist the temptation to “cherry pick” the results they like, and this leads on to the issue of how to ensure that those using evidence do so in a responsible fashion.

Moderating the debate

Sites such as Full Fact do a good job with limited resources in holding evidence providers and media to account, but they typically fail to delve in depth, and one consequence is that journalists – with their own limited resources – may simply use and quote the assessments provided by fact-checking sites, rather than seeing such sites as a first step to following up in more detail.

Journalists may look elsewhere, of course, including to the UK Statistics Authority (UKSA), the statutory body overseeing UK national statistics, which broadly does a good job of highlighting the misuse of statistics in public debate. But the UKSA’s resources are also limited and, like Full Fact and others, it does not generally explore issues in depth.

Another organisation that comments, criticises and advises on the use of evidence in public life is the Royal Statistical Society. It too has limited resources and must largely rely on voluntary input from its members, yet it does a great deal of insightful work, its real strength being that this work is informed by expert opinion.

Impact and access

One concern relates to current UK government policy on the evaluation of university research – the Research Excellence Framework – which explicitly encourages researchers to effectively ignore best practice when describing their research “impact”, in favour of promoting their own research as a major driver of policy change, or even of change in a distal outcome (such as poverty alleviation). In general, such attempts to infer ‘causality’ are a pretty silly thing to do and could ultimately lead to a severe distortion of research and the ethics surrounding it, as well as forcing a concentration on short-term rather than long-term objectives.

A second concern is to do with the evolving economics of scientific publishing. The role of commercial publishers of books and journals has always been important. The most recent development in this field is so-called “open access publishing”, whereby the cost of accessing a research paper – which has traditionally fallen on the reader, through access to an academic library or otherwise – is now being shifted to the writer of the paper, who might be expected to pay up to £2,000 to a journal so that the work can be freely downloaded by anybody. I do not have space to go into all the details, but it should be fairly clear that, under this model of publishing, those with the financial resources to pay so-called “article processing charges” are more likely to be those whose research gets read. The social, cultural and scientific implications of this are likely to be extensive.

We have also recently seen the steady growth of “middleperson” organisations that will publicise scientific work to the public for free – but charge the researcher up to £2,000 per paper – or, alternatively, may offer to distribute a “popular” version provided by the researcher to paying subscribers.

Both of these examples are likely to change the balance of evidence that gets used, yet neither has been discussed in open debate.

How to use evidence sensibly

So, we have evidence. Now, how do we use it? I have spent much of my career arguing about league tables, especially school ones, so I’ll end on this topic. Over the last 30 years, some of us have had some success in conveying notions of statistical uncertainty (interval estimates for ranks) and the need to make adjustments for the differences in pupil intake between schools. These constraints have influenced policymakers to the extent that they are reflected in the tables they provide. But they have done little to moderate the enthusiasm of the media, who are generally unwilling to forsake the idea that what matters – or, perhaps more cynically, what sells newspapers and website subscriptions – is a simple ranking of “best” to “worst” schools, without any concern about uncertainty or even the need for statistical adjustment for intake.
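
To make “interval estimates for ranks” concrete, here is a minimal sketch in Python. Everything in it is invented for illustration – these are not real schools, and this is not the method behind any official tables.

```python
# Illustrative sketch only: the effects and standard errors are invented.
# Given each school's estimated effect and its standard error, simulate
# many plausible sets of "true" effects and report a 95% interval for
# each school's rank (rank 1 = highest effect).
import numpy as np

rng = np.random.default_rng(0)

effects = np.array([0.30, 0.10, 0.05, -0.02, -0.25])  # estimated school effects
ses     = np.array([0.15, 0.05, 0.20, 0.10, 0.08])    # their standard errors

draws = rng.normal(effects, ses, size=(10_000, len(effects)))
ranks = (-draws).argsort(axis=1).argsort(axis=1) + 1  # 1 = best in each draw

low, high = np.percentile(ranks, [2.5, 97.5], axis=0)
for i, (lo, hi) in enumerate(zip(low, high), start=1):
    print(f"school {i}: point rank {i}, 95% rank interval [{lo:.0f}, {hi:.0f}]")
```

Intervals produced this way typically overlap heavily, which is exactly why a single “best to worst” ordering overstates what the data can support.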

The problem of school league tables illustrates several important points. It shows how certain kinds of evidence can be harmful if collected and then displayed in public. It shows, as in the case of university research impact assessments, that individual actors can game the system, so changing what it is intended to measure. It shows how a government can claim to be providing useful information, without any real attempt to stimulate a public debate about its usefulness. And it shows how mass media will embrace the most simplistic interpretations of the data without any encouragement to dig deeper.

To be clear: I am not advocating that we drop the idea of publicly accountable systems, rather that we move away from naïve and misleading presentations of evidence, and towards a more rational approach. In other words, league tables – for schools or other institutions – should function as one piece of evidence: as a screening device that may be able to point to concerns that could be followed up. But any ranking, in itself, is not suitable for use as a diagnostic tool to pass definitive judgement. (See “Rethinking school accountability” for a suggestion on how we might progress in education.)

Final thoughts

So where does this leave us? I have little doubt that, ultimately, real evidence can win out if the issue is serious enough. For example, as we see with climate change evidence, it will be ignored for as long as possible by vested interests and those policymakers who rely upon such vested interests, until its implications really can no longer be ignored. Hopefully this will not be too late for useful action.

The important thing for researchers is not to give up. The research and the publicising of the implications of that research, along with public critiques of evidence abuse or suppression, need to continue. All of this is difficult, but I think there is an ethical imperative to try to do it.

And I hope to be involved in doing just that.

 

This is a shortened version of a longer paper that can be accessed at:

https://harveygoldstein.co.uk/2019/12/15/the-role-of-evidencein-public-policy-making/

 

School accountability by Progress 8: A critique of Conservative and Labour proposals

The Conservatives and Labour hold different views on the future of England’s system of school accountability by Progress 8. However, both parties’ thinking is at odds with the research evidence.

Over the last 30 years, successive governments have held secondary schools to account for their GCSE results via national school performance tables (for a review see Leckie & Goldstein, 2017). In 2016 the Conservatives replaced their longstanding ‘5A*–C’ school performance measure – the percentage of pupils with five or more GCSEs at grade C or higher – with Progress 8, a ‘value-added’ measure of the average pupil progress made between key stage 2 SATs and GCSE. This long-argued-for shift in measuring school performance reflects the fact that simple school differences in GCSE results say more about differences in the types of pupil taught in different schools than about differences in the effectiveness of the education provided by those schools.
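
To give a feel for the value-added idea, here is a deliberately simplified sketch with invented data – not the official Progress 8 methodology, which works with Attainment 8 point scores and prior-attainment groupings. The principle is to regress the GCSE outcome on prior attainment and score each school by the average residual of its pupils.

```python
# Simplified sketch of a value-added measure; all data are invented and
# this is not the official Progress 8 calculation.
import numpy as np

rng = np.random.default_rng(1)
n, n_schools = 1_000, 20

school = rng.integers(0, n_schools, size=n)     # which school each pupil attends
ks2    = rng.normal(100, 15, size=n)            # key stage 2 prior attainment
gcse   = 0.8 * ks2 + rng.normal(0, 10, size=n)  # GCSE outcome

# Expected GCSE score given KS2, from a simple linear regression.
slope, intercept = np.polyfit(ks2, gcse, deg=1)
progress = gcse - (intercept + slope * ks2)     # pupil-level progress (residual)

# A school's value-added score is its pupils' mean progress.
value_added = np.array([progress[school == s].mean() for s in range(n_schools)])
print(np.round(value_added, 2))
```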

The Conservatives, however, have repeatedly stated their opposition to making any further adjustments for pupil demographic and socioeconomic characteristics, despite a substantial research literature providing evidence in favour of such adjustments. In recent research we argued that Progress 8 is fundamentally biased against schools teaching poorer and other educationally disadvantaged intakes (Leckie & Goldstein, 2019). We showed how further adjusting Progress 8 for just seven pupil background characteristics (age, gender, ethnicity, language, special educational needs, free school meals take-up and deprivation) leads the national rankings of one-fifth of schools to change by over 500 places. We also showed that over 40 per cent of schools would move out of the government’s ‘underperforming schools’ category. We encourage readers to explore the Northern Powerhouse Partnership’s interactive map of this research, which shows how the Progress 8 score of every school in the country changes after these adjustments.
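
To see mechanically how such adjustments can reorder schools, the sketch below extends the toy example above (again with invented data; a single free school meals indicator stands in for the seven characteristics).

```python
# Illustrative sketch only (invented data): schools are given different
# free school meals (FSM) intake rates, and FSM pupils score lower on
# average, so unadjusted value-added penalises high-FSM schools.
import numpy as np

rng = np.random.default_rng(2)
n, n_schools = 2_000, 20

school   = rng.integers(0, n_schools, size=n)
fsm_rate = rng.uniform(0.05, 0.50, size=n_schools)  # FSM intake varies by school
fsm      = rng.binomial(1, fsm_rate[school])
ks2      = rng.normal(100, 15, size=n)
gcse     = 0.8 * ks2 - 4.0 * fsm + rng.normal(0, 10, size=n)

def school_scores(covariates):
    """Mean per-school residual from a least-squares fit of GCSE on covariates."""
    X = np.column_stack([np.ones(n)] + covariates)
    beta, *_ = np.linalg.lstsq(X, gcse, rcond=None)
    resid = gcse - X @ beta
    return np.array([resid[school == s].mean() for s in range(n_schools)])

unadjusted = school_scores([ks2])       # adjust for prior attainment only
adjusted   = school_scores([ks2, fsm])  # adjust for prior attainment and FSM

rank = lambda scores: (-scores).argsort().argsort() + 1
print("rank shifts:", rank(adjusted) - rank(unadjusted))
```

In this toy setup, schools with high FSM intakes tend to climb the rankings once the adjustment is made – a miniature version of the rank changes reported in the research.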

Labour, in contrast, would appear more sympathetic to contextualising schools’ Progress 8 scores, especially given that the last Labour government published ‘contextual value added’, a measure that did adjust for both SATs and pupil background. Thus, from the perspective of school accountability by Progress 8, Labour’s April 2019 announcement that it would scrap the key stage 2 SATs seems strange. Scrapping the SATs, while potentially addressing many concerns around their deleterious effects on schools, teachers and pupils, would also remove the most important adjustment made by Progress 8 – namely, the adjustment for prior attainment. It would in effect force a reversion to judging schools by average GCSE results. In our latest research (Leckie, Prior, & Goldstein, 2019), we take up this idea and show that even if Labour were to adjust GCSE scores for pupil demographic and socioeconomic characteristics, this would be a poor alternative even relative to simply adjusting GCSE results for the SATs, let alone the preferred approach of adjusting for SATs and pupil background. Put simply, Labour’s proposal to scrap SATs could leave us with a worse school-accountability-by-results system than that of the Conservatives.

Of course, the above arguments assume that both parties’ attempts, in government over the last 30 years, to judge schools off the back of school performance table results will simply continue unaltered. This excessive reliance on these tables has raised their stakes to such an extent that there are now many well-documented perverse behaviours associated with them, and the GCSEs and SATs that underlie them.

What is needed instead is a more radical rethink of our school accountability system. While the Conservatives appear to be happy sticking with the status quo, Labour appears more keen to entertain such a move but has thus far been light on detail. The research described here suggests it would be wise for Labour to make firmer plans for any new accountability system before disassembling the current one, since in the absence of any further reform, simply scrapping the SATs will result in a worse school accountability system.

 


This blog is based on the article ‘The implications of Labour’s plan to scrap Key Stage 2 tests for Progress 8 and secondary school accountability in England’ by George Leckie, Lucy Prior and Harvey Goldstein, which is available on arXiv.


References

Leckie, G. & Goldstein, H. (2017). The evolution of school league tables in England 1992–2016: ‘Contextual value‐added’, ‘expected progress’ and ‘progress 8’. British Educational Research Journal, 43(2), 193–212.

Leckie, G. & Goldstein, H. (2019). The importance of adjusting for pupil background in school value-added models: A study of Progress 8 and school accountability in England. British Educational Research Journal, 45(3), 518–537.

Leckie, G., Prior, L., & Goldstein, H. (2019). The implications of Labour’s plan to scrap Key Stage 2 tests for Progress 8 and secondary school accountability in England. arXiv:1911.06884 [stat.AP].

Welcome to the new School of Education blog!

Author: Professor Bruce Macfarlane, Head of School, School of Education, University of Bristol


This will be a place where we can share our ideas about education and the work we are all engaged in: from teaching students and carrying out research to connecting with our community partners. Not only that, we will be exploring how we help our students through their academic journey with the School of Education, discussing exciting innovations in seminars, student open days and events, guiding and shaping future teachers and educationalists, and hearing from our dedicated and inspirational students themselves.

We have been involved in teacher education for over 100 years and have a long-standing commitment to developing interdisciplinary understandings of education for social justice.

This is not just about academic research. It also involves a commitment to civic engagement and to trying to widen the participation of previously under-represented groups within the School and University. Another of our commitments is to provide an excellent student experience at all levels. As a School we have recently experienced a period of rapid growth, especially at Masters level, and have seen the proportion of international students rising on our Masters in Education.

As a School within a highly ranked University we need to make sure that our collective standing as a research community is as high as possible. This year the School, along with others in the sector, will submit its return for the periodic Research Excellence Framework. Previously we have been ranked 5th in the UK and within the top 50 internationally. It is important for the School, and our students, that we maintain this high standing. We are further seeking to deepen our internationalisation in terms of high-quality partnerships with other Schools and Faculties of Education around the world, and to strengthen global perspectives within the curriculum.

Finally, we are aware that our student population is changing and we need to ensure that we adapt the curriculum to reflect this trend too. This is partly about the need to decolonise the curriculum: to ensure that it better reflects knowledge and cultures from around the globe, and to reflect on the extent to which it has previously been dominated by largely Western and colonial perspectives.

All of this makes sense, I hope, both for our local students who will need to work in an ever more inter-connected world (despite Brexit!) and for our international students who bring an incredibly rich diversity of knowledge, traditions and cultures with them to study at Bristol.

The School of Education has a mission to do all of these things. It’s a tall order, I know, but then a vision needs to be ambitious, I think. I hope this is a vision we are all striving as a community to achieve.

 Follow @SOEBristol on Twitter
