Evidence, evidence*

This post is part of a series being written for my EDUC6352 online masters students.

[Screenshot from my Twitter feed, 13 March 2017]

The above picture appeared in my Twitter feed and links to the SMH story ‘Homework, technology, smaller classes: what works in NSW classrooms’.

The article describes a particular disadvantaged school in NSW and its use of the MiniLit program to help students struggling with literacy. It segues from this to the broader claim:

Most teachers believe phonics works, and now they have access to the proof.

For the first time, the NSW Department of Education has teamed with a new not-for-profit education group, Evidence for Learning, to pull together more than 10,000 pieces of research from around the world to show teachers what works – and what doesn’t – in the classroom.

Interestingly enough, in the framing of this article, there’s evidence and there’s evidence, and not all research is created equal:

Jenny Donovan is the executive director of the department’s Centre for Education Statistics and Evaluation, which has worked with Evidence for Learning to develop the toolkit. She says it isn’t about “telling teachers where they went wrong”.

“Instead of academic research, this is about connecting directly to the work that is being used in schools,” she says. “What we have done is give teachers access to a really useful resource that will provide tools and data and give them the chance to really drill down into classroom management.

Ok, then.

Leaving aside the snide aside about ‘academic’ research, what is this ‘proof’ that teachers now have access to? Well, Deborah Netolicky provides a great explanation and review of the teachers’ toolkit provided by Evidence for Learning [E4L]. She explains what it is, how it works and her own reservations about this not-for-profit provision of research for teachers:

My hunch is that the E4L toolkit has something to offer educators in Australia (as a starting point rather than an answer sheet), and I can see the significant work that has gone into producing it, as well as the good intentions behind it. Yet I have my reservations. I worry that an uncritical acceptance of the toolkit’s content, alluring in its apparent simplicity, will result in an impoverished understanding of ‘what research says’. We are in danger of giving education research lip service, or wading in shallow pools of evidence. The use of meta-meta-analyses as the basis for the toolkit has the potential to over-synthesise limited quantitative data to the point of distorting original findings, and ignore the limitations, qualities and complexities of the synthesised studies.

The push for evidence-based policy in education is something that should be approached with a critical perspective. I’m not saying that having an evidence base is a bad thing; rather, it’s not as simple as most discussion would suggest. As Deborah notes, education is not like medicine. The complicated contexts of schools make it hard to make causal claims. (Note that the SMH article refers to what works ‘in the classroom’ as though all classrooms are the same.)

My colleague, James Ladwig, has written about the limitations of attempting to create a national evidence base in this piece, ‘National Evidence Base for education policy: a good idea or half-baked plan’:

If Australia wishes to develop a more secure national evidence base for educational policy akin to that found in medicine, it must confront basic realities which most often are ignored and which are inadequately understood in this report:

a) the funding level of educational research is a minuscule fraction of that available to medicine,

b) the range and types of research that inform medical policy extend far beyond anything seen as ‘gold standard’ for education, including epidemiological studies, program evaluations and qualitative studies relevant to most medical practices, and

c) the degree to which educational practices are transportable across national and cultural differences is far less than that confronted by doctors whose basic unit of analysis is the human body.

Both Deborah’s and James’ posts are worth reading in their entirety. Deborah highlights the problems of relying on meta-meta-analyses for a simple metric of ‘what works’, and James’ piece describes why Australia does not yet have the research infrastructure for a truly credible, independent National Evidence Base for educational policy.


*If you have seen the musical Matilda you can really get into the groove of this post by replacing the word ‘discipline’ with the word ‘evidence’, à la the song ‘The Smell of Rebellion’.
