Classroom computers kipper-slapped in OECD report

14 Sep 2015

This week the OECD published a report, Students, Computers and Learning: Making the Connection, which will be a slap in the face with a wet kipper* to many a technology evangelist. Based on a large survey and standardised test dataset of 15-year-olds across many countries, it concluded that “the reality in our schools lags considerably behind the promise of technology”, that “even where computers are used in the classroom, their impact on student performance is mixed at best”, and that “technology is of little help in bridging the skills divide between advantaged and disadvantaged students.” What exactly did the report find, and what are the implications? (And no, I don’t really think you should feed your school iPad collection to the fishes.)

What sort of research is this?

This is an analysis of survey data and standardised maths and reading tests collected in the OECD Programme for International Student Assessment (PISA) in 2012. Data are available from 510,000 fifteen-year-olds (representative of 28 million students across OECD countries) from 64 countries. Each country submitted data from a minimum of 150 schools. Studies of this sort are great for describing the generalities and providing basic facts about performance in standardised tests, technology access and ownership. However, as there are many factors which are not controlled for (it is not designed as an experiment in this sense), it is not possible to draw conclusions about cause and effect. The study might suggest patterns of behaviour or relationships between variables, but it will not be able to explain why these patterns exist.

What are the main findings?

Access at home and at school

  • On average, 96% of the school students surveyed in the OECD countries had access to a computer at home. The UK is slightly higher than average, with 98.8%. Children across OECD countries spend on average 104 minutes using the Internet at home on weekdays, and 138 minutes on weekend days. In contrast, at school they spend 25 minutes using the Internet. (No figures are reported for the UK.)
  • In OECD countries, on average, there are 4.7 students per computer in school. There are more computers per head in UK schools (1.4 students per computer). In OECD countries, 72% of the students have access to computers at school. (UK information is not available on this measure.)

Lack of technology effectiveness in the classroom

  • Investment in educational technology infrastructure does not appear to be associated with better performance in reading, maths or science. Countries that have invested less in introducing computers in school have improved faster, on average, than countries that have invested more. There is a hill-shaped curve in the relationship between technology use and learning as measured by the PISA tests: limited use of technology in school is better than none, but computer usage at school above the OECD average appears to be associated with significantly poorer results. These calculations controlled for GDP and previous performance in PISA. The same pattern is evident for computer-based homework. It is worth noting that there are countries where this pattern does not hold, e.g. Denmark and Norway. As the report does a good job of acknowledging, these are associations rather than statements about cause and effect. We cannot conclude from this that having too many extra computers around is bad for children’s learning.
  • The sorts of activities performed on school computers make a difference: drill/practice software and chatting online are associated with lower performance, while browsing the Internet or using email once or twice a week is more beneficial.
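The hill-shaped association described above is easy to picture with a toy simulation. The numbers below are entirely made up for illustration (they are not the PISA figures): scores are generated from an inverted-U function of computer use plus noise, and the group means reproduce the pattern of limited use beating none, with heavy use faring worst.

```python
import random
import statistics

random.seed(42)

def simulated_score(hours_online, rng=random):
    """Toy model of the hill-shaped pattern: scores peak at moderate
    computer use and fall off at heavy use. Synthetic numbers only --
    not PISA data."""
    # Inverted-U with a peak around 1.5 hours of school computer use
    true_effect = 500 + 40 * hours_online - 13 * hours_online ** 2
    return true_effect + rng.gauss(0, 10)

# Simulate students in three (hypothetical) usage bands
bands = {"none": 0.0, "moderate": 1.5, "heavy": 4.0}
means = {
    band: statistics.mean(simulated_score(h) for _ in range(2000))
    for band, h in bands.items()
}

# Limited use beats none, but heavy use shows the worst scores
print({band: round(m, 1) for band, m in means.items()})
```

Of course, a simulation like this only shows what the pattern looks like; it says nothing about why the real pattern exists, which is exactly the report's caveat.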

Digital Divide

  • There is a 13% difference in access to the Internet at home for students from socio-economically disadvantaged backgrounds across OECD countries, whereas it is only 3% in the UK. On average across OECD countries, disadvantaged students spend 7 minutes less using the Internet at weekends (presumably they are able to access the Internet outwith the home at weekends, or else the aggregate data is masking different patterns of behaviour within countries).
  • The purpose of using the Internet at home differs for socio-economically disadvantaged students: 18% fewer disadvantaged students use the Internet for finding practical information, while there is only a 0.5% difference for game playing. (No figures are available for the UK.)
  • The report draws the conclusion that, in order to reduce digital inequalities, schools should focus on reducing inequalities in literacy and numeracy. Access to technology will not by itself ensure that children have the skills to use it effectively for learning.

Association between excessive internet use and negative behaviour/experiences

  • Students who reported spending more than 6 hours per weekday using the Internet (outside school time) are more likely to report feeling lonely at school, and are more likely to be late for school or absent. It is worth pointing out here that these are associations, not causations: we don’t know whether students are lonely because they use the Internet, whether they use the Internet so much because they feel lonely, or whether there is some common underlying factor.
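The "common underlying factor" possibility is worth dwelling on, and a small simulation makes it concrete. In this sketch (all numbers invented, not from the survey), a hypothetical wellbeing variable drives both heavy Internet use and loneliness; neither causes the other, yet heavy users still report loneliness more often.

```python
import random
import statistics

random.seed(0)

def simulate_student(rng=random):
    # Hypothetical confounder: low wellbeing drives BOTH heavy
    # internet use AND loneliness; there is no causal link between
    # the two outcomes in this model.
    wellbeing = rng.gauss(0, 1)
    hours_online = max(0.0, 5 - 1.5 * wellbeing + rng.gauss(0, 1))
    lonely = wellbeing + rng.gauss(0, 1) < -0.5
    return hours_online, lonely

students = [simulate_student() for _ in range(5000)]
heavy = [lonely for hours, lonely in students if hours > 6]
light = [lonely for hours, lonely in students if hours <= 6]

# Heavy users report loneliness more often, despite internet use
# having no causal effect on loneliness here
rate_heavy = statistics.mean(heavy)
rate_light = statistics.mean(light)
print(round(rate_heavy, 2), round(rate_light, 2))
```

The point is not that wellbeing is the real explanation (nobody knows from this data), but that an observational survey cannot distinguish this scenario from the causal ones.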

So what?

It is not entirely surprising that the overall picture for technology in schools is a damp squid. Previous meta-analyses (statistical syntheses of findings across many quantitative studies) indicate that computers in the classroom have only a small to moderate effect on learning. Two recent meta-analyses show that educational technology makes a modest improvement in the learning of mathematics (Cheung & Slavin, 2013) and a small positive improvement in reading in comparison to traditional instruction (Cheung & Slavin, 2012). Hattie’s meta-meta-analysis (yes, there is such a thing!), across about 5000 studies, found only a moderate effect for computer-assisted learning, and it is ranked pretty low down on his list of strategies which make a difference in the classroom.

What does surprise me is the hill-shaped curve finding, where more use of technology was associated with poorer performance. I have not come across a finding like this for technology before (which may be a reflection of my patchy knowledge rather than anything else). It may well be because class time spent using computers to do pointless stuff means less time available to do valuable stuff (face to face or on computer). And money spent on vast banks of iPads (not looking at any council in particular here…) means less money available to be spent on teacher professional learning.

I think we educational technologists need to pause a moment to feel the wet kipper. It’s a good moment to reflect on the fact that technology in the classroom just isn’t as effective as we claim it ought to be. This doesn’t mean we should give up on it, but that we need to be more savvy about how it is used. I also don’t think we should let ourselves off the hook by saying “technology really is effective, but we’re not measuring its effects properly”. Sure, technology may have lots of benefits which were not measured in the PISA tests (such as enabling collaboration), but if we’re deciding how to spend public money, it seems wise to invest in things which will enable gains in basic underlying skills such as literacy, numeracy and science. I would argue it could potentially increase higher-order cognitive skills too, but we need to measure that.

This study and Hattie’s meta-analysis tell us that computers in the classroom can be more or less effective depending on the features of the software and how they are used. There are some circumstances in which they can be highly effective, and clearly some countries do this better than others. This can give us hope, and ideas for fruitful directions for the future. Hattie found that benefits are more likely to occur when technology is used: as a supplement to the teacher (not a replacement); collaboratively; when the learner is in control; when teachers have been educated about the use of computers as a pedagogical tool; and when feedback via the technology is optimised. So as well as investing in educational technology hardware (instead?), it seems that our education authorities need to be investing in teacher professional development (and quite possibly higher quality online learning materials). Teachers need support not only in learning to use technology but also in how to use it effectively in the service of learning. As the OECD report put it: “In the end, technology can amplify great teaching, but great technology cannot replace poor teaching”.

References

Main report on which this post is based: OECD. (2015). Students, Computers and Learning: Making the Connection. http://www.oecd.org/education/students-computers-and-learning-978926423…

Cheung, A. C. K., & Slavin, R. E. (2013). The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educational Research Review, 9, 88–113. doi:10.1016/j.edurev.2013.01.001

Cheung, A. C. K., & Slavin, R. E. (2012). How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review, 7(3), 198–215. doi:10.1016/j.edurev.2012.05.002

Hattie, J. (2009). Visible Learning. New York: Routledge.

Technical report on PISA survey: http://www.oecd.org/pisa/pisaproducts/PISA%202012%20Technical%20Report_…

 

* An expression meaning “receive an unpleasant surprise” for those of you unfamiliar with the phrase. Why kippers? No idea.