I went to the #Design4Learning conference last week – yes OK, I’m a bit late with the write-up, it’s been a hectic week or two!
The conference was held at the OU, which had the advantage that I could get home in the evening for my son’s birthday, but the disadvantage that I kept popping back into the office to keep on top of things. It reminded me that while I hate the coffee, lunch, and conference dinner opportunities for informal contact (being an introvert by nature), there is something lacking if you don’t participate at all in them.
It was great to spend a couple of days thinking about what we’re developing, and why, rather than how. I really value these opportunities to talk to academics. The focus of the conference was around learning design and learning analytics. I won’t write up all the sessions that I went to, but here are some of the highlights for me…
Sharon Slade presented about the OU’s ethics policy for learning analytics and the questions that students raised. The OU is considered to be the first university with an ethics policy for learning analytics. There is a real challenge in working out how we communicate to students what we’re doing. On one hand, students want a personalised experience, but on the other they don’t want us “snooping”. They’re happier with feedback based on trends than on their personal activity. What is clear is that the results of analytics should focus us on what questions to raise with students, rather than drawing conclusions about what’s going to happen to them.
You can never really know why a student has a pattern of activity, and whether or not they’re likely to fail as a result. Elsewhere in the conference (I forget when) I heard it said that students who post often in the forums and turn in their assignments on time are most likely to pass. Back in the 90s when I was an OU student, I certainly handed in my assignments on time, but I never posted in the forums. I was too shy, too lacking in confidence that I had anything of value to add, too unwilling to expose my own stupidity… I wonder what the analytics would have said about me… But I passed with flying colours. Similarly, my son (who has learning difficulties) always fails tests, rarely hands in his homework, never raises his hand in class… but you only have to talk to him to know that he is learning. Obviously “learning” isn’t good enough for a university which needs students to pass qualifications to prove success, but we live by different rules in my house! Learning analytics done badly might suggest that a uni should give up on my son. Done well, it should suggest that, despite his innate ability, he needs a lot of extra support to pass.
Simon Cross talked about his Open Education MOOC, which formed a block of a formal OU course, where students and the public learned together in the open. Apart from being pleased to see my old project, OpenLearn, being used for this, the thing that most interested me was that students had concerns over what they were paying for with their OU study (worrying perhaps that they didn’t value the assessment, tutor support, etc.), and that they wanted the badge as well as the TMA grade – and I thought badges were a gimmick!
There was a useful learning design tool from Imperial College London called BLENDT. You plug in your learning outcomes and it helps you work out what sorts of (online) activities will help meet them. It is based on Bloom’s taxonomy, where objectives are classified as psychomotor, affective or cognitive skills – users pick words in their objectives such as “explain”, “list”, or “discuss”, and the system works out which of the skill sets these map to, then presents example activities that best suit meeting those objectives. It is customisable based on factors such as group size, to cater for the fact that some activities don’t work in some situations. The tool aims to provide a discussion point for teachers to make final decisions on activity mix. It looked like something that could be very helpful in supporting work to embed learning design in our everyday practice. I wonder whether we could write a similar Moodle module? Or maybe someone already did?
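If we did sketch something like this ourselves, the core verb-to-domain lookup could be surprisingly small. This is purely my own illustration of the idea – the verb lists, domain labels and suggested activities below are made up for the example, not BLENDT’s actual data or API:

```python
# Toy sketch of a BLENDT-style mapper (illustrative data only, not BLENDT's):
# map verbs found in a learning outcome to Bloom-style skill domains,
# then suggest example activity types for each domain hit.

VERB_DOMAINS = {
    "explain": "cognitive",
    "list": "cognitive",
    "discuss": "affective",
    "demonstrate": "psychomotor",
}

ACTIVITY_SUGGESTIONS = {
    "cognitive": ["quiz", "concept map"],
    "affective": ["forum debate", "peer review"],
    "psychomotor": ["lab exercise", "screencast walkthrough"],
}

def suggest_activities(learning_outcome: str) -> list[str]:
    """Return candidate activities for the domains hit by the outcome's verbs."""
    words = learning_outcome.lower().split()
    domains = {VERB_DOMAINS[w] for w in words if w in VERB_DOMAINS}
    suggestions = []
    for domain in sorted(domains):  # stable order for repeatable output
        suggestions.extend(ACTIVITY_SUGGESTIONS[domain])
    return suggestions

print(suggest_activities("Explain and discuss the causes of inflation"))
```

A real Moodle module would obviously need a fuller taxonomy and the group-size filtering mentioned above, but the shape of the problem is just this lookup.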
Finally, one fascinating piece of software from Denise Whitelock at the OU’s Institute for Educational Technology was Open Essayist, which lets students upload draft essays to get feedback on structure. The tool shows you where your key words and phrases sit in the essay, helping you ensure you have a good start, middle and end, a spread of keywords across the essay, and clear connections between concepts. Because it provides feedback on structure only, it should apply across disciplines. It has been shown to improve grades, although obviously structure is only one component of a grade, so you do have to say something useful and relevant and understand the assessment criteria for your subject to get top marks. Denise has also found the tool useful for analysing MOOC comment threads and creating paper abstracts. There was clear demand in the room from OU tutors for us to provide this to all students, especially at level 1. I would have loved something like this when I first learned how to write essays.
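The “where do your keywords fall?” part of that feedback is easy to picture in code. Again, this is just my own toy illustration of the idea, not Open Essayist’s actual method – the essay text and the crude start/middle/end split are my assumptions:

```python
# Toy sketch of keyword-spread feedback (my illustration, not Open Essayist):
# report which third of the essay (start / middle / end) each occurrence
# of a keyword falls in, so a writer can see whether a key concept is
# spread across the whole piece or bunched up in one place.

def keyword_spread(essay: str, keyword: str) -> dict[str, int]:
    """Count keyword occurrences in the start, middle, and end thirds."""
    words = essay.lower().split()
    n = len(words)
    counts = {"start": 0, "middle": 0, "end": 0}
    for i, word in enumerate(words):
        if word.strip(".,;:!?") == keyword.lower():  # ignore trailing punctuation
            if i < n / 3:
                counts["start"] += 1
            elif i < 2 * n / 3:
                counts["middle"] += 1
            else:
                counts["end"] += 1
    return counts

essay = ("Analytics can guide teaching. Good analytics need context. "
         "Without context, numbers mislead. Used carefully, analytics help.")
print(keyword_spread(essay, "analytics"))
```

A count of zero in the “end” bucket, say, is exactly the kind of nudge about structure that the tool gives, without the software having to understand the subject matter at all.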
This blog post feels a bit rambling. I wonder what Open Essayist would make of it! Anyway, I hope I’ve given you a taste of the conference and some of the ways that analytics may be changing the services we offer to students and the way we design learning in the future. I hope my team get a chance to be involved in some of these developments.