Methods in Chemistry Education Research 2019, Edinburgh
The fourth MICER meeting has just happened, and I’ve been fortunate to attend three of the four, as well as some of the earlier one-day meetings on getting started in pedagogical research. I always come away from these meetings feeling a little more ‘on track’ with my research, and while I’m often familiar with some of the methods being described, I always see a new dimension I hadn’t considered.
If you’re interested in different perspectives, may I recommend Michael O’Neill’s blog reflecting on MICER. Michael was another on my ‘I tweet and would like to meet’ list, along with Maria, and it was really great to meet them both at this meeting.
A highlight for me was Maria Gallardo-Williams’s description of student-produced videos for laboratory skills. Laboratory work isn’t my thing, but what struck me was the lovely way the work was organised across time: shared ‘how to’ guides, and collaborations back and forth between different participants. I felt that approach would really solve some of the challenges of running a chemistry education research programme predominantly with third-year project students. I also very much enjoyed Sam Pazicni’s talk on the use of surveys, and I’ll hold my hand up and say that I stand corrected on a few aspects of good survey design.
One talk I really struggled with was Nicole Graulich’s, on her work evaluating student reasoning. My principal struggle was with the context of the work: organic mechanisms. Those are not my thing at all, and I found it hard to apply the methodology to the example given because I kept getting sidetracked figuring out what was meant. The work was impressive, however, and I could see how it could be a new direction for my misconceptions/diagnostic test work. I suspect that investigating how students reason when solving problems across any branch of chemistry would lead to a far deeper understanding of misconceptions, particularly between levels of study in higher education.
Aishling Flaherty reminded us all to check our assumptions when carrying out work. She noted that, looking back at her PhD work, she could see where she had made assumptions that she now wouldn’t make, or would at least now recognise. That’s both remarkably honest and, I suspect, totally normal. Part of the point of being an academic is the notion that your PhD was good, but that you could do better if you did it again.
So I’m off to work out some shared resources for my project students so that I can run my projects across years rather than reinventing the wheel (thanks, Maria!), and I’ll be updating my ‘how to write a questionnaire’ notes for my project students (thanks, Sam!). I need to find a way to help my project students understand that they all make assumptions, and how to identify them and work out the impact they have. And then I need to sit down and tackle Graulich’s paper on student reasoning. Stats and discourse analysis, and publishing it, will wait for another day. I ran two 2-hour sessions on chem ed research methods for my project students this year, and I’m already planning what we’ll cover next year. We did content analysis with sweets (Barry Ryan, MICER 18) and an overview of techniques, but I think this year we’ll add in a few more exercises on assumptions.