Memorable Teaching #Chemedcarnival

When I set the prompt for this carnival, I thought I was picking something fairly straightforward. Now I’m struggling to home in on one particular teaching session that was memorable. There are plenty that I remember, but few I remember for ‘good reasons’. Do I select the Maths for Chemistry lecture on ‘dress like a pirate day’ and talk about the student who leapt to his feet (dressed like a pirate) yelling ‘aharrrrr me hearties’ every time the number four came up? Do I talk about the reasons I don’t give out post-it notes in class (hint: they were kidnapped and later found on my door with various things written on them)? Do I talk about the class that, for one reason or another, became a discussion about the right of women to wear whatever the hell they want without ‘deserving’ to be heckled in any way? Or perhaps the class that ended in a conversation about consent, drunkenness and unconsciousness? Sticking to actual content delivery, perhaps I should pick a class where I watched the students get it, or one where what I did seemed especially effective. Then again, maybe I should think back to my time as a student and consider the lecturers with an enormous capacity for chalk’n’talk (14 sides of derivations), the first lecture notes shared on the web (printed in 48 point font, being the versions used to print the OHP slides), being made to research and present content each week, or the coming plague of PowerPoint and the torture thereby.

It seems that teaching sessions are much like holidays – many memories for many different reasons. My personal conviction is that any teaching session you walk away from thinking ‘that wasn’t actually too bad’ is a good one.

I teach a fair range of courses and the teaching materials mostly evolve to some degree each year. My current thing is mini-PBL: we start with a lecturey lecture (traditional-style content delivery), then in the next session have a brief intro to a problem, a set of questions to structure solving it, and some resources (slides, reading, or bring-your-own-device research). A period of class time is allocated to working the problem through, then a whole-cohort discussion to get some answers recorded. Mini-PBL because I will not send the class away with homework at times of the semester when they are overloaded with coursework – my own research points out very strongly that piling more on at those times is not an effective or sustainable (or reasonable) approach. It is our job as educators to wring the most out of contact time. Some of those courses are memorable by virtue of being very dry and tedious, some are a lot of fun. Generally it is the subject matter that governs this rather than the teaching method. I’ve yet to find a teaching method that works for the driest content other than ‘suck it up folks’.

I do have a favourite course, and it’s one that I’ve taught the longest and I think it’s probably my best teaching. It’s memorable because I can predict where and when the students will go wrong, how they will go wrong, and the fewest steps to get things back on track. Sometimes I see a new error or misconception, but I’ve got a list of the major ones these days. The course also overlaps with a research project into diagnostic tests so there’s a nice synergy there. Perhaps that’s why I picked that topic for the diagnostic test rather than the dry and tedious course.



Chemistry Education Blog Carnival #chemedcarnival

I miss blog carnivals. For those who weren’t lurking in the chemistry blogosphere back in the mid/late noughties, a blog carnival is where a bunch of bloggers write posts on a specific theme proposed by the host. On carnival day, the host publishes a blogpost with links to all the posts and encourages traffic to the participating blogs.

So I propose that here, on A Chemical Unconformity, there will be a chemistry education blog carnival on February 28th. The ‘chemistry’ in the title isn’t there to exclude people who don’t identify as chemists – it’s open to all who wish to respond to the topic. It’s also open to teachers of chemistry/science at all levels from primary, through secondary and into FE and HE (K12, community college, universities), and to students and…well, anyone who wants to write a post on the topic and share the link!

In common with most courses in learning and teaching, let’s start with a nice easy topic:

The most memorable teaching session you have participated in.

It might be one you delivered, one you were a student in, or one you were an observer of. It might have been for any group of students or even part of a conference.

There is no obligation to write a new post – if you’ve got something in your archive that fits the bill, that’s absolutely fine.

Don’t have a blog and still want to play? A blog carnival is a collection of links to things written by interesting people. You could link to a series of tweets threaded together, a static page on a personal website, or any other web-based means that can be viewed without requiring a login for the reader.

Submission of posts: please leave a comment on this post with a link to your post by midnight 26th February (your local time). I will compile the carnival post on the 27th so it can appear on the 28th.

We’re going to need a hashtag! Please share your posts on twitter or other social media using #chemedcarnival

A few rules which may be summarized as ‘be nice’. 

I will not link to posts that are offensive or derogatory to students or teaching staff, or that could be construed as derogatory to prior or parallel levels of study.

I will not link to posts by anyone who can’t behave themselves reasonably in comments or social media related to this carnival. I will also report people for behaving inappropriately, delete comments and use my considerable powers of persuasion to convince the universe to unleash mayhem on them.






ViCEPHEC Storify Backups

The Storify backups I made of the Variety in Chemistry Education/Physics Higher Education conference were causing a few issues with the blog loading. I have converted them to pages to eliminate this problem, and each has its own URL. They will take time to load as each tweet is fetched individually, but each page loads faster than all of them did collectively.

If anyone associated with ViCEPHeC or the sponsors would like to host these on another site, please get in touch and I can send the html files.










Keele Seminar

I gave a seminar at Keele University (my own uni) yesterday. I decided to present two works in progress (WIPs) rather than any complete thing. I read somewhere that presenting work before it’s completely polished is a useful way to get some feedback and drive your thinking about a project, so I thought I’d give it a go.

From Tesla to TESTA: Meanderings in Chemistry Education Research

This seminar comes in two parts. The first looks at the use of diagnostic tests to evaluate the knowledge of students at the start of a block of teaching. Over the past 3 years, two diagnostic tests have been developed: one evaluating 1st years’ knowledge of topics in spectroscopy on entry to university, and the second evaluating 2nd years’ knowledge of NMR and related topics in preparation for a multinuclear NMR course. The prevalence of key misconceptions is determined and areas to be addressed in subsequent teaching identified. The second part looks at the use of the TESTA (Transforming the Experience of Students Through Assessment) process to catalogue changes in the Chemistry course from 2010 to 2017. The TESTA process provides metrics to evaluate the nature of assessment and feedback processes; however, it is deficient in one key regard: the impact of assessment deadlines on student workload. Assuming an ‘ideal spherical student’, a student workload model is proposed and considered in the context of having sufficient time to participate in assessment-for-learning activities.


If you’ve seen any of my poster presentations in the past year, you’ll be familiar with some of the diagnostic test stuff. I’ve been plugging away developing diagnostic tests for about three years now and I’m getting fairly close to happy with the question sets. I find they are very useful in informing my teaching, and a couple of changes I made this year seem to have added clarity to the responses received. Firstly, I moved the tests from paper-based (MCQ, confidence scale, free-text response for explanation) to online (MCQ, confidence scale, studied before yes/no/maybe). This allowed for automatic marking and meant I didn’t need to spend hours typing in the answers. I also changed the confidence scale from a 1 – 7 scale, with 1 being ‘not at all confident’ and 7 being ‘highly confident’, to a categorical scale. I intended this to make it easier for students to select an answer (I found myself becoming tied up in the difference between a 5 and a 6, whereas the difference between ‘neither confident nor unconfident’ and ‘a little bit confident’ seemed easier to grapple with). Nevertheless, I do employ NSS-style groupings to split the responses into low, medium and high confidence.
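For the curious, that banding is just a lookup from category to band. A minimal Python sketch – the category labels here are my assumption based on the scale described above, not the exact wording used on the test:

```python
# NSS-style banding of categorical confidence responses into low/medium/high.
# The category labels are assumed examples, not the test's exact wording.
CONFIDENCE_BANDS = {
    "not at all confident": "low",
    "not very confident": "low",
    "neither confident nor unconfident": "medium",
    "a little bit confident": "medium",
    "fairly confident": "high",
    "highly confident": "high",
}

def band(response: str) -> str:
    """Map one categorical response onto a low/medium/high band."""
    return CONFIDENCE_BANDS[response.strip().lower()]

print([band(r) for r in ["Highly confident", "A little bit confident"]])
# ['high', 'medium']
```

The same dictionary works whether the responses come off a Google Form export or a typed-in paper response.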


The second part is based in part on the ‘Spherical Students’ post and is a consideration of a student workload model, as well as the use of the TESTA (Transforming the Experience of Students Through Assessment) process to evaluate a curriculum review. When I looked at our TESTA data from several years ago with a view to seeing how things had evolved since the modules were shiny and new, I realised that while it accounts nicely for the type and number of assessments and goes into great detail on feedback, there’s little about the timing of assignments. This, combined with my ongoing bewilderment about how more ‘active’ forms of learning requiring not insubstantial amounts of pre-sessional activity should be accounted for in module proposals (where hours are broken down) and in timetabling (if you’ve got 8 lectures a week, can every lecture come with an hour of prep?), has all led me to start reading about student workload models.

It’s fascinating stuff and seems fairly under-researched in HE. One thing is obvious: it’s really difficult to pin a meaningful metric on the task of evaluating student workload, and methods vary. They include word counts and proxy word counts (e.g. a 15 credit module is 4000 – 5000 words), which obviously don’t work that well for calculations and the like; and time on task, which can be objective (how much time should be spent) or subjective (how much time is spent), and is further complicated by the idea that researching and writing 1000 words at FHEQ level 4 (1st year UK uni) is very different to researching and writing 1000 words at FHEQ level 6 (3rd year UK uni). An effective student workload model must make some allowance for the level of difficulty of material, but really should be so much richer than that. So far I’ve only got as far as figuring out that where we put our deadlines is really important in whether students appear overloaded in any given week, but I’ve got more reading and thinking and modelling to do on this one.
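To make the deadline point concrete, here’s a toy version of the ‘spherical student’ bookkeeping. Everything in it – assignments, due weeks, hour estimates, and the weekly capacity – is invented for illustration:

```python
# Toy 'ideal spherical student' workload model: sum the estimated hours of
# assessment work due in each teaching week and flag overloaded weeks.
# All assignments, weeks, hours, and the capacity are invented examples.
from collections import defaultdict

assignments = [
    # (assignment, due_week, estimated_hours)
    ("lab report 1", 4, 6),
    ("problem set 1", 4, 4),
    ("pre-lab quiz", 4, 4),
    ("essay", 5, 10),
    ("lab report 2", 8, 6),
]
WEEKLY_CAPACITY = 12  # assumed hours/week available for assessment work

load = defaultdict(int)
for _, week, hours in assignments:
    load[week] += hours

for week in sorted(load):
    flag = "OVERLOADED" if load[week] > WEEKLY_CAPACITY else "ok"
    print(f"week {week}: {load[week]} h ({flag})")
```

Shift one of the week 4 deadlines a week either side and the overload flag moves with it, which is exactly the deadline-placement effect described above.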

As it was about work in progress, I’m not uploading the slides. I got some really good questions at the end and need to think some more about some of the issues.

Spooky Slimey Vampire Science

Last Saturday we did the ‘Spooktacular’ science outreach event at Keele University Hub. If my recollections are correct, it’s the 6th Spooktacular I’ve taken part in and the 5th where I’ve designed and run activities with the help of student volunteers. The audience is families with wee kids, typically primary aged so there has to be lots of seasonal fun and wee bits of science. These events are extremely worthwhile because:

(a) the whole family can attend and so parents and other adults can get involved with the science bit too, helping with the tasks and hearing the explanations.

(b) it allows undergraduates, postgraduates and academics to run activities in a marketplace format and hence test out their science communication skills in a dynamic and changing environment before (for example) taking on a larger project such as going into a school.

(c) it’s fantastic publicity for the institution and cements the role of the university as part of the community. It is a generally positive experience for attendees and breaks down barriers about what a university is.

I started out with one planned activity but then got rather a lot of student volunteers so came up with a second. Here’s what we did and, importantly, why.

  1. Slime (well it’s Hallowe’en, you can’t not!).

Since the whole PVA–borax fiasco that resulted in a lot of recipes for PVA slime using contact lens solution with boron-based buffers being bandied about, we changed our normal approach to slime to alginates. In the past we’ve had mixed success with them, primarily because the interaction was a little on the short side. This time:

  • measure 20 mL of distilled water into a plastic cup
  • add one spatula of alginate powder
  • stir with lolly stick (highly sophisticated scientific labware)

Now, it takes a fair bit of time to dissolve – a couple of minutes – and while this can be challenging for the youngest participants, it provides a good opportunity to talk about the science of what’s going on.

  • add food dye and glitter to the now thick solution.
  • Using a large small pipette (comedy description, but this year I bought kids’ science pipettes from Amazon that are around 5 mL and have a larger bulb on them than the usual disposable 3 mL plastic pipettes), add the alginate solution to calcium chloride solution.
  • Make worms/beads/lumpy aggregates and fish from solution using large forceps (I bought kids forceps as metal lab ones are often a little difficult for wee hands). Prod, poke and generally play.


  2. Mission Starlight…Vampire Edition

I have a backpack in my office from summer 2016 when we took the RSC’s global experiment ‘Mission Starlight’ to a summer science event. That was when we had the UV-active beads on teddies and we were protecting teddy from the sun and talking sun safety with visitors. The UV beads change colour in sunlight and we provide various cloths, films and sunscreens to build the perfect UV protection suit. For Hallowe’en we adapted it to protect vampires from sunlight – what would a vampire wear to avoid the sun? I bought some rubber bats and put the UV beads on them. It worked pretty well, but one drawback is that it involves the participants going in and out of the room, which can be alarming for parents and guardians, particularly if the room is busy. We also had some UV-emitting LEDs which came attached to pens as part of an ‘invisible ink’ set. UV-active dye in the pen, no obvious marks, shine the LED at it…you get the idea. In any case, those worked well when the sun went in a wee bit.

Mission Starlight was one of the best RSC Global Experiments in my opinion because there is no clean-up, few consumables (the fabrics can be reused), and it’s dry to store. I’m in favour of outreach being a bit more sustainable in the generating limited waste department. Other experiments such as the polymer hydrogels and the vitamin C in foodstuffs required quite a bit of wet prep and disposal. I need to find more activities like this, although I’ll note that not pre-dissolving the alginates did cut down on prep time significantly and allowed the participants the chance to dye their own alginate.

Next on the outreach calendar is our regional Top of the Bench Heat. I’ve got the practical challenge sorted (shhhh! Top secret until after!) but need to figure out the written challenge.

Automated Marking of Tests/Quizzes

I’ve been dabbling in the automatic marking of tests and quizzes for several years now. By this, I mean a web-based set of questions on a specific topic that a student completes, that automatically grades the answers as correct/incorrect (and sometimes gives partial credit), and returns the mark (sometimes with feedback) to the student at a specific time. Before you think ‘oh, this is good, no marking!’, I’d warn you about the setup burden.

What kinds of assignments make really good auto-tests? I have used them for the following:

  • pre-laboratory exercises that give practice at calculations, some safety aspects, and identifying products/balancing equations (Blackboard Test/Pool)

  • online safety quiz  (Blackboard Test/Pool)

  • assessed tutorial with a very tight deadline before the exam  (Blackboard Test/Pool)

  • referencing and academic conduct test  (Blackboard Test/Pool)

  • diagnostic test (new for 2017! Google Form Test)

The technology has limitations, particularly related to the type of questions you can ask. I find the following types useful:

  • multiple choice questions

  • calculated numeric [with the caveat that Blackboard can’t deal with 5.5 and 5,5, units, or number of decimal places]

  • fill in the blank or short answer [with the caveat that students often can’t spell (even when given a list of words to select their answer from), and sometimes deducing the syntax of the required answer is tricky]

  • matching pairs [really good for reactant/product equation matching in transition metal redox chemistry]

I also like the ability to write a pool of questions and a system that allows each student to be asked a number of questions from the pool. If every question is from a different pool, this reduces the scope for collusion. An example of a good pool question stem for a calculated numeric question:

Calculate the mass of copper sulfate required to prepare a [Y] molar aqueous solution in a [X] mL volumetric flask. 

You can see how simple it is to vary X and Y within the realms of possibility, generate all the correct answers in Excel, and make a pool of questions.
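As an illustration (Excel does this job in practice), generating the answer set for that stem is only a few lines. The value ranges below are arbitrary examples, and the molar mass is for anhydrous copper sulfate:

```python
# Generate (Y, X, answer) triples for the pool question stem above.
# Value ranges are arbitrary examples; 159.61 g/mol is anhydrous CuSO4.
MOLAR_MASS_CUSO4 = 159.61  # g/mol

def required_mass(conc_mol_per_l: float, volume_ml: float) -> float:
    """Mass (g) of CuSO4 for a [Y] molar solution in a [X] mL flask."""
    return conc_mol_per_l * (volume_ml / 1000) * MOLAR_MASS_CUSO4

pool = [
    (y, x, round(required_mass(y, x), 2))
    for y in (0.05, 0.1, 0.2)   # Y: concentration / mol per L
    for x in (100, 250, 500)    # X: flask volume / mL
]
for y, x, mass in pool:
    print(f"{y} M in {x} mL -> {mass} g")
```

Nine variants from three values of each variable, each with its correct answer ready to paste into a question pool.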

The setup burden is how long it takes to create the initial test. As a rough guide, I’d estimate it to be at least twice as long as it would take to mark manually! So for a pre-lab done by 50 students, taking me 10 hours to mark, I’d expect to spend about 20 hours developing the online version. I do not recommend doing the online test thing unless you know you can use it for at least 2 years – one reason for doing it is to reduce the marking load and you don’t really start to make gains until the 3rd year of running. On the other hand, it’s a convenient way to shift time from semester (marking time) into quieter times of the year (prep time). I estimate that each test requires 1 – 2 hours of tweaking and set-up each year, usually after reviewing the analytics from Blackboard, weeding out poorer questions, adding a couple of new ones…that sort of thing.

Why do I do this? Well each of the assignments I’ve outlined are reason enough in themselves, but some have transitioned from paper-based to online (pre-labs, diagnostic test) and some would not exist if they could not be online (safety, referencing, academic conduct, assessed tutorial). So sometimes there is no reduction in marking time for me because I wouldn’t offer the assignment in an alternative manner. Technology facilitates the use of formative tests to aid learning, so I use it.

This year I’m expanding my range of formative tests to transfer my 1st year spectroscopy ‘drill’ questions into an online format. When teaching things like the equations of light, basic NMR, IR etc., I recognize the value in doing lots of examples. I also recognize the value in those examples stepping up in difficulty every so often – I’ve been calling these steps levels.

For example, using the equation E = hν

Level 1 – calculation of E in J, with ν in Hz

Level 2 – calculation of E in kJ/mol

Level 3 – calculation of E in kJ/mol with ν in ‘insert standard prefix here’ Hz

Level 4 – calculation of ν with energy in J

Level 5 – …

You get the idea anyway.  I read a paper on this a few years back, about stepping up calculations in small steps.  So I’m making question pools for each level, bundling a few levels together into a quiz then setting a minimum score requirement to gain access to the next levels. Students will do quiz 1 and if their mark is high enough (80%+) they get access to quiz 2. If it isn’t, they’ll get access to a couple of worked examples and the chance to re-do quiz 1 to get the mark.
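The calculations behind the levels are small variations on the same few lines. A sketch, using the standard CODATA constants, with the level numbering following the list above:

```python
# The levelled E = h*nu drill calculations sketched in Python.
H = 6.62607015e-34    # Planck constant / J s
N_A = 6.02214076e23   # Avogadro constant / per mol
PREFIX = {"k": 1e3, "M": 1e6, "G": 1e9, "T": 1e12}  # standard SI prefixes

def energy_J(nu_hz: float) -> float:           # Level 1: E in J, nu in Hz
    return H * nu_hz

def energy_kJ_per_mol(nu_hz: float) -> float:  # Level 2: E in kJ/mol
    return energy_J(nu_hz) * N_A / 1000

def energy_kJ_per_mol_prefixed(nu: float, prefix: str) -> float:  # Level 3
    return energy_kJ_per_mol(nu * PREFIX[prefix])

def frequency_Hz(energy_j: float) -> float:    # Level 4: nu in Hz, E in J
    return energy_j / H

# e.g. the molar energy of a 500 THz photon:
print(f"{energy_kJ_per_mol_prefixed(500, 'T'):.1f} kJ/mol")
```

Each function is the seed of one question pool; vary the frequency (and prefix) to populate the pool, as with the copper sulfate stem.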

I’m aware that this type of drill enforces a purely algorithmic approach, but if my students can’t do these bits, they are going to run into a whole lot of problems at higher levels. When setting exam questions, I balance the question between the algorithmic problem solving stuff like performing a calculation, and the ‘explain your answer’ part where they need to demonstrate a degree of understanding. We can argue over the appropriate balance between those sections but I think the algorithmic stuff should be 40 – 60% of the marks available (depending on the level of the paper) and the balance should be the explanation stuff, or higher level problem solving such as unseen, unfamiliar problem types. With this balance I’m saying ‘well you can probably pass the exam if you can do the basics but you need to show more to get a great mark’.  I also assume that intended learning outcomes define a passing mark (40%) or a low 2:2 mark (50%), rather than a 100% mark.

The experience of setting up and running the diagnostic test through Google Form Tests requires a post of its own so I’ll come back to that.


Reflections on Variety in Chemistry Education/Physics Higher Education Conference 2017

This year’s conference started on Thursday 24th August 2017. I did not attend the lab events on Wednesday because I think a 2-day conference is sufficient – I get seriously ‘conferenced out’ after a while and I’d rather focus on the talks. I did arrange to arrive in time for Wednesday dinner, avoiding the early trains and very long day that arriving on Thursday would have meant.

This was also the first time I attended a conference seriously aware that I had limited energy and that I needed to take care not to overdo things. There were a few challenges in that regard: simple things like getting between venues, accommodation, dining rooms and bus stops were harder, as was standing for several hours in a poster session. It’s quite eye-opening when you notice that you struggle to do things you previously took for granted. The only advice I have for people organising conferences is to be really explicit in delegate information before arrival about distances between things and requirements to take buses, and to give serious thought to how much seating is available during sessions such as posters and lunches.

I enjoyed many of the sessions, and a lot of the talks. I found several talks highly frustrating, particularly those that could easily play into the unconscious bias held by many. These included the keynote on gender differences in the physics force concept inventory, and the closing keynote on perspectives either side of A-level/university chemistry teaching. Here’s the thing: unconscious bias afflicts us all, both as recipients and as holders. One particularly insidious bias is the one that those who teach in HE often hold towards those who teach in secondary. I’ve heard a lot of academics disparage teachers of chemistry at A-level, particularly around ‘teaching to the test’. I do not doubt that many HE teachers view incoming students through a deficit lens – they focus on what the students can’t do rather than respecting the enormous hard work, effort and learning that has taken place in secondary (and further) education. And let’s be clear, I don’t mean all HE teachers. I’d rather focus on what our incoming students can do, and acknowledge the diversity inherent in the secondary and further education sectors and the enormous pressure on both teachers and students. I disliked that the keynote could feed those attitudes by drawing greater attention to the issues.

My (annual?) issue with the gender thing. Males and females performed differently on a test that was developed to see how much students understand about some physics stuff. It wasn’t designed to investigate gender, and if it shows a difference in male/female performance, we should absolutely investigate and be concerned. But we have to be really careful how we do this and make sure that we’re using the right research tools and asking the right research questions. We also have to be really careful to explore gender sensitively which means considering all aspects of gender, rather than treating it as a male/female binary.


Moving on from that, there was an interesting talk on teaching tips for making a more inclusive teaching environment. I think that’s something we all need a reminder of from time to time but I would have liked a few more concrete suggestions on things that work. PDF format is variable and not all versions work for screenreaders…OK then, which way of generating a PDF is best? That sort of thing. And it would be really good to share ideas about how to convince all those who teach in HE to adopt practices that make teaching as inclusive as possible.

The poster session didn’t really work for me – there were two rooms, with many of those in the first room unaware of the second room. There were also very few seats and that made it very very tiring. I had several very good discussions with people over my posters but never really made it round them all. Fortunately the majority of the posters were shared electronically beforehand and this allowed me to review them at my own pace. There were still lots of people I wanted to talk to in the poster session but I never found them – and by the time lunch was served, I was out of energy.

There is also some question over what the Variety in Chemistry Education conference is for – what type of chemistry education thing it should be. To me, the strength of Variety is in the variety: I go to see great chem ed ideas presented, but I also enjoy presentations that steer more strongly towards chemistry education research. I like the mix and I wouldn’t want to see that change. I do, however, have one caveat: I dislike intensely anything presented that has not been sensibly evaluated. And to evaluate a teaching innovation, one must carry out the activity with students. I’m OK with the evaluation being done purely from the perspective of the teacher; I’m OK with the evaluation being done from all perspectives. I’m not OK with unevaluated activities being presented. I suppose I should comment on the role of Variety in seeking collaborations, or putting a flag up to say you’re doing a certain thing and would anyone like to help. That’s what oral bytes are for, nothing longer. I would call on future organisers of this conference, and those who scrutinise the abstracts, to ensure that evidence of evaluation is clear.

I’ll also note that I have no intention of attending the pre-conference activities. 2 days of conference and dinner the night before is sufficient for me, and I don’t need to attend more.

We also had a chat on Twitter the other night about whether we should facilitate the attendance of undergraduates who’ve done chemistry education final year projects. I think it would be good to see undergrads attending Variety and if their work is of sufficient standard to justify submitting an abstract for a presentation, then great stuff, what an opportunity! I would not, however, reduce the ‘standard’ expected of that abstract or subsequent presentation. So I’m not in favour of giving undergraduates (or any other group) more chance of getting a presentation slot just because of what they are.


This post has been several drafts in the making, I’m still not happy that I’m articulating what I want to say very clearly, but right now I’m done with it and am hitting publish! Time to move on!


Poster: Alternative Assessments #ViCEPHEC17

This is my poster for #ViCEPHEC17 on a range of assessments through 3 years of the chemistry curriculum. It was a surprisingly difficult poster to put together as there was so much I wanted to include. In the end I focussed on the general skills developed rather than the specifics of each assignment.

One of the highlights of these assessments is getting students to create ‘What Am I?’ puzzles. This started as a part of this blog where I’d post the chemical components of everyday items and readers had to guess the item. Since running this as an assessment, I’ve hidden those blog posts. The students have to decide on an everyday item, investigate the chemical composition of it, draw the compounds using ChemDraw and make it an appealing single page graphic. The purpose of the first page is to allow the reader (that’d be me) to guess the item. I’m getting very good at identifying various body sprays by chemical structure. On the 2nd page they should indicate what the item is, and briefly outline the purpose of each chemical in that item, finishing with a reference list.  Should you wish to play, here is the What Am I? Variety in Chemistry Education Edition:

Below I’ve listed the key chemicals found in a common thing, which may be slightly UK-centric.  I’ve drawn the chemical structures of principal components where simple and appropriate; given the E number or CAS number (however tempting Sigma-Aldrich catalogue numbers would be) if no simple chemical structure exists for an additive; and given the chemical formulae or name if neither of the above make sense.  See if you can guess what this is!  If you guess on Twitter (@kjhaxton), please DM your guess so others can play.


structures of chemicals contained in common item


And should you wish to find more of these puzzles, try:

With respect to the other assessments I mention, I have presented on some of the elements before so there are a couple of slide decks for further info.

For more information on the 3rd year infographics:

For more information on some of the other assessments

And on the 1st year screencast presentations


Poster: NMR Diagnostic Test #ViCEPHEC17

One purpose of this blog post is to provide additional materials for my poster at Variety in Chemistry Education 2017.

This is the third of three posters presented this year on our NMR diagnostic test project. It’s a subset of a larger project but the NMR results have been quite interesting and I’ve also been fortunate enough to have a couple of students work on the project.

For those attending ViCEPHEC17 who fancy having a go at some spectroscopy diagnostic test questions, here’s a link to a shortened online version. All submissions are anonymous and there are only 5 questions!

If you have any feedback on the test, you can comment on this post, or get in touch at Variety or by email. Using GoogleForms is new this year for this type of test.

The ViCEPHEC17 Poster is here:


For the RSC Twitter Poster competition, this poster was submitted which outlines some of the key initial findings:

For Methods in Chemistry Education Research, the poster focussed on the research methods we were using to investigate this project:

Poster: Methods of investigating alternative conceptions in NMR

Better Ed Tech Solutions

It’s course prep season here. It’s fun. Sort of. I’m looking forward to term starting. Kind of. I’m doing everything the same as last year. Not really. I’m frustrated that there’s no ed tech tools that do what I need them to. Totally.

Screencast Presentations: After last academic year’s wee ‘oh shit I broke the VLE and this time it can’t be fixed’ moment, I’m looking for a legitimate alternative for the 1st year screencast presentation assignment.

Here’s the workflow:

Student: – create and submit screencast; complete electronic self-assessment form

Me: – allocate or initiate peer assessment, each student views and marks 4 screencasts and completes electronic peer-assessment form on each.

Student: – watches 4 screencasts, completes electronic peer-assessment form after each, completes 2nd self-assessment form.

Me: – screams in dismay as I have to compile (for a class of 100): 2 x self-assessment (200 items), 4 x peer-assessment (400 items), moderate the grades, arrive at the final grade and release marks and peer-assessment feedback to submitting students.

Now, to be fair to me, I’ve got the grade compilation bit down to around 2 hours’ work for a class of 100. I’m fairly whizzy with Excel (mmmmmm….data). But I think it should be possible to do this automatically, which is what the thing that I broke on the VLE did (well, not exactly, but the shortcomings were far outweighed by the benefits). The hard labour version of this is Google Forms for the self- and peer-assessment bits, which generate multiple spreadsheets, then the Excel-whizzing to bang it all together. [If you’re ever doing this, the “IF” function is your friend to ensure that you’re putting the right grades together.]
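For anyone tempted away from the Excel route entirely, the same join-by-student-ID logic (including the ‘right grades together’ check the IF formula performs) is straightforward in plain Python. The IDs, marks and 50/50 weighting here are invented for illustration:

```python
# Compile self- and peer-assessment marks by student ID.
# IDs, marks, and the 50/50 weighting are invented examples.
from collections import defaultdict

self_marks = {"s001": 62, "s002": 70}
peer_rows = [  # (student_id, peer_mark): one row per peer-assessment form
    ("s001", 60), ("s001", 64), ("s002", 68), ("s002", 74),
]

peer_marks = defaultdict(list)
for sid, mark in peer_rows:
    peer_marks[sid].append(mark)

final = {}
for sid, self_mark in self_marks.items():
    if sid not in peer_marks:  # the check the IF formula performs
        raise KeyError(f"no peer marks for {sid}")
    avg_peer = sum(peer_marks[sid]) / len(peer_marks[sid])
    final[sid] = 0.5 * self_mark + 0.5 * avg_peer

print(final)  # {'s001': 62.0, 's002': 70.5}
```

With the Google Forms export saved as CSV, the two dictionaries can be populated straight from `csv.DictReader` rows.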

Voting for Peer Instruction: Another problem this year is that I’m not hiring a supply chain of lorries just to get me and my teaching stuff to class.  That means no carting of personal response devices around. I have peer instruction questions, I’d like to do PI with students. I tried Socrative last year but the BYOD element of this was…frustrating.

Firstly, I couldn’t work out if Socrative could do those fancy graphs that the Turning Technologies stuff does – the ones that you can display in-presentation to the class to show the range of marks for an MCQ without the correct answer being shown. I couldn’t find it.

Secondly (and this is a common complaint from me), most of my pre-existing MCQs weren’t in a format that could be easily imported or exported, and indeed multiplatform compatibility was pretty much non-existent. Now, if you think I object to 2 hours bashing grades in Excel, it’s nothing compared to the 2 hours copy-pasting questions between different programmes that might do what I need.

Thirdly, there aren’t enough power points in lecture theatres for students BYODing laptops in a 2-hour class. Wifi held up well, batteries did not.

[In the interests of fairness, the personal response devices can be a pain when no one’s checked and replaced the batteries for a while…but at least I know which door to knock on about that. And there’s the weight.]


Reflective Diaries: last year I ran a reflective diary exercise for the first time. It worked fairly well and I used VLE blogs for the purpose. The interface for grading is basic, and only allows grading the overall blog rather than each post (unless I missed something). There’s also something quite irritating about the month-by-month view and its impact on the total number of posts submitted. This year I’m setting up a Google Form (set as a test) for each reflection topic and am wondering whether to run this through the VLE with links to each form (generating one spreadsheet of outcomes per reflection) or to brave Google Classroom, which I believe will dump all the grades into one spreadsheet and save me some Excel-whizzing. The plus side of Google Classroom definitely fits the ‘mmm, shiny new play time’ category. I can see some features that really appeal to me for the type of module I’m running. The negative side is that I will be making it all up as I go along and really have little idea beyond what I can figure out or google. Actually, I can see some really nice features in Google Classroom that would work well with the module generally, so it might be worth the investment of time.


So, how about you lot? Trying anything new this coming academic year?