That was the academic year that was…

…mostly characterised by marking over 1800 discrete items of assessment (I’d forgotten about a few when I tweeted this earlier). That came from around 90 hours of lecture/teaching/workshops, and about 20 hours of lab supervision.

The marking felt never-ending and I can see why. I’m not offering a great deal of insight into the breakdown of tasks there, but suffice to say, the following assessment types are well represented:

  • reflective diary (no, not a lab diary, which is rarely reflective in the true sense)
  • academic essay
  • magazine style article
  • magazines (group)
  • business report (group)
  • business pitch presentation (group)
  • business presentation (group)
  • individual presentation
  • dissertation
  • screencast presentations
  • project report
  • exams
  • class tests
  • infographics
  • application form
  • posters
  • poster interviews
  • project interviews
  • presentation slides
  • watching presentations
  • short written pieces with a specific remit
  • annotated bibliographies
  • dissertation plans

For the forthcoming academic year, there will be a decrease in exams, the academic essay is gone (thank goodness), and some of the short pieces/infographics will be gone. It should be closer to 900 in the next academic year.

As I imply above, I dislike the academic essay so I’m glad to see it gone. I enjoy watching in-person presentations but can only do so many at a time. I’m not hugely fond of oral assessment such as interviews or oral exams; again, I can only do so many at a time. For the majority of these assessments, every submission was different, or there were several variations. I know that I can’t sit and mark 100 lab reports on the same experiment very easily, but 100 magazine style articles on a wide range of student-selected topics is fine. I particularly enjoy marking infographics because part of the whole idea is that they should convey their meaning directly and graphically. I also find annotated bibliographies far superior to ‘just’ reference lists because it is far harder to pad out a bibliography when you have to state what information comes from each source.

Many of these assessments are small, intended to allow for feedback that can improve later and larger submissions. I find it very frustrating to mark work where previous feedback has not been acted on (or even viewed in many cases). If feedback on a small piece of work is to include figures/images/tables to convey additional content, I do expect to see figures/images/tables in the longer piece of work. If feedback is to review the reference style guidelines, I expect to see greater adherence to them the next time around. It does not motivate me to write feedback when I see the degree to which it is acted on in some cases. I do have, however, a project in progress at the moment looking at more efficient ways to give feedback. Part of this is allowing students to request what feedback they want and I’ve got to analyse the results of that from this year to plan phase 2 of the project.

Well, bar August reassessment, the marking for this academic year is over. I’m going to devote some time this summer to streamlining assessments further, working out how to provide feedback in a more accessible format (the tools we’re using don’t feel like they are working but that’s nothing new), and also to improving assessment guidelines. I want to make it clearer why we are doing these assessment tasks, and also why they do or do not ‘help’ with exams in modules where exams exist. I want to highlight the additional skills embedded in many of these tasks; somewhere along the way I got it into my head that this would be useful.

#ChemEdCarnival 2 What education research has most influenced your practice?

As a postdoc I had relatively little exposure to teaching (probably a standard quantity for a postdoc). Sure, there were a couple of project students to supervise, the odd grad student floating around, a couple of lectures and some pizza-fuelled marking, but there wasn’t much teaching (or outreach, which is another post entirely). I did, however, attend a seminar by Carl Wieman on teaching physics (related to the establishment of the CWSEI). By the end of that postdoc, I’d had the good fortune to hear Prof Wieman’s seminar twice – things take a while to sink into my head – and I’d landed an academic job back in the UK. I hadn’t realised just how influential those seminars were going to be, but looking back on ten years of being a lecturer/senior lecturer, it’s clear now that they were.

So what was the seminar on?

Cut content. Teach concepts and ‘think like a physicist’.

It seems silly to write a long content dense blogpost to elaborate on this but I’ll have a go.

How we convey information (= content) is generally a limiting step in learning. Too little information and learning is curtailed; historically this is the origin of the lecture: one copy of a textbook and a sage on a stage to read it out. As technology advanced from printing presses to PowerPoint, conveying information became easier and easier, and our expectations grew accordingly. PowerPoint is singled out as an evil of lecturing, but that’s principally because it facilitates information delivery – far more information delivery than chalk and talk permits. Chalk and talk is a self-limiting means of conveying information in a teaching scenario. Too much information also curtails learning: it becomes overwhelming and leads to strategic practices as a means of survival. Prof Wieman’s basic idea was that to build up conceptual understanding of physics – that is, to really be able to understand and apply the essential physics behind things – you had to cut out a lot of content. Spend more time on the key concepts, practising them through application, and tackling the more advanced stuff becomes easier. It can push learners through the transition between novice and expert without memorising a billion examples and exceptions.

An example: Cake 101

Learning Outcome: students who complete this course successfully will be able to bake a cake.

Content dense model of course:

  1. Victoria sponges
  2. Chocolate sponges
  3. drizzle cakes
  4. buns
  5. muffins
  6. fruit cakes
  7. fruit cakes with alcohol
  8. fruit cakes with alcohol and nuts
  9. royal icing
  10. water icing
  11. butter cream frosting
  12. decorations

Now imagine each topic has a 2 hour lecture. In the lecture, lots of variations on the theme are discussed.

Lecture 7: fruit cake with alcohol  – recipes will be considered involving fruitcakes made with sherry, whisky, and rum. Soaking fruit for minutes, hours, days and weeks will be considered. ‘Feeding the cake’ with alcohol after baking will be discussed along with appropriate timescales.

The learners in lecture 7 end up with 3 different recipes to learn, along with 5 protocols for soaking fruit and 3 methods for feeding the cake, as well as the theoretical stuff about what happens to alcohol in the oven. If all of the classes are roughly like this, there will be 36 recipes, 60 protocols and 36 methods for various things to learn alongside the theoretical stuff.

Content lite model of course:

  1. sponges
  2. muffins
  3. fruitcakes
  4. icing
  5. decoration
  6. – 12. practical sessions

Still with a 2 hour lecture, but this time we recognise that sessions 1 – 4 in the content dense model can be reduced to one concept – they are all essentially Victoria sponges with different additives or cooked in a different tin. You cream the fat and sugar, add eggs, add flour and flavour, and bake. Muffins require a theoretically different approach, adding wet to dry ingredients, and typically use a liquid fat. Fruit cakes are in between, and involve a greater number of variables. Icing is essentially the same – some kind of sugary stuff with some kind of fat or fluid that you splodge on top. Decoration stays largely as is – lots of pretty pictures giving ideas on how to decorate cake.

The learners end up with 4 recipes to learn, a couple of protocols and a couple of methods.  They also spend 14 hours practicing their craft.

The content lite model curates the information for better presentation to the student and explains why each cake is as it should be. It’s possible to extrapolate from these basic ideas into more advanced cake making methods as learners should better grasp the concepts of the cake rather than feeling pressured to memorise 36 recipes etc.

Feeling hungry yet?

So the most influential bit of literature for me was digesting the idea that concepts should be key in courses, not content. And that’s even more important in the ‘current age’ when anyone can google a cake recipe. The trick is understanding why you can tweak some bits of the recipe but not others. Cake is a flippant example for what is an incredibly challenging thing to do, particularly as it involves shaking off the ‘must cover content’ mindset. But done right…it’s really good.





Reminder: #ChemEdCarnival #2

Chem Ed Blog Carnival Number 2, 12th April 2018, CERG blog

The next Chem Ed Carnival is being hosted over on the CERG blog (Chemistry Education Research Group). The theme is: What education research has most influenced your practice?

You can find the details:

1st Chemistry Education Blog Carnival #chemedcarnival

Hello and welcome to the 1st Chemistry Education Blog Carnival. We’ve had a range of submissions including blogs, webpages and some very creative ways to share content! I’ve expanded the definition of ‘blog’ to include anything I can link to.

The theme was ‘most memorable teaching session’ and that’s about as wide as it comes. It’s also typically the first activity in many teaching courses.

One of the early submissions comes from Dr S, who’s in the middle of teacher training. Last year he had the interesting task of lecturing and reflects on that experience. It’s a heartfelt post with some very valuable advice for anyone stepping into the lecture theatre for the first time.

Dr Patrick Thomson, a teaching fellow, discusses his first Tea-ching session and the advantages of extending a warm welcome to students.

Dr Kristy Turner shares a recent experience using Knowledge Organisers with year 8, an interesting means of helping students learn in a way that could be widely applicable in high schools and beyond.

I couldn’t make up my mind about my single most memorable teaching session – so many sprung to mind. Some for teaching related reasons, some because of the characters of the students involved!

The team at Education in Chemistry have solicited contributions from many colleagues. There are lots of heart-warming tales of making a connection with students, or of students making connections with difficult concepts – in one case through the power of song!

Dr Michael Seery takes a different approach, thinking back to an experience as a learner not a teacher. Inspiration for good teaching sessions comes from all sources, chemistry related or not.

Prof Simon Lancaster continues the theme of lessons learned when recalling a particularly memorable chemistry laboratory experiment. It’s lovely to read how Simon’s returned to this particular incident on several occasions across a few years with different perspectives.

Dr Michael O’Neill joins us from twitter, also reflecting on an experience as a learner. I’m a big fan of props in teaching so balloon VSEPR is really appealing!

Dr Clarissa Sorensen-Unruh considers a particularly memorable conference presentation where she identified a way she’d love her students to feel in one of her teaching sessions. It’s obvious from the post that she remembers the enthusiasm very clearly!

I think I found all the submissions but please drop a comment if I’ve missed anyone. I’ve really enjoyed reading the submissions and would like to do this again in the future. If anyone would like to host and come up with a topic, jump in!

And finally: homework! The RSC Twitter Poster competition is back on Tuesday March 6th with a ChemEd thread. If you’ve got some research worth sharing, I highly recommend it. If you just want to lurk, details are available:




Memorable Teaching #Chemedcarnival

When I set the prompt for this carnival, I thought I was picking something fairly straightforward. Now I’m struggling to home in on one particular teaching session that was memorable. There are plenty that I remember, but few I remember for ‘good reasons’. Do I select the Maths for Chemistry lecture on ‘dress like a pirate day’ and talk about the student who leapt to his feet (dressed like a pirate) yelling ‘aharrrrr me hearties’ every time the number four came up? Do I talk about the reasons that I don’t give out Post-it notes in class (hint: they were kidnapped and later found on my door with various things written on them)? Do I talk about the class that for one reason or another became a discussion about the right of women to wear whatever the hell they want and not ‘deserve’ to be heckled in any way? Or perhaps the class that ended in a conversation about consent, drunkenness and unconsciousness? Perhaps, focussing on the actual content delivery, I should focus on a class where I watched the students get it, or a class where what I did seemed especially effective. Then maybe I should think more about my time as a student: the lecturers with an enormous capacity for chalk’n’talk (14 sides of derivations), the first lecture notes shared on the web (printed in 48 point font as they were the versions used to print the OHP slides), being made to research and present content each week, or the coming plague of powerpoints and the torture thereby.

It seems that teaching sessions are much like holidays – many memories for many different reasons. My personal conviction is that any teaching session you walk away from thinking ‘that wasn’t actually too bad’ is a good one.

I teach a fair range of courses and mostly the teaching materials evolve to some degree each year. My current thing is mini-PBL: we start with a lecturey lecture (traditional-style content delivery), then in the next session have a brief intro to a problem, a set of questions to structure solving the problem, and some resource slides/reading/bring-your-own-device research. A period of class time is allocated to working the problem through, then a whole-cohort discussion to get some answers recorded. Mini-PBL because I will not send the class away with homework at times of the semester when they are overloaded with coursework – my own research points out very strongly that piling on work at those times is not an effective or sustainable (or reasonable) approach. It is our job as educators to wring the most out of contact time. Some of those courses are memorable by virtue of being very dry and tedious, some are a lot of fun. Generally it is the subject matter that governs this rather than the teaching method. I’ve yet to find a teaching method that works for the driest content other than ‘suck it up folks’.

I do have a favourite course, and it’s one that I’ve taught the longest and I think it’s probably my best teaching. It’s memorable because I can predict where the students will go wrong and when, how they will go wrong and the fewest steps to getting things back on track. Sometimes I see a new error or misconception but I’ve got a list of the major ones these days. The course also overlaps with a research project into diagnostic tests so there’s a nice synergy there. Perhaps that’s why I picked that topic for the diagnostic test rather than the dry and tedious course.



Chemistry Education Blog Carnival #chemedcarnival

I miss blog carnivals. For those who weren’t lurking in the chemistry blogosphere back in the mid/late noughties, a blog carnival is where a bunch of bloggers write posts on a specific theme proposed by the host. On carnival day, the host publishes a blogpost with links to all the posts and encourages traffic to the participating blogs.

So I propose that here, on A Chemical Unconformity, there will be a chemistry education blog carnival on February 28th. The ‘chemistry’ in the title isn’t there to exclude people who don’t identify as chemists – it’s open to all who wish to respond to the topic. It’s also open to teachers of chemistry/science at all levels from primary, through secondary and into FE and HE (K12, community college, universities), and to students and…well, anyone who wants to write a post on the topic and share the link!

In common with most courses in learning and teaching, let’s start with a nice easy topic:

The most memorable teaching session you have participated in.

It might be one you delivered, one you were a student in, or one you were an observer of. It might have been for any group of students or even part of a conference.

There is no obligation to write a new post; if you’ve got something in your archive that fits the bill, that’s absolutely fine.

Don’t have a blog and still want to play? A blog carnival is a collection of links to things written by interesting people. You could link to a series of tweets threaded together, a static page on a personal website or on any other web based means that can be viewed without requiring a logon for the reader.

Submission of posts: please leave a comment on this post with a link to your post by midnight 26th February (your local time). I will compile the carnival post on the 27th so it can appear on the 28th.

We’re going to need a hashtag! Please share your posts on twitter or other social media using #chemedcarnival

A few rules which may be summarized as ‘be nice’. 

I will not link to posts that are offensive or derogatory to students or teaching staff, or that could be construed as derogatory to prior or parallel levels of study.

I will not link to posts by anyone who can’t behave themselves reasonably in comments or social media related to this carnival. I will also report people for behaving inappropriately, delete comments and use my considerable powers of persuasion to convince the universe to unleash mayhem on them.






VicePhec Storify Backups

The Storify backups I made of the Variety in Chemistry Education/Physics Higher Education conference were causing a few issues with the blog loading. I have converted them to pages to eliminate this problem, and each has its own URL. They will take time to load as each tweet is fetched individually, but that’s less than the time they took to load collectively.

If anyone associated with ViCEPHeC or the sponsors would like to host these on another site, please get in touch and I can send the html files.










Keele Seminar

I gave a seminar at Keele University (my own uni) yesterday. I decided to present two works in progress (WIPs) rather than any complete thing. I read somewhere that presenting before you have the completely polished final thing is a useful way to get some feedback and drive your thinking about a project, so I thought I’d give it a go.

From Tesla to TESTA: Meanderings in Chemistry Education Research

This seminar comes in two parts. The first looks at the use of diagnostic tests to evaluate the knowledge of students at the start of a block of teaching. Over the past 3 years, two diagnostic tests have been developed: one evaluating 1st years’ knowledge of topics in spectroscopy on entry to 1st year, and the second evaluating 2nd years’ knowledge of NMR and related topics in preparation for a multinuclear NMR course. The prevalence of key misconceptions is determined and areas to be addressed in subsequent teaching identified. The second part looks at the use of the TESTA (Transforming the Experience of Students Through Assessment) process to catalogue changes in the Chemistry course from 2010 to 2017. The TESTA process provides metrics to evaluate the nature of assessment and feedback processes; however, it is deficient in a key regard: the impact of assessment deadlines on student workload. Assuming an ‘ideal spherical student’, a student workload model is proposed and considered in the context of having sufficient time to participate in assessment-for-learning activities.


If you’ve seen any of my poster presentations in the past year, you’ll be familiar with some of the diagnostic test stuff. I’ve been plugging away at developing diagnostic tests for about three years now and I’m getting fairly close to happy with the question sets. I find they are very useful in informing my teaching, and a couple of changes I made this year seem to have added clarity to the responses received. Firstly, I moved the tests from paper-based (MCQ, confidence scale, free-text response for explanation) to online (MCQ, confidence scale, studied before yes/no/maybe). This allowed for automatic marking and meant I didn’t need to spend hours typing in the answers. I also changed the confidence scale from a 1 – 7 scale, with 1 being ‘not at all confident’ and 7 being ‘highly confident’, to a categorical scale. I intended this to make it easier for students to select an answer (I found myself becoming tied up in the difference between a 5 and a 6, whereas the difference between ‘neither confident nor unconfident’ and ‘a little bit confident’ seemed easier to grapple with). Nevertheless, I do employ NSS-style groupings to split the responses into low, medium and high confidence.
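The grouping step is simple enough to sketch. A minimal version, assuming a six-category scale (the labels below are my illustrative guesses, not the exact wording on the form):

```python
# Collapse categorical confidence responses into NSS-style
# low/medium/high groups. Category labels are illustrative only.

GROUPS = {
    "not at all confident":              "low",
    "not very confident":                "low",
    "neither confident nor unconfident": "medium",
    "a little bit confident":            "medium",
    "confident":                         "high",
    "highly confident":                  "high",
}

def group_responses(responses):
    """Count responses falling into each low/medium/high group."""
    counts = {"low": 0, "medium": 0, "high": 0}
    for response in responses:
        counts[GROUPS[response]] += 1
    return counts

demo = ["confident", "not at all confident",
        "a little bit confident", "highly confident"]
print(group_responses(demo))  # {'low': 1, 'medium': 1, 'high': 2}
```

Collapsing to three groups trades resolution for robustness: per-category counts get noisy quickly with small cohorts.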


The second part is based in part on the ‘Spherical Students’ post and is a consideration of a student workload model, as well as the use of the TESTA (Transforming the Experience of Students Through Assessment) process to evaluate a curriculum review. When I looked at our TESTA data from several years ago with a view to seeing how things had evolved since the modules were shiny and new, I realised that while it accounts nicely for the type and number of assessments and goes into great detail on feedback, there’s little about the timing of assignments. This, combined with my ongoing bewilderment about how more ‘active’ forms of learning requiring not insubstantial amounts of pre-sessional activity should be accounted for in module proposals (where hours are broken down) and in timetabling (if you’ve got 8 lectures a week, can every lecture come with an hour of prep?), has led me to start reading about student workload models. It’s fascinating stuff and seems fairly under-researched in HE. One thing is obvious: it’s really difficult to pin a meaningful metric on student workload, and methods vary. They include word counts and proxy word counts (e.g. a 15 credit module is 4000 – 5000 words), which obviously don’t work that well for calculations and the like, and time on task, which can be objective (how much time should be spent) or subjective (how much time is spent). This is further complicated by the idea that researching and writing 1000 words at FHEQ level 4 (1st year UK uni) is very different from researching and writing 1000 words at FHEQ level 6 (3rd year UK uni). An effective student workload model must make some allowance for the difficulty of the material, but really should be much richer than that. So far I’ve only got as far as figuring out that where we put our deadlines really matters in whether students appear overloaded in any given week, but I’ve got more reading and thinking and modelling to do on this one.
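The deadline-timing point can be illustrated with a toy model. This is a sketch of my own, not the TESTA methodology or the actual workload model: it assumes each assessment’s estimated time-on-task is spread evenly over the weeks leading up to its deadline, and flags weeks above an arbitrary threshold.

```python
# Toy deadline-driven workload model: spread each assessment's
# hours evenly across the weeks before its deadline, then sum
# per week. All numbers below are illustrative assumptions.

from collections import defaultdict

def weekly_load(assessments, weeks=12):
    """assessments: list of (deadline_week, total_hours, weeks_of_work)."""
    load = defaultdict(float)
    for deadline, hours, span in assessments:
        start = max(1, deadline - span + 1)
        per_week = hours / (deadline - start + 1)
        for week in range(start, deadline + 1):
            load[week] += per_week
    return {w: load[w] for w in range(1, weeks + 1)}

# Hypothetical module: essay due week 6 (10 h over 3 weeks),
# lab report due week 6 (6 h over 2 weeks), test week 10 (4 h over 2 weeks).
demo = [(6, 10, 3), (6, 6, 2), (10, 4, 2)]
profile = weekly_load(demo)
overloaded = [w for w, h in profile.items() if h > 6]  # arbitrary 6 h/week flag
print(overloaded)  # weeks 5 and 6 exceed the threshold
```

Even this crude version shows the shape of the problem: two deadlines landing in the same week (or adjacent weeks) produce a spike that the per-module hours breakdown never reveals.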

As it was about work in progress, I’m not uploading the slides. I got some really good questions at the end and need to think some more about some of the issues.

Spooky Slimey Vampire Science

Last Saturday we did the ‘Spooktacular’ science outreach event at Keele University Hub. If my recollections are correct, it’s the 6th Spooktacular I’ve taken part in and the 5th where I’ve designed and run activities with the help of student volunteers. The audience is families with wee kids, typically primary aged so there has to be lots of seasonal fun and wee bits of science. These events are extremely worthwhile because:

(a) the whole family can attend and so parents and other adults can get involved with the science bit too, helping with the tasks and hearing the explanations.

(b) it allows undergraduates, postgraduates and academics to run activities in a marketplace format and hence test out their science communication skills in a dynamic and changing environment before (for example) taking on a larger project such as going into a school.

(c) it’s fantastic publicity for the institution and cements the role of the university as part of the community. It is a generally positive experience for attendees and breaks down barriers about what a university is.

I started out with one planned activity but then got rather a lot of student volunteers so came up with a second. Here’s what we did and, importantly, why.

  1. Slime (well it’s Hallowe’en, you can’t not!).

Since the whole PVA-borax fiasco that’s resulted in a lot of recipes for PVA slime using contact lens solution with boron based buffers being bandied about, we changed our normal approach to slime to alginates. In the past we’ve had mixed success with them, primarily because the interaction was a little on the short side. This time:

  • measure 20 mL of distilled water into a plastic cup
  • add one spatula of alginate powder
  • stir with lolly stick (highly sophisticated scientific labware)

Now, it takes a fair bit of time to dissolve – a couple of minutes – and while this can be challenging for the youngest participants, it provides a good opportunity to talk about the science of what’s going on.

  • add food dye and glitter to the now thick solution.
  • Using a large small pipette (a comedy description, but this year I bought kids’ science pipettes from Amazon that are around 5 mL and have a larger bulb than the usual disposable 3 mL plastic pipettes), add the mixture to calcium chloride solution.
  • Make worms/beads/lumpy aggregates and fish from the solution using large forceps (I bought kids’ forceps as metal lab ones are often a little difficult for wee hands). Prod, poke and generally play.


  2. Mission Starlight…Vampire Edition

I have a backpack in my office from Summer 2016, when we took the RSC’s global experiment ‘Mission Starlight’ to a summer science event. That was when we had the UV-active beads on teddies and we were protecting teddy from the sun and talking sun safety with visitors. The UV beads change colour in sunlight, and we provide various cloths, films and sunscreens to build the perfect UV protection suit. For Hallowe’en we adapted it to protect vampires from sunlight – what would a vampire wear to avoid the sun? I bought some rubber bats and put the UV beads on them. It worked pretty well, but one drawback is that it involves the participants going in and out of the room, which can be alarming for parents and guardians, particularly if the room is busy. We also had some UV-emitting LEDs which came attached to pens as part of an ‘invisible ink’ set: UV-active dye in the pen, no obvious marks, shine the LED at it…you get the idea. In any case, those worked well when the sun went in a wee bit.

Mission Starlight was one of the best RSC Global Experiments in my opinion because there is no clean-up, there are few consumables (the fabrics can be reused), and it’s dry to store. I’m in favour of outreach being a bit more sustainable on the limited-waste front. Other experiments, such as the polymer hydrogels and the vitamin C in foodstuffs, required quite a bit of wet prep and disposal. I need to find more activities like this, although I’ll note that not pre-dissolving the alginates did cut down on prep time significantly and allowed the participants the chance to dye their own alginate.

Next on the outreach calendar is our regional Top of the Bench Heat. I’ve got the practical challenge sorted (shhhh! Top secret until after!) but need to figure out the written challenge.

Automated Marking of Tests/Quizzes

I’ve been dabbling in the automatic marking of tests and quizzes for several years now. By this, I mean a web-based set of questions that a student completes on a specific topic, which automatically grades the answers as correct/incorrect (and sometimes gives partial credit) and returns the mark (sometimes with feedback) to the student at a specific time. Before you think ‘oh, this is good – no marking!’, I’d warn you about the setup burden.

What kinds of assignments make really good auto-tests? I have used them for the following:

  • pre-laboratory exercises that give practice at calculations, some safety aspects, and identifying products/balancing equations (Blackboard Test/Pool)

  • online safety quiz  (Blackboard Test/Pool)

  • assessed tutorial with a very tight deadline before the exam  (Blackboard Test/Pool)

  • referencing and academic conduct test  (Blackboard Test/Pool)

  • diagnostic test (new for 2017! Google Form Test)

The technology has limitations, particularly related to the type of questions you can ask. I find the following types useful:

  • multiple choice questions

  • calculated numeric [with the caveat that Blackboard can’t cope with both 5.5 and 5,5 as decimal formats, nor with units or numbers of decimal places]

  • fill in the blank or short answer [with the caveat that students often can’t spell (even when given a list of words to select their answer from), and sometimes deducing the syntax of the required answer is tricky]

  • matching pairs [really good for reactant/product equation matching in transition metal redox chemistry]

I also like the ability to write a pool of questions and a system that allows each student to be asked a number of questions from the pool. If every question is from a different pool, this reduces the scope for collusion. An example of a good pool question stem for a calculated numeric question:

Calculate the mass of copper sulfate required to prepare a [Y] molar aqueous solution in a [X] mL volumetric flask. 

You can see how simple it is to vary X and Y within the realms of possibility, generate all the correct answers in Excel and make a pool of questions.
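The Excel step can equally be scripted. A hedged sketch of generating the answer key for the question stem above – the X/Y ranges are my own illustrative choices, and I’ve assumed anhydrous copper(II) sulfate (159.61 g/mol); swap in 249.69 g/mol if the lab uses the pentahydrate:

```python
# Generate (X, Y, answer) rows for the pooled calculated-numeric
# question. Molar mass and the X/Y values are illustrative assumptions.

M_CUSO4 = 159.61  # g/mol, anhydrous copper(II) sulfate

def required_mass(volume_ml, conc_mol_per_l, molar_mass=M_CUSO4):
    """Mass (g) of solute needed for a conc_mol_per_l solution in volume_ml."""
    moles = conc_mol_per_l * volume_ml / 1000.0
    return moles * molar_mass

# Each row: (X = flask volume in mL, Y = concentration in mol/L, answer to 3 d.p.)
pool = [(x, y, round(required_mass(x, y), 3))
        for x in (100, 250, 500)      # flask sizes
        for y in (0.05, 0.1, 0.2)]    # concentrations

print(pool[0])  # (100, 0.05, 0.798)
```

Nine variants from two small ranges; a spreadsheet does the same job, but a script makes it painless to regenerate the pool if the molar mass or rounding convention changes.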

The setup burden is how long it takes to create the initial test. As a rough guide, I’d estimate it to be at least twice as long as it would take to mark manually! So for a pre-lab done by 50 students, taking me 10 hours to mark, I’d expect to spend about 20 hours developing the online version. I do not recommend doing the online test thing unless you know you can use it for at least 2 years – one reason for doing it is to reduce the marking load, and you don’t really start to make gains until the 3rd year of running. On the other hand, it’s a convenient way to shift time from the semester (marking time) into quieter times of the year (prep time). I estimate that each test requires 1 – 2 hours of tweaking and set-up each year, usually after reviewing the analytics from Blackboard, weeding out poorer questions, adding a couple of new ones…that sort of thing.

Why do I do this? Well each of the assignments I’ve outlined are reason enough in themselves, but some have transitioned from paper-based to online (pre-labs, diagnostic test) and some would not exist if they could not be online (safety, referencing, academic conduct, assessed tutorial). So sometimes there is no reduction in marking time for me because I wouldn’t offer the assignment in an alternative manner. Technology facilitates the use of formative tests to aid learning, so I use it.

This year I’m expanding my range of formative tests to transfer my 1st year spectroscopy ‘drill’ questions into an online format. When teaching things like the equations of light, basic NMR, IR etc, I recognize the value in doing lots of examples. I also recognize the value in those examples stepping up in difficulty every so often, I’ve been calling them levels.

For example, using the equation E = hν

Level 1 – calculation of E in J, with ν in Hz

Level 2 – calculation of E in kJ/mol

Level 3 – calculation of E in kJ/mol with ν in ‘insert standard prefix here’ Hz

Level 4 – calculation of ν with energy in J

Level 5 – …

You get the idea anyway.  I read a paper on this a few years back, about stepping up calculations in small steps.  So I’m making question pools for each level, bundling a few levels together into a quiz then setting a minimum score requirement to gain access to the next levels. Students will do quiz 1 and if their mark is high enough (80%+) they get access to quiz 2. If it isn’t, they’ll get access to a couple of worked examples and the chance to re-do quiz 1 to get the mark.
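The levels for E = hν can be sketched in a few lines. The function names and level assignments below are mine, not from the actual question pools; the constants are standard values:

```python
# Levelled E = hν drill calculations, one function per 'level'.
# Naming and level boundaries are illustrative, not the real pools.

H = 6.626e-34    # Planck constant, J s
N_A = 6.022e23   # Avogadro constant, /mol

def energy_J(freq_hz):               # Level 1: E in J from ν in Hz
    return H * freq_hz

def energy_kJ_per_mol(freq_hz):      # Level 2: per-mole energy in kJ/mol
    return energy_J(freq_hz) * N_A / 1000.0

PREFIX = {"k": 1e3, "M": 1e6, "G": 1e9, "T": 1e12}

def energy_kJ_per_mol_prefixed(value, prefix):  # Level 3: prefixed Hz
    return energy_kJ_per_mol(value * PREFIX[prefix])

def freq_Hz(energy_j):               # Level 4: rearranged, ν = E/h
    return energy_j / H

# e.g. a 500 THz photon (visible light):
print(round(energy_kJ_per_mol_prefixed(500, "T"), 1))  # 199.5 kJ/mol
```

Each level adds one wrinkle (unit conversion, SI prefix, rearrangement), which is exactly the stepping-up-in-small-increments idea behind the quizzes.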

I’m aware that this type of drill enforces a purely algorithmic approach, but if my students can’t do these bits, they are going to run into a whole lot of problems at higher levels. When setting exam questions, I balance the question between the algorithmic problem solving stuff like performing a calculation, and the ‘explain your answer’ part where they need to demonstrate a degree of understanding. We can argue over the appropriate balance between those sections but I think the algorithmic stuff should be 40 – 60% of the marks available (depending on the level of the paper) and the balance should be the explanation stuff, or higher level problem solving such as unseen, unfamiliar problem types. With this balance I’m saying ‘well you can probably pass the exam if you can do the basics but you need to show more to get a great mark’.  I also assume that intended learning outcomes define a passing mark (40%) or a low 2:2 mark (50%), rather than a 100% mark.

The experience of setting up and running the diagnostic test through Google Form Tests requires a post of its own, so I’ll come back to that.