What the academy can learn from Hollywood.

  • 26/02/2014

    The NYT offers short videos most days. I enjoy watching them, partly because I am trying to produce some very humble efforts for student teaching. I am keen to learn. I am gradually finding my way through FCPX, audio recording and how to produce simple animations. One of today’s videos is about the sound studio Skywalker Sound. Some of what is said is not surprising. Most of us know how sound influences our degree of fear in scary films, and how sound and music set action in context. And yet, the degree of sophistication and invention does surprise me. Films are very complicated, giant artefacts, the result of large teams working collectively, but with a mixture of authority, vision, and emergence. Contrast this with the novel, or even the modern textbook. In the former there is a single creator at work and, accepting the need for publishers, typesetters and so on, the cast is small. Textbooks might involve more staff in their creation but, in general, I do not think most textbooks are as sophisticated or skilful as most films. A course module might not stand comparison either.

    So, what has this got to do with medical education? Well, in an earlier post on the importance of design, I described my own (middle- or late-life?) epiphany. In medicine, the idea that you just string modules together, with lecturers who have rarely sat down together, all producing their own little snippets, is no longer sensible. It is a bit like trying to make sense of a William Burroughs novel. Asking externals to come to exam boards rather than involving them in the development of course material is another reflection of a broken system. So, whilst in many disciplines an individual lecturer might produce a series of lectures, and students may indeed get used to the style, feedback and so on, for medicine I do not think this system is appropriate. Medicine is, by its nature, multimedia, but is frequently delivered by people who have little oversight of what students are supposed to know. The origins of this are of course in the apprentice system: whereas postgraduate education can follow this model to a limited degree (although the various NHSs are doing their best to destroy it), much of undergraduate medicine is still, sadly, bums on seats in lecture theatres. Depressing, given how much the kids are paying. We need the equivalents of sound teams, video teams, animators, support staff etc. And stars!

    There is always a tension between the context of particular lectures, and how teaching sessions all fit together. Much of the modern medical school curriculum is a meta-curriculum: there is what is stated in the year guides, and what actually happens. As one student painfully told me: we were told we could join the ward round, but not see the patients. What solutions are there? I do not have a ‘grand theory’ of how to change things, but here are a few (as yet poorly characterised) ideas:

    First, we need to drastically reduce the number of staff involved in teaching medical students. Apprentice-style learning occurs, but is rare, and largely confined to the final year. Exclude this for the moment, but then think how teaching compares with what a good school provides. Do all those involved in teaching know what the students are already expected to know, and what exactly the students are being examined on? My experience is that this is often not the case.
    Second, all material has to be produced in advance, and be available before any course begins. All staff, and I mean all staff who see, teach or show students anything, have to be familiar with this material. The quality control needs to include guidance on acceptable terminology and whether students have been exposed to it before. The material needs to have been peer reviewed (ideally externally), and a record of iterations kept. No longer can you just turn up with your PowerPoint, or just have somebody sit in clinic with you, without preparation.
    Third, the enemy of medical education is indeed factual overload, and irrelevant material. Much of what the students have ‘learned’, they have forgotten by the time they qualify. Many talk about factual overload as a problem, and then do their best to make it worse. Step forward the GMC. Undergraduate medics are supposed to be educated, not trained, nor terminally differentiated for one health care system on graduation: it is called medical education for a reason.
    Part of the reason for the overload is that more and more material is included as stand-alone units: much of the knowledge and expertise you expect of doctors has been expropriated from the clinical context. The students want to be doctors; everything has to be subordinated to the clinic and the patient (however broadly defined). Do not give them lectures on ethics; rather, talk about ethical decisions on patients they see, on the ward, not in a seminar.
    Fourth, consider exactly what knowledge is really foundational. Provide the materials and structure, and let the students find it when they need it. There are all sorts of exceptions here. Statistics is hard for most students but, as I have posted earlier, you will need small-group teaching to make progress there, not lectures alone. For many areas, good books and course material should suffice. Speaking as somebody who spent almost 20 years working in genetics, including presenting a seminar with Jim Watson in the audience, and yet who received only 2 hours of lectures on the topic, I believe that much of what students need, they should find out for themselves under guidance. How many people cannot learn cell biology after reading ‘Molecular Biology of the Cell’? And if they cannot, should they be at medical school? The power of ‘pull’ rather than ‘push’, in John Seely Brown’s words.

    There are a number of institutional barriers here. Student numbers are probably too large, and too many staff are involved in teaching, but at a low level of interaction. Most of these staff have not opted for a career teaching students. It is time to professionalise. I do not know how many staff a typical medical student is exposed to in their five years of undergraduate education; my guess is that it is probably in excess of 500. A number as large as this means that the whole course cannot be controlled or delivered in any meaningful sense. This is one of the reasons why the pace of change in medical education is so slow. I have been re-reading George Pickering’s book, ‘Quest for Excellence’, and what is so depressing is that in 1978 he was talking about problems that the Goodenough report had flagged up in 1944. Many of them are still there, unchanged.

    Clinical academics wear a number of hats: clinician 50% of the time, teacher, researcher, exploiter of basic discoveries, administrator etc. To what degree it was ever possible to combine these activities in a single job, I do not know (shades of Clark Kerr’s multiversity), but it certainly is not possible now. Joe Goldstein and Michael Brown, perhaps the two most iconic clinical scientists of our time, argued many years ago that the idea of cloning genes in the morning and seeing patients in the afternoon was no longer a sensible strategy (and led to fairly second-rate ‘kit’ bench science). We need to think hard about teaching in this mix too, and start looking at the relation between university workloads and clinical work. If non-clinical academics struggle with research and teaching, how on earth can you cope with an additional third duty as well? Yes, lots to think about, and to try and hone.