June and July are the busiest time of year for me: it is when I update all my teaching material, and I always underestimate how long it will take. Here is a guide to some of it. I still need to catch up on more, as the new students have already started.
The title above and quotes below are from this article by Lincoln Allison. To create teaching machines, you need to make teaching so bad that even the machines can do it. We are almost there.
The most particular annoyance for me was the doubling of seminar size from nine to 18 – allegedly to free up time for research. As if anyone is going to develop the capacity for original thought because they have two or three more hours available in the week! To some of my colleagues, this was merely a technical change, but to me it was the abolition of the real seminar, the thing we should have been most proud of in the English university system.
It was part of a general deprioritising of teaching. I remember a colleague looking at her extremely poor ratings on student “feedback” and remarking gaily: “I’m really not very good at this, am I?” She had just had a book published that was extremely well received, and she couldn’t care less that she was failing in her core duties to communicate her ideas within an academic community. Her remark stiffened my resolve to leave – especially once students picked up the vibe about the level of staff interest in teaching and became less challenging and more instrumental.
Much of what I have seen and heard of UK universities in the 14 years since I retired seems to relate to what I would consider proper university teaching about as much as “value” tinned food relates to fresh food. And I think that just as there are people who have never tasted fresh food, there are people who have not experienced real lectures and seminars.
This article from the Economist is much more nuanced than you might think — especially about all the benefits of attending school that are unrelated to what specifically goes on in the classroom. But if you couple it with a close reading of Bryan Caplan’s ‘The Case Against Education’ then it is hard not to feel that the academy has been guilty of failing to check out its own entrails before passing judgement on everybody else’s.
It sounds like a counsel of despair. If every child went to school, millions more would sit in woeful, boring classrooms. But while this sounds awful, it would probably still be good for them, their families and broader society. For, as Justin Sandefur of CGD points out, there is plenty of evidence that even when children do not learn much at school, they still do better for having gone.
Some benefits are economic. Attending school for longer is associated with earning more in later life, in part because those with additional schooling are more likely to get non-agricultural jobs and move to cities. This may indicate that young people are in fact learning something useful at school that is not being picked up by researchers. But it could also be a signalling effect: a shopkeeper may prefer workers who stayed at school for at least five years.
One is simply that if girls are at school they are not having sex at home
Besides, “university league tables are like sausages: the more you know about how they are made, the less you want to [do with] them”.
“Research was structurally unprofitable even if you scored really well in the research excellence framework,” he claims. “It’s being financed by surpluses on taught master’s. I think that’s fine because part of the reason people came on the taught programmes was because the place was very highly ranked in research, and they thought they were going to be sitting at the feet of the best economists around. Academics had to understand the dynamic and deliver the teaching because that was what was paying for the research. Yet because of the history of underfunding [undergraduate] students [before the introduction of £9,000 tuition fees in 2012], a kind of mood gained ground in British universities that [all] students were an unprofitable activity.
“What were your most memorable moments at university?”
“There was a man called Walter Ullmann who taught medieval critical philosophy at 10am – and there was standing room only. I went every week, regardless of how wasted I’d got the night before, because he was brilliant.” THE
Reminds me of people queueing to get in to listen to Isaiah Berlin. Standing room only has some merit as a metric. (Until H&S arrive.)
PS. And, for another example, see this from a recent book review of a biography of Enrico Fermi (The Last Man Who Knew Everything: The Life and Times of Enrico Fermi, Father of the Nuclear Age. By David Schwartz).
[The author] interviewed many of Fermi’s students and colleagues, shedding light also on Fermi the educator (his lectures were so renowned that even notes taken by his assistants were a bestseller).
I am not a big fan of lectures. The single best piece of advice I received at medical school was not to attend. I therefore skipped lectures for three years (although I got the handouts). It is not that all lectures are bad; they are not. It is just that often they are used for ‘content delivery’, much as we think about delivery of a takeaway. They are ill suited to this role, now that we can write and distribute text cheaply. Good lectures serve a different purpose, but you don’t need too many of them and, in my experience of medicine, there are very few people who lecture well. Lecturing well means choosing those fragments of a domain that lend themselves to this medium. Lectures are (and should be) theatre, but the theatre of the mind needs more.
By chance, I came across the following thoughts from the preface to the Ascent of Man (the TV series and the book). Bronowski understood many things, and I still marvel at how prescient his ideas were.
If television is not used to make these thoughts concrete, it is wasted. The unravelling of ideas is, in any case, an intimate and personal endeavour, and here we come to the common ground between television and the printed book. Unlike a lecture or a cinema show, television is not directed to crowds. It is addressed to two or three people in a room, as a conversation face to face – a one-sided conversation for the most part, as the book is, but homely and Socratic nevertheless. To me, absorbed in the philosophic undercurrents of knowledge, this is the most attractive gift of television, by which it may yet become as persuasive an intellectual force as the book.
The printed book has one added freedom beyond this: it is not remorselessly bound to the forward direction of time, as any spoken discourse is. The reader can do what the viewer and the listener cannot, which is to pause and reflect, turn the pages back and the argument over, compare one fact with another and, in general, appreciate the detail of evidence without being distracted by it.
Then there was PowerPoint and lecture capture.
Luis von Ahn, somebody who has changed the world on more than one occasion, has also been awarded a teaching award from his own university. Take his tips seriously 😉
There is something about teaching that makes you a better researcher. I know this is very countercultural wisdom, but I believed it all along. Luria, Magasanik, and Levinthal all believed it. Levinthal and Luria both had a very strong influence on me in this regard.
An (old) interview with David Botstein, in PloS genetics. Link
At least we are spared the ‘research led teaching’ mission statements.
This was a quote from an article by an ex-lawyer who got into tech and writing about tech. Now, some of my best friends are lawyers, but this chimed with something I came across by Benedict Evans on ‘why you must pay sales people commissions’. The article is here (the video no longer plays for me).
The opening quote poses a question:
I felt a little odd writing that title [ why you must pay sales people commissions]. It’s a little like asking “Why should you give engineers big monitors?” If you have to ask the question, then you probably won’t understand the answer. The short answer is: don’t, if you don’t want good engineers to work for you; and if they still do, they’ll be less productive. The same is true for sales people and commissions.
The argument is as follows:
Imagine that you are a great sales person who knows you can sell $10M worth of product in a year. Company A pays commissions and, if you do what you know you can do, you will earn $1M/year. Company B refuses to pay commissions for “cultural reasons” and offers $200K/year. Which job would you take? Now imagine that you are a horrible sales person who would be lucky to sell anything and will get fired in a performance-based commission culture, but may survive in a low-pressure, non-commission culture. Which job would you take?
But the key message for me is:
Speaking of culture, why should the sales culture be different from the engineering culture? To understand that, ask yourself the following: Do your engineers like programming? Might they even do a little programming on the side sometimes for fun? Great. I guarantee your sales people never sell enterprise software for fun. [emphasis mine].
Now why does all this matter? Well personally, it still matters a bit, but it matters less and less. I am towards the end of my career, and for the most part I have loved what I have done. Sure, the NHS is increasingly a nightmare place to work, but it has been in decline most of my life: I would not recommend it unreservedly to anybody. But I have loved my work in a university. Research was so much fun for so long, and the ability to think about how we teach and how we should teach still gives me enormous pleasure: it is, to use the cliche, still what I think about in the shower. The very idea of work-life balance was — when I was young and middle-aged at least — anathema. I viewed my job as a creative one, and building things and making things brought great pleasure. This did not mean that you had to work all the hours God made, although I often did. But it did mean that work brought so much pleasure that the boundary between my inner life and what I got paid to do was more apparent to others than to me. And in large part that is still true.
Now in one sense, this whole question matters less and less to me personally. In the clinical area, many if not most clinicians I know now feel that they resemble those on commission more than the engineers. Only they don’t get commission. Most of my med school year who became GPs will have bailed out. And I do not envy the working lives of those who follow me in many other medical specialties in hospital. Similarly, universities were once full of academics who you almost didn’t need to pay, such was their love for the job. But modern universities have become more closed and centrally managed, and less tolerant of independence of mind.
In one sense, this might go with the turf — I was 60 last week. Some introspection, perhaps. But I think there really is more going on. I think we will see more and more people bailing out as early as possible (no personal plans, here), and we will need to think and plan for the fact that many of our students will bail out of the front line of medical practice earlier than we are used to. I think you see the early stirrings of this all over: people want to work less than full-time; people limit their NHS work vis a vis private work; some seek administrative roles in order to minimise their face-to-face practice; and even young medics soon after graduation are looking for portfolio careers. And we need to think about how to educate our graduates for this: our obligations are to our students first and foremost.
I do not think any of these responses are necessarily bad. But working primarily in higher education has one advantage: there are lots of different institutions, and whilst in the UK there is a large degree of groupthink, there is still some diversity of approach. And if you are smart and you fall outwith the clinical guilds / extortion rackets, there is no reason to stay in the UK. Medics, and recent graduates in particular, need to think more strategically. The central dilemma is that, depending on your specialty, your only choice might appear to be to work for a monopolist, one which seeks to control not so much the patients cradle-to-grave, but those staff who fall under its spell, cradle-to-grave. But there are those making other choices — just not enough, so far.
An aside. Of course, even those who have achieved the most in research do not always want to work for nothing post-retirement. I heard the following account first hand from one of Fred Sanger’s former post-docs. The onetime post-doc was by then a senior professor, charged with opening and celebrating a new research institution. Sanger — a double Laureate — would be a great catch as a speaker. All seemed well until the man who personally created much of modern biology realised the date chosen was a couple of days after he was due to retire from the LMB. He could not oblige: the [garden] roses need me more!
Incarceration as the educational business model. Medium
This year the University of Edinburgh plans to become one of the first big European universities to launch a blockchain course. Aggelos Kiayias, chair in cyber security and privacy and director of the blockchain technology laboratory at the university, says: “Blockchain technology is a recent development and there is always a bit of a lag as academia catches up.”
What interests me is how we think about all the things that universities do first. And why and how we lose that advantage for our students.
Terrific interview with Alan Kay. Familiar memes, but I do not tire of them.
The business of a university is to help students learn contexts that they were unaware of when they were in high school.
His use of the word ‘context’ encompasses intellectual creations such as reading, writing, and printing. His oft-quoted quip: ‘a change in context is worth 80 IQ points’.
Nice piece in ‘Science’ with the title: ‘No easy answers: What does it mean to ask whether a prekindergarten math program “works”?’ Geoff Norman, many years ago, used the term RCT in the context of medical education to stand for Randomised, Confounded and Trivial. Research into what works and what does not work in education is hard, and most studies (IMHO) fail to inform. Education isn’t a product like a drug is, and gee, it is hard to demonstrate when and where most drugs will work if you do not have an understanding of the biology, large effects to play with, and outcomes that need to be measured over the long term.
I think about this a lot, but have no easy rules to guide action. Which is, of course, exactly the problem.
At the risk of raising the ire of many researchers, I should note that I am not basing my assessment on the rapid growth in educational neuroscience. You know, the kind of study where a subject is slid into an fMRI machine and asked to solve math puzzles. Those studies are valuable, but at the present stage, at best they provide at most tentative clues about how people learn, and little specific in terms of how to help people learn. (A good analogy would be trying to diagnose an engine fault in a car by moving a thermometer over the hood.) One day, educational neuroscience may provide a solid basis for education the way, say, the modern theory of genetics advanced medical practice. But not yet.
Keith Devlin, talking sense — again. I want to believe the rest of the article but worry it may not be so. It does, though, contain some gems:
Classroom studies invariably end up as studies of the teacher as much as of the students, and often measure the effect of the students’ home environment rather than what goes on in the classroom.
This just adds to the problem that Geoff Norman (DOI 10.1007/s10459-016-9705-6) and others have talked about in course evaluations, namely that many studies — even accepting the limitations outlined above — are riddled with pseudoreplication.
What is missing is any insight into what is actually going on in the student’s mind—something that can be very different from what the evidence shows, as was dramatically illustrated for mathematics learning several decades ago
But, like many outwith medicine, I think he puts too much store by the robustness of the RCT approach — even with digital tools to allow large scale measurement. RCT: ‘randomised, confounded and trivial’, as has been said before (Norman).
I have been busy updating some teaching stuff. It is never finished, but there is time for a little pause. I have completed all the SoundCloud audio answers to the questions in ed.derm.101 (Part C), and there is a ‘completed’ version of ed.derm.101 Part C halfway down the linked page. Not all the links have been checked, and a lot had to be redone because the superb New Zealand DermNet site changed their design (the best source of dermatology images, IMHO). An example of the sort of audio material is below.
Interesting post from Tony Bates on the history of distance learning, and the University of London External Programme, which started in 1828.
Unfortunately I have no knowledge of the individuals who originally created the University of London External Programme back in 1828. It’s a worthy research project for anyone interested in the history of distance education.
I was once (mid-1960s) a correspondence tutor for students taking undergraduate psychology courses in the External Programme. In those days, the university would publish a curriculum (a list of topics) and provide a reading list. Students could sit an exam when they felt they were ready. Students paid tutors such as myself to help them with their studies. I would find old exam papers for the course, and set questions for individual students, and they would send me their answers and I would mark them. Many students were in British Commonwealth countries and it could take weeks after students sent in their essays before my feedback eventually got back to them. Not surprisingly, in those days completion rates in the programme were very low…
But I am fascinated by (and was ignorant of) the following:
Note though that teaching and examining in the original External Programme were disaggregated (those teaching it were different from those examining it), contract tutors separate from the main faculty were used, and students studied individually and took exams when ready. So many of the ‘new’ developments in distance education, such as disaggregation, self-directed learning, and many of the elements of competency-based learning, are in fact over 150 years old.
I have added some more SoundCloud answers and added and sorted the links in Part C Chapters 5,6,7. Getting there.
I have posted some new audio SoundCloud answers to questions from the first three chapters of ed.derm.101 Part C.
“Investment firm GSV Advisors recently estimated the annual global outlay on education at $5.5 trillion and growing rapidly. Let that number sink in for a second—it’s a doozy. The figure is nearly on par with the global health care industry, but there is no Big Pharma yet in education. Most of that money circulates within government bureaucracies.”
After a spell as a lecturer and reader at the LSE, he returned to his East End roots at the economics department at Queen Mary College, recruiting an impressive roster of academics and students to its venue in a former biscuit factory on Bow Road.
He was known for giving chances to mavericks: if a headteacher warned of a student’s “difficult” nature, Peston would normally take them on.
From an obituary of Lord Peston in the FT.
This brought to mind something in Craig Venter’s excellent autobiography, A Life Decoded. He described how he was so busy doing science — and publishing — as a student that he failed some mandatory graduate exams. The faculty had to ‘invent’ an appropriate exam for him, which of course he passed. They obviously didn’t have to deal with the QAA or GMC.
Education is not just about means, but about maintaining intellectual diversity. We have to be concerned about variation, too. It is all too easy to concentrate on minimum standards or pass marks without considering whether what we are doing harms those whom, as a society, we are most in need of.
A new welcome video for our Edinburgh Medical School dermatology module. Judging by the gesticulation, I must have some Italian blood in me (or so somebody tells me).
Just because some colleagues asked:
This is another useful talk from HILT. One problem that bugs me, in both undergraduate teaching and learning and in clinical expertise once you are qualified, is the dynamic between measures of competence at a defined time point and the influence of exposure on the pattern of competence over time. People often assert that because you have some skill at timepoint X, ‘clinical exposure’ will somehow maintain that skill over time (I am not talking about revalidation here, simply because most accept that revalidation, as currently construed, is not credible). However, keeping knowledge accessible is not just a function of formal learning, but of how often you encounter particular clinical problems. The dynamics are worth thinking about. To maintain competence for rare disorders, you must encounter them at a certain rate, yet it seems to me quite possible that ‘routine clinical practice’ may not provide enough encounters for this competence to be maintained. [That is, the rate required to maintain competence is greater than the rate at which you might routinely encounter the problem.] This is of course why we might use simulation, attend clinical meetings, or spend much of our lives talking about ‘cases’. If you want high-level competence, you have to control the ratio of mundane to advanced case-mix. This is one of the reasons you need a hierarchy, and why a consultant-delivered service is not compatible with high-level clinical competence (yes, I am skipping over a formal proof here!). Some of this is at least tangentially related to this video by Robert Bjork (he of ‘desirable difficulty’). It is not exactly the rugby training mantra of no gain without pain, but something cognate; and it suggests that our tacit yesteryear views of competence are being destroyed.
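The rate argument above can be made concrete with a toy model. The sketch below is mine, not anything from the talk: it assumes (purely for illustration) that competence decays exponentially with some half-life and that each clinical encounter restores it fully, so the lowest point reached between encounters depends only on the encounter rate. The function name and the one-year half-life are my own hypothetical choices.

```python
import math

def trough_competence(encounters_per_year, half_life_years=1.0):
    """Toy model (illustrative assumptions only): competence decays
    exponentially with the given half-life, and each clinical
    encounter restores it to 1.0. Returns the lowest competence
    reached just before the next encounter."""
    gap_years = 1.0 / encounters_per_year       # time between encounters
    decay_rate = math.log(2) / half_life_years  # exponential decay constant
    return math.exp(-decay_rate * gap_years)

# With an assumed one-year skill half-life:
# a problem seen weekly           -> trough ~0.99 (competence maintained)
# seen twice a year               -> trough ~0.71
# seen once every five years      -> trough ~0.03 (effectively lost)
print(round(trough_competence(52), 2),
      round(trough_competence(2), 2),
      round(trough_competence(0.2), 2))
```

Whatever the true shape of the forgetting curve, the qualitative point survives: once the encounter rate falls below what the decay demands, the trough drops off sharply, which is one argument for simulation and for deliberately controlling case-mix.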
From an article in today’s NYT
Last year an Interlude video of Bob Dylan’s “Like a Rolling Stone,” which let viewers flip through a fictional TV wasteland — infomercials, game shows — in which actors mouthed Mr. Dylan’s lyrics, got more than 70 million views. Recently, an interactive video for “Stayin Out All Night“ by the rapper Wiz Khalifa, a Warner artist, was viewed 3.8 million times, while a conventional version on YouTube got only 3.6 million views. “This Interlude technology is game-changing,” Mr. Khalifa said in a statement. “I’m very glad to be at the forefront.” For Warner, as well as for advertisers that have begun to use Interlude, the appeal of the technology lies in how it lures people to be more active viewers. According to Mr. Bloch, the company’s chief executive, 90 percent of Interlude’s music video viewers make choices while watching (videos will play even if a viewer does nothing). A more engaged audience yields higher ad rates…
Well, I haven’t sampled (no pun intended) the technology, but the key point is familiar to anyone who knows anything about how students learn: they have to engage, and the more effort they have to put into any teaching session, the more they will learn. Remember Robert Bjork’s phrase: ‘desirable difficulty’ in learning.