There is something about teaching that makes you a better researcher. I know this is very countercultural wisdom, but I believed it all along. Luria, Magasanik, and Levinthal all believed it. Levinthal and Luria both had a very strong influence on me in this regard.
An (old) interview with David Botstein in PLoS Genetics. Link
At least we are spared the ‘research led teaching’ mission statements.
This was a quote from an article by an ex-lawyer who got into tech, and into writing about tech. Now some of my best friends are lawyers, but this chimed with something I came across by Benedict Evans on ‘why you must pay sales people commissions’. The article is here (the video no longer plays for me).
The opening quote poses a question:
I felt a little odd writing that title [ why you must pay sales people commissions]. It’s a little like asking “Why should you give engineers big monitors?” If you have to ask the question, then you probably won’t understand the answer. The short answer is: don’t, if you don’t want good engineers to work for you; and if they still do, they’ll be less productive. The same is true for sales people and commissions.
The argument is as follows:
Imagine that you are a great sales person who knows you can sell $10M worth of product in a year. Company A pays commissions and, if you do what you know you can do, you will earn $1M/year. Company B refuses to pay commissions for “cultural reasons” and offers $200K/year. Which job would you take? Now imagine that you are a horrible sales person who would be lucky to sell anything and will get fired in a performance-based commission culture, but may survive in a low-pressure, non-commission culture. Which job would you take?
But the key message for me is:
Speaking of culture, why should the sales culture be different from the engineering culture? To understand that, ask yourself the following: Do your engineers like programming? Might they even do a little programming on the side sometimes for fun? Great. I guarantee your sales people never sell enterprise software for fun. [emphasis mine].
Now why does all this matter? Well personally, it still matters a bit, but it matters less and less. I am towards the end of my career, and for the most part I have loved what I have done. Sure, the NHS is increasingly a nightmare place to work, but it has been in decline most of my life: I would not recommend it unreservedly to anybody. But I have loved my work in a university. Research was so much fun for so long, and the ability to think about how we teach and how we should teach still gives me enormous pleasure: it is, to use the cliche, still what I think about in the shower. The very idea of work-life balance was — when I was young and middle-aged at least — anathema. I viewed my job as a creative one, and building things and making things brought great pleasure. This did not mean that you had to work all the hours God made, although I often did. But it did mean that work brought so much pleasure that the boundary between my inner life and what I got paid to do was more apparent to others than to me. And in large part that is still true.
Now in one sense, this whole question matters less and less to me personally. In the clinical area, many if not most clinicians I know now feel that they resemble those on commission more than the engineers. Only they don’t get commission. Most of my med school year who became GPs will have bailed out. And I do not envy the working lives of those who follow me in many other medical specialties in hospital. Similarly, universities were once full of academics who you almost didn’t need to pay, such was their love for the job. But modern universities have become more closed and centrally managed, and less tolerant of independence of mind.
In one sense, this might go with the turf — I was 60 last week. Some introspection, perhaps. But I think there really is more going on. I think we will see more and more people bailing out as early as possible (no personal plans here), and we will need to think and plan for the fact that many of our students will bail out of the front line of medical practice earlier than we are used to. I think you see the early stirrings of this all over: people want to work less than full-time; people limit their NHS work vis-à-vis private work; some seek administrative roles in order to minimise their face-to-face practice; and even young medics soon after graduation are looking for portfolio careers. And we need to think about how to educate our graduates for this: our obligations are to our students first and foremost.
I do not think any of these responses are necessarily bad. But working primarily in higher education has one advantage: there are lots of different institutions, and whilst in the UK there is a large degree of groupthink, there is still some diversity of approach. And if you are smart and you fall outwith the clinical guilds / extortion rackets, there is no reason to stay in the UK. Medics, especially recent graduates, need to think more strategically. The central dilemma is that, depending on your specialty, your only choice might appear to be to work for a monopolist — one which seeks to control not so much the patients cradle-to-grave, as those staff who fall under its spell, cradle-to-grave. But there are those making other choices — just not enough, so far.
An aside. Of course, even those who have achieved the most in research do not always want to work for nothing post retirement. I heard the following account first hand from one of Fred Sanger’s former post-docs. The onetime post-doc was by then a senior professor, charged with opening and celebrating a new research institution. Sanger — a double Laureate — would be a great catch as a speaker. All seemed well until the man who personally created much of modern biology realised the date chosen was a couple of days after he was due to retire from the LMB. He could not oblige: the [garden] roses need me more!
Incarceration as the educational business model. Medium
This year the University of Edinburgh plans to become one of the first big European universities to launch a blockchain course. Aggelos Kiayias, chair in cyber security and privacy and director of the blockchain technology laboratory at the university, says: “Blockchain technology is a recent development and there is always a bit of a lag as academia catches up.”
What interests me is how we think about all the things that universities do first. And why and how we lose that advantage for our students.
Terrific interview with Alan Kay. Familiar memes, but I do not tire of them.
The business of a university is to help students learn contexts that they were unaware of when they were in high school.
His use of the word context encompasses intellectual creations such as reading, writing, printing etc. His oft-quoted quip: a change in context is worth 80 IQ points.
Nice piece in ‘Science’ with the title: ‘No easy answers: What does it mean to ask whether a prekindergarten math program “works”?’ Geoff Norman, many years ago, used the term RCT in the context of medical education to stand for Randomised, Confounded and Trivial. Research into what works and what does not work in education is hard, and most studies (IMHO) fail to inform. Education isn’t a product like a drug is, and gee, it is hard to demonstrate when and where most drugs will work if you do not have an understanding of the biology or large effects to play with, and the outcomes need to be measured over the long term.
I think about this a lot, but have no easy rules to guide action. Which is, of course, exactly the problem.
At the risk of raising the ire of many researchers, I should note that I am not basing my assessment on the rapid growth in educational neuroscience. You know, the kind of study where a subject is slid into an fMRI machine and asked to solve math puzzles. Those studies are valuable, but at the present stage they provide at best tentative clues about how people learn, and little specific in terms of how to help people learn. (A good analogy would be trying to diagnose an engine fault in a car by moving a thermometer over the hood.) One day, educational neuroscience may provide a solid basis for education the way, say, the modern theory of genetics advanced medical practice. But not yet.
Keith Devlin, talking sense — again. I want to believe the rest of the article but worry it may not be so. But it contains some gems:
Classroom studies invariably end up as studies of the teacher as much as of the students, and often measure the effect of the students’ home environment rather than what goes on in the classroom.
This just adds to the problem that Geoff Norman (DOI 10.1007/s10459-016-9705-6) and others have talked about in course evaluations, namely that many studies — even accepting the limitations outlined above — are riddled with pseudoreplication.
What is missing is any insight into what is actually going on in the student’s mind—something that can be very different from what the evidence shows, as was dramatically illustrated for mathematics learning several decades ago
But, like many outwith medicine, I think he puts too much store by the robustness of the RCT approach — even with digital tools to allow large scale measurement. RCT: ‘randomised, confounded and trivial’, as has been said before (Norman).
I have been busy updating some teaching stuff. It is never finished but there is time for a little pause. I have completed all the SoundCloud audio answers to the questions in ed.derm.101 (Part C) and there is a ‘completed’ version of ed.derm.101 Part C half way down the linked page. Not all the links have been checked, and a lot had to be redone because the superb New Zealand Dermnet site changed their design (the best source of dermatology images, IMHO). An example of the sort of audio material is below.
Interesting post from Tony Bates on the history of distance learning, and the University of London External Programme, which started in 1828.
Unfortunately I have no knowledge of the individuals who originally created the University of London External Programme back in 1828. It’s a worthy research project for anyone interested in the history of distance education.
I was once (mid-1960s) a correspondence tutor for students taking undergraduate psychology courses in the External Programme. In those days, the university would publish a curriculum (a list of topics) and provide a reading list. Students could sit an exam when they felt they were ready. Students paid tutors such as myself to help them with their studies. I would find old exam papers for the course, and set questions for individual students, and they would send me their answers and I would mark them. Many students were in British Commonwealth countries and it could take weeks after students sent in their essays before my feedback eventually got back to them. Not surprisingly, in those days completion rates in the programme were very low…
But I am fascinated by (and was ignorant of) the following:
Note though that teaching and examining in the original External Programme were disaggregated (those teaching it were different from those examining it), contract tutors separate from the main faculty were used, and students studied individually and took exams when ready. So many of the ‘new’ developments in distance education, such as disaggregation, self-directed learning, and many of the elements of competency-based learning, are in fact over 150 years old.
I have added some more SoundCloud answers and added and sorted the links in Part C Chapters 5,6,7. Getting there.
I have posted some new audio SoundCloud answers to questions from the first three chapters of ed.derm.101 Part C.
“Investment firm GSV Advisors recently estimated the annual global outlay on education at $5.5 trillion and growing rapidly. Let that number sink in for a second—it’s a doozy. The figure is nearly on par with the global health care industry, but there is no Big Pharma yet in education. Most of that money circulates within government bureaucracies.”
After a spell as a lecturer and reader at the LSE, he returned to his East End roots at the economics department at Queen Mary College, recruiting an impressive roster of academics and students to its venue in a former biscuit factory on Bow Road.
He was known for giving chances to mavericks: if a headteacher warned of a student’s “difficult” nature, Peston would normally take them on.
From an obituary of Lord Peston in the FT.
This brought to mind something in Craig Venter’s excellent autobiography, A Life Decoded. He described how he was so busy doing science — and publishing — as a student that he failed some mandatory graduate exams. The faculty had to ‘invent’ an appropriate exam for him — which of course he passed. They obviously didn’t have to deal with the QAA or GMC.
Education is not just about means; it is about maintaining intellectual diversity. We have to be concerned about variation, too. It is all too easy to concentrate on minimum standards or pass marks without considering whether what we are doing harms those whom, as a society, we are most in need of.
A new welcome video for our Edinburgh Medical School dermatology module. Judging by the gesticulation, I must have some Italian blood in me (or so somebody tells me).
Just because some colleagues asked:
This is another useful talk from HILT. One problem that bugs me with both undergraduate teaching and learning, and with clinical expertise once you are qualified, is the dynamic between measures of competence at a defined time point and the influence of exposure on the pattern of competence over time. People often assert that because you have some skill at timepoint X, somehow ‘clinical exposure’ will maintain that skill over time (I am not talking about revalidation here, simply because most accept that revalidation, as currently construed, is not credible). However, keeping knowledge accessible is not just a function of formal learning, but of how often you encounter particular clinical problems. The dynamics are worth thinking about. To maintain competence for rare disorders, you must encounter them at a certain rate, and it seems to me possible that ‘routine clinical practice’ may not provide enough encounters for this competence to be maintained. [So, the rate required to maintain competence is greater than the rate at which you might routinely encounter the problem.] This is of course why we might use simulation, or attend clinical meetings, or spend much of our life talking about ‘cases’. If you want high-level competence, you have to control the ratio of mundane to advanced case-mix. This is one of the reasons you need a hierarchy, and why a consultant-delivered service is not compatible with high-level clinical competence (yes, I am skipping over a formal proof here!). Some of this is at least tangentially related to this video by Robert Bjork (he of ‘desirable difficulty’). It is not exactly the rugby training mantra of no gain without pain, but something cognate; and it suggests that our tacit yesteryear views of competence are being destroyed.
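To make the dynamics concrete, here is a toy model entirely of my own framing (not from the talk): skill decays exponentially between clinical encounters, and each encounter closes a fixed fraction of the gap back to full skill. The half-life, the ‘boost’ fraction, and the encounter rates are all invented for illustration.

```python
# Toy model: what skill level does a given encounter rate sustain?
# All parameters are illustrative, not empirical.

import math

def steady_state_skill(encounters_per_year, half_life_years=1.0, boost=0.5):
    """Approximate long-run skill level under periodic clinical exposure."""
    decay = math.log(2) / half_life_years   # exponential forgetting rate
    gap = 1.0 / encounters_per_year         # years between encounters
    retention = math.exp(-decay * gap)      # fraction of skill kept between encounters
    s = 0.5
    for _ in range(200):                    # iterate decay-then-refresh to a fixed point
        s = s * retention                   # skill fades between encounters
        s = s + boost * (1.0 - s)           # each encounter closes part of the gap
    return s

# A disorder seen monthly supports a much higher plateau than one seen yearly:
print(steady_state_skill(12), steady_state_skill(1))
```

On these made-up numbers, the monthly plateau sits well above the yearly one, which is the bracketed point in the text: if the routine encounter rate is too low, no amount of initial competence keeps the plateau above threshold, and you need simulation, meetings, or case discussion to top the rate up.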
From an article in today’s NYT
Last year an Interlude video of Bob Dylan’s “Like a Rolling Stone,” which let viewers flip through a fictional TV wasteland — infomercials, game shows — in which actors mouthed Mr. Dylan’s lyrics, got more than 70 million views. Recently, an interactive video for “Stayin Out All Night” by the rapper Wiz Khalifa, a Warner artist, was viewed 3.8 million times, while a conventional version on YouTube got only 3.6 million views. “This Interlude technology is game-changing,” Mr. Khalifa said in a statement. “I’m very glad to be at the forefront.” For Warner, as well as for advertisers that have begun to use Interlude, the appeal of the technology lies in how it lures people to be more active viewers. According to Mr. Bloch, the company’s chief executive, 90 percent of Interlude’s music video viewers make choices while watching (videos will play even if a viewer does nothing). A more engaged audience yields higher ad rates.
Well, I haven’t sampled (no pun intended) the technology, but the key point is familiar to anyone who knows anything about how students learn: they have to engage, and the more effort they have to put in to any teaching session, the more they will learn. Remember Robert Bjork’s phrase: ‘desirable difficulty’ in learning.
This is a nice post about a talk by Audrey Watters on how the dream of ‘the’ portal hasn’t died (unfortunately). What do people think links are for? What do people think a web is?
I like Gerd Gigerenzer’s writings (see for instance The Empire of Chance and Simple Heuristics that make us smart) and I am sure it is not his fault that the same stories keep coming round again and again. This story on the BBC web site treads over old ground but of course the lessons remain the same (even if the book is different). Doctors don’t like working with Bayes’ theorem in clinic — at least not if we have to use algebra rather than real numbers (as Gigerenzer makes clear). And I still think we do a poor job of teaching medical students statistics. But something niggles me about his line of argument, and in part it is not a million miles away from some of Gigerenzer’s other work on heuristics and ‘quick and dirty’ computation.
One view of expertise is that doctors somehow work from ‘basic principles’ and then work out what to do. This used to be the dominant view of medical expertise: we had to understand the physiology, so that we had a live model in our brain of what was happening to the patient. This may well be true in some instances, but more often it seems to me that the burden of knowledge needed to do this is so great that we just follow simple shortcuts or heuristics — or we read it off look-up charts. I actually think this is sensible. We don’t need to fret about the molecules, just as I don’t need to worry about machine code or C when I write this blog. What Gigerenzer is drawing attention to is the absence of the relevant cognitive prosthesis that takes care of the number crunching for us. Of course, if the prosthesis existed, we would play with it, and actually become more at ease with the algebra.
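As a toy illustration of the number crunching such a prosthesis would hide, here is the standard screening-test example in Gigerenzer’s natural-frequencies style. The prevalence and test figures below are invented for illustration, not taken from his work.

```python
# Positive predictive value two ways: Bayes' algebra vs. natural frequencies.
# Illustrative numbers only: 1% prevalence, 90% sensitivity, 9% false positive rate.

prevalence = 0.01
sensitivity = 0.90
false_positive_rate = 0.09

# Bayes' theorem, the algebra doctors reportedly dislike:
ppv_bayes = (sensitivity * prevalence) / (
    sensitivity * prevalence + false_positive_rate * (1 - prevalence)
)

# Natural frequencies: imagine 1000 patients and count heads.
patients = 1000
sick = prevalence * patients                          # 10 people have the disease
true_pos = sensitivity * sick                         # 9 of them test positive
false_pos = false_positive_rate * (patients - sick)   # ~89 healthy people also test positive
ppv_freq = true_pos / (true_pos + false_pos)

print(f"P(disease | positive test) = {ppv_bayes:.3f}")  # ≈ 0.092: most positives are false alarms
```

Both routes give the same answer, but the counting version is the one people can hold in their heads — which is exactly Gigerenzer’s point about representation doing the cognitive work.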
There are some tremendous textbooks, Molecular Biology of the Cell, to quote an example, but many are dull. I understand a little about the business of textbook production, and change seems long overdue. My own efforts are of course very humble, but I am working on improving things. This graph does not attest to much innovation: more Eroom’s law than Moore’s law.
“it’s a worrying sign for philosophy in the academy. Someone who’s very good at conveying complex philosophical ideas in plain English– a good teacher, in other words – has come to the conclusion that a university is not the best place for him to be”.
Interview with Nigel Warburton: http://philosophypress.co.uk/?p=1159
Professor Cooper described a “very special pedagogy” at the university, based on face-to-face teaching, “high contact hours” and “intensive problem-solving”.
A very novel strategy for a University: http://www.timeshighereducation.co.uk/news/regents-park-place-for-a-leader-who-can-command-respect/2012123.article
The situation was a familiar one. Some time back, I was gossiping with a medical student, and he began to talk about some research he had done, supervised by another member of faculty. I asked what he had found out: what did his data show? What followed, I have seen if not hundreds of times, then at least on several score occasions. A look of trouble and consternation, a shrug of embarrassment, and the predictable word-salad of ‘significance’, t values, p values, statistics and ‘dunno’. Such is the norm. There are exceptions, but even amongst postgraduates who have undertaken research the picture is not wildly different. Rarely, without directed questioning, can I get the student to tell me about averages or proportions, using simple arithmetic. A reasonable starting point, surely. ‘What does it look like if you draw it?’ is met with a puzzled look. And yet, if I ask the same student how they would manage psoriasis, or why skin cancers are more common in some people than others, I get — to varying degrees — a reasoned response. I asked the student how much tuition in statistics they had received. A few lectures was the response, followed by a silence, and then: “They told us to buy a book”. More silence. So this is what you pay >30K a year for? The student just smiled in agreement. This was a good student.
Statistics is difficult. Much statistics is counter-intuitive and, like certain other domains of expertise, learning the correct basics often results in a temporary —or in some cases a permanent —drop in objective performance.** That is, you can make people’s ability to interpret numerical data worse after trying to teach them statistics. On the other hand, statistics is beautiful, hard, and full of wonderful insights that debunk the often sloppy thinking that passes for everyday ‘common sense’. I am a big fan, but have always found the subject anything but easy. But, like a lot of formal disciplines, the pleasure comes from the struggle to achieve mastery. I also think the subject important, and for the medical ecosystem at least, it is critical that there is high level expertise within the community. On the other hand, in my experience many of the very best clinicians are (relatively) statistically illiterate. The converse is also seen.