Marvin Minsky once quipped “Every educational reform is doomed to succeed”. He meant “with some students”.
“If I had to reduce all of educational psychology to just one principle, I would say this: The most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly.” (Ausubel, 1968, p. vi). Via Dylan Wiliam.
When I was a child, growing up in Wales, my father would express puzzlement that I didn’t seem to know how to pronounce certain words. He didn’t grasp that, since Welsh was his equal first tongue but not mine, how you pronounce Welsh words was obvious to him but not to me. For my part, it was only scores of years later that I realised some of his verbal mannerisms were not just odd idiosyncratic English or slang, but Welsh. The meaning had always been clear to me; I had just not realised these were Welsh words or phrases, and of course I too would use them.
I have noted in the past that when students mispronounce some of these dreadful dermatological terms, it was a signal that they had read about a disease, but had never been taught about it. It signalled to me how much they were acquiring on their own. English is like that, certainly in comparison with German: until you hear a word spoken, guessing how you say it is tricky. More so when you chuck in the various languages that contribute to the dermatological lexicon — and when they are then spoken / bastardised by English speakers.
But today a student pointed out that it would be helpful to include how words are pronounced in our course material. I am not certain how to do this yet, but I can believe that not knowing how to pronounce a term might ‘inhibit’ thinking and ‘silent talking’ about the topic (I do not know whether there is any research to back this opinion up).
I wrote a post on this topic a while back, trying to map out the territory of funds going in and out of undergraduate medical education. It was a bit too rambling, but more thought out, I feel, than Jeremy Hunt’s latest slant on statistics that the Guardian (and others) reported. So here are some bullet-like points on this issue, together with some questions. The backdrop is the article I wrote before, and the issue of ‘we paid for their training so we can seize their passports’ (Phil Hammond’s ‘Hotel California’ clause: ‘you can check in but you can never leave’). And because somebody asked me to spell things out a little more.
In England medical students pay 9K fees. HEFCE (the Higher Education Funding Council for England) adds another 10K. Let’s round up and call it 20K. HEFCE also adds money beyond fees for other expensive degrees (engineering, for example), although I do not know if it is 10K or less. This 20K goes through the universities.
Medical students do not pay their final year fees in England, but must meet all their living costs, and 9K fees for the other 4 or 5 years. Government loans attract interest and, as others have commented, the government alters the conditions in a way that would be illegal for any bank (Gee! The government makes even the bankers look like saints). So say 40-45K fees, plus living costs. I doubt much change from 100-120K. The money attracts interest and will be much larger by the time it is paid back, and will also feed into the debt of students who do not earn enough to pay back their fees.
The other funding stream is via the NHS. In England this is called SIFT (Service Increment for Teaching); in Scotland it is called ACT (Additional Cost of Teaching). This is probably in the region of 20K per student per clinical year, and is designed to meet the costs of the ‘students on the wards’ and pay for all the NHS staff time of those involved in teaching. This money stays within the NHS, and the universities have essentially no access to it.
If you add these two streams together you are talking about close to 30K of ‘state funding’ plus 10K from the students. Living expenses are on top, and I will ignore the opportunity costs of what students defer from earning.
The problem with the 30K state-funding figure is that it fails the reality test. These sums add up to a figure (40K) close to what Stanford charges its small medical student cohort, and yet it is clear that our UK medical students get a much worse deal. Or just compare what this sort of money buys you at an expensive private school. There is a (fat) rabbit off somewhere. Nobody with any knowledge of medical education, and who isn’t playing politics, believes that this is what we spend on each of our students.
Above, I said 20K goes through the universities, but I did not say that universities spend that 20K on delivering undergraduate teaching. The obvious issue is that medical research is big business, and most research loses an institution money — this is especially true of charity-funded research, the main funder of medical research in the UK (there is an attempt to make up this deficit from QR funds, but it is grossly inadequate). Peacock’s tail, and all that. So teaching fees are used to subsidise this loss. There are good costings for this in some US schools, but they use endowments to meet the costs; in the UK we get students to pay for it. To what extent? I do not know. Do not ask, is the mantra. This will run and run. And then unwind.
What about the NHS money? Well, nothing is transparent in the NHS, but we know most of this money is not used to support teaching, but is siphoned off to pay for clinical care. What proportion? I would start by saying 70% (i.e. only 30% goes on what it is intended for). So I think 18K over the whole course. But I know of no convincing data in this area, just the sort of bumph Hansard repeats, which is not reality based. Do not ask, is again the mantra.
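For concreteness, the back-of-envelope sums above can be put in one place. A minimal sketch using the post’s own rough figures — nothing here is official data, and the 70% ‘siphoning’ proportion is, as said, a guess:

```python
# Toy model of the per-student funding streams described above.
# All figures are this post's rough estimates, in thousands of pounds (£K).

student_fee = 9        # annual tuition fee paid by the student
hefce_top_up = 10      # approximate HEFCE addition per year (the pair rounds to ~20K)
sift_per_year = 20     # NHS SIFT/ACT per student per clinical year
clinical_years = 3

# The headline per-year figure: ~30K 'state funding' plus ~10K from the student
state_per_clinical_year = hefce_top_up + sift_per_year   # 30

# NHS stream over the clinical years, and the share guessed to actually
# reach teaching (30%, the other 70% going to clinical care)
nhs_total = sift_per_year * clinical_years               # 60 over the course
nhs_for_teaching = nhs_total * 30 // 100                 # 18, the figure above
```

Crude, but it makes plain how sensitive the headline 30K is to the one number nobody will audit: the proportion of SIFT/ACT that ever reaches teaching.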
My previous post added in some complexities. And there are more that I have not mentioned.
The key points are:
Anyway you can still listen….
‘Innovative’ educational practice is more fashion-driven than those who attend the catwalks are.
Interesting post from Tony Bates on the history of distance learning, and the University of London External Programme, which started in 1828.
Unfortunately I have no knowledge of the individuals who originally created the University of London External Programme back in 1828. It’s a worthy research project for anyone interested in the history of distance education.
I was once (mid-1960s) a correspondence tutor for students taking undergraduate psychology courses in the External Programme. In those days, the university would publish a curriculum (a list of topics) and provide a reading list. Students could sit an exam when they felt they were ready. Students paid tutors such as myself to help them with their studies. I would find old exam papers for the course, and set questions for individual students, and they would send me their answers and I would mark them. Many students were in British Commonwealth countries and it could take weeks after students sent in their essays before my feedback eventually got back to them. Not surprisingly, in those days completion rates in the programme were very low…
But I am fascinated by (and was ignorant of) the following:
Note though that teaching and examining in the original External Programme were disaggregated (those teaching it were different from those examining it), contract tutors separate from the main faculty were used, and students studied individually and took exams when ready. So many of the ‘new’ developments in distance education, such as disaggregation, self-directed learning, and many of the elements of competency-based learning, are in fact over 150 years old.
Nick Carr writes about e-textbooks, quoting research that students don’t like them, or at least that they prefer conventional textbooks. Seems reasonable to me. We know a lot more about the design of conventional textbooks: layout, indexing, interaction and so on. But for dermatology it seems to me e-textbooks offer a way forward. If you want to learn dermatology, you have to look at images, and to do this well, you need access to lots and lots of images. One of the conclusions of a paper we published several years ago was how few instances of a particular disease students are exposed to. Seeing only n of 1 for a particular lesion type is just not enough: imagine if your sole idea of what a ‘dog’ is were based on seeing a single poodle. Current publishing models and norms mean that most dermatology textbooks are short on images — and often the images they contain are poor. E-textbooks are one way round this, and it is difficult to look at an iPad and not wonder what a good dermatology text would look like on it. What will be really interesting is what will happen to the legacy publishers, given the price sensitivity of undergraduate students and the lower barriers to entry.
Annotation and memory of position on the page are important issues, but I doubt invention will fail to improve things. Just look at the way the ‘clunky’ Kindle allows you to highlight text, then retrieve it on the Amazon web site and go back to the text at the various bookmarks. A scholar’s dream for encouraging accurate referencing and citation.
^^ Skincancer909 is currently being rewritten and the future version will incorporate video with a new design.
“It’s no accident that only 11% of the US workforce is passionate about their work. This is a sign of great success. This is exactly what these institutions were designed to do – suppress passion. It starts with our schools that seek to prepare us for all the other institutional environments seeking people who can reliably follow instructions and execute in a predictable manner. Think of our current institutions as powerful chisels, relentlessly chipping away at our edges until we fit neatly into the tightly defined roles that our institutions have created.”
The most important thing in learning is copying how other people think. I don’t think learning by doing really gets one to emulate how other people think. Marvin Minsky
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
This is a quote from a review of Alison Gopnik’s most recent book, ‘The Gardener and the Carpenter’. I have enjoyed Gopnik’s previous work, and this quote could, with some latitude, be applied to medical school and medical education (where students become competent despite the best intentions of the medical educators…).
It assumes that the ‘right’ parenting techniques or expertise will sculpt your child into a successful adult. But using a scheme to shape material into a product is the modus operandi of a carpenter, whose job it is to make the chair steady or the door true. There is very little empirical evidence, Gopnik says, that “small variations” in what parents do (such as whether they sleep-train) “have reliable and predictable long-term effects on who those children become”. Raising and caring for children is more like tending a garden: it involves “a lot of exhausted digging and wallowing in manure” to create a safe, nurturing space in which innovation, adaptability and resilience can thrive.
We can only grasp reality by metaphor.
‘The Socratic slogan, “If you understand it, you can explain it”, should be reversed: anyone who thinks he can fully explain his skill does not have expert understanding.’ Hubert Dreyfus.
There is sometimes a prejudice in medical education that somehow teaching at the bedside is always best. Of course most medical encounters are not at the bedside (any more) simply because most clinical encounters are not on wards, but in offices, whether the offices are in hospitals or elsewhere. The arguments for the bedside include tradition, but also reflect the fear that medical education will be expropriated from the clinical context. I have a lot of sympathy with the latter view, but it will sometimes lead to error.
Yesterday, I talked about the Dermofit App, to which I contributed. One of the rationales for this whole approach, almost a dozen years ago now, was my belated realisation that clinical exposure — however intense — in dermatology might not be as efficient as a learning environment in a virtual world. In dermatology, simulation is over one and a half centuries old, and the history of this simulation tracks the development of technology. It is just that it relies on something we have got used to because it is all around us: high quality graphics. Pictures of lesions.
Several years later we published a paper, exploring this. We wrote:
“The overwhelming majority of students 82% (n = 41) did not see an example of each of the three major skin cancers (BCC, SCC, melanoma) and only a single student (2%) witnessed two examples of each. The percentage of students witnessing 1, >3 and >5 examples is given for each of the 16 lesions and demonstrates that there was not only a lack of breadth but also of depth to the students’ exposure.”
In one sense this is all very obvious. We know that (perceptual) classification tasks require practice, and that practice requires multiple training examples. The training signal-to-noise ratio can be higher in the virtual world, and it is easier to manipulate events there. If the quip is that technology is everything that gets invented after your teenage years, we don’t recognise the obvious technology here simply because it has been around so long. It is just that silicon really allows it to be done so much better. The caveat is whether the business model allows this.
Students will prefer the clinic, for reasons I understand. But they will often be wrong to do so.
Roger Schank is speaking at OEB 2016, and there is a post from him here on the OEB site.
His second sentence is:
Sorry to be a downer, but technology will change nothing if what is meant by technology is that we have new ways of delivering the same old material.
His suggestions include:
He goes on to argue how AI / tech can help.
I can agree with much, if not all of this (excepting the ten year old bit, if taken too literally). And building simulators of the ‘real world’ is where we need to be. But I still wrestle with what is foundational and how much preloaded material students need in order to allow them to make sense of the real world (‘preloaded material’, yes, I can hear the hackles…)
You might divide learning into ‘just-in-time’ and ‘foundational’. Foundational rightly has a bad name because most foundational learning is not foundational at all, but often reflects the prejudices of those who benefit from selling particular content. Medical degrees are stuffed full of foundational stuff that is nothing of the kind.
In medicine, you learn your craft by doing it, by seeing patients and diagnosing their ailments in the company of experts, and seeing what happens to them (supplemented by learning about what has happened to others); and by going back to the books and foundational concepts continually. But there is a framework that creates the clinical worldview, and that worldview has a language that requires immersion.
One (and only one) key goal of medical school is to enable you to function in a clinical environment such that you can make sense of it, and learn from it. But that is not available to a novice, however bright, without a lot of ‘preloaded’ baggage. The question is about what the balance is between ‘preloaded’ and ‘just-in-time’. I think we obsess over the former and need to shift much more to the latter. But I do not know exactly how it will look in the end, although I know the direction of travel that is needed.
And yes, tech can help this, but not when PowerPoint is involved. He is right there, and has been for a long time.
[Alison Gopnik, in one of the John Brockman edited books (I think), remarked that although she had spent much of her professional life studying how babies make sense of the world, little of this insight made it into how she delivered material to her university students or how they learned. Three years of lectures on cracking eggs, and then in the fourth, you get to do it.]
Derek Bok states that some of those who were found guilty of criminal acts in the recent waves of corporate malfeasance in the US scored very well on their ethics modules at Harvard. It is easy (and facile) to imagine that somehow doing a ‘course’ on a particular topic will produce a change in behaviour that is permanent and withstands countervailing forces (culture eats strategy, and culture eats morality, etc., I hear you say). Those in universities should of course know better — producing changes in behaviour in response to an environmental stimulus is a paraphrase of one definition of learning. But the message doesn’t get through, largely because the academy has increasingly chosen to turn its professional tools away from examination of its own purpose. It is deemed rude to ask for evidence when everybody knows the sun goes round the earth.
Nor, if we are to believe Timothy Wilson, should we go in with the ‘null’ hypothesis that courses wishing to eradicate ‘isms’ may only be beneficial. The evidence points in a different direction: they make some people’s behaviour worse. I sometimes wonder if anybody is really too worried about whether these interventions work — they just want to tick boxes to comply with yet more rituals of verification (to use Michael Power’s phrase from the Audit Society).
Anyway these ramblings were by way of introduction for what is for me one of the clearest expositions of morality and the human condition. I have no idea why I cannot keep it out of my mind but maybe putting it down in writing might help. It comes from a short article by Jacob Bronowski, in a posthumous collection of his essays, ‘A sense of the future’. The article is “A moral for an age of plenty” and it includes an account of the death of the physicist Louis Slotin.
Louis Slotin was a physicist in his mid thirties, working at Los Alamos in 1946. Bronowski described him so: ‘Slotin was good with his hands; he liked using his head; he was bright and a little daring — in short, he was like any other man anywhere who is happy in his work’. Just so.
Slotin was moving bits of plutonium closer together, but for obvious reasons, not too close. And as experts are tempted to do, he was using a screwdriver. His hand slipped. The monitors went through the roof. He immediately pulled the pieces of plutonium apart, and asked everybody to mark their precise positions at the time of the accident. The former meant he would die (9 days later, as it turned out); the latter allowed him to prognosticate on what would happen to the others (they survived).
There are two things that make up morality. One is the sense that other people matter: the sense of common loyalty…. The other is a clear judgement of what is at stake: a cold knowledge, without a trace of deception, of precisely what will happen to oneself and to others if one plays the hero or the coward. This is the highest morality: to combine human love with an unflinching, a scientific judgement.
I actually think we are more lacking in the second than the first. Worse still, we are less tolerant of evidence than we once were: we prefer to wallow smugly in our self-congratulatory goodness. We have been here before. Medicine only became useful when physicians learned this lesson.
[ And yes, people remarked that Slotin hadn’t followed protocol…]
I didn’t know.
Two Stanford graduate students, Jerry Yang and David Filo, saw opportunity. Working from a trailer on campus, they began compiling websites into a list, organized by topic. They eventually named it Yahoo, an acronym for ‘Yet Another Hierarchical Officious Oracle’.
I now have a new term for ‘learning outcomes’: Yahoo. Although ‘hideous’ competes with ‘hierarchical’.
“The entire system of learning at Oxford, so far as I can recall, consisted of the combination of mnemonics, composition and argumentation. Reading lists were prodigious: often 20 or 30 items − both entire volumes and journal articles – so redundancy was a given: hours needed to be spent in the library to extract the pith from acres of paper. I took two courses (as modules were then called) every term, and the coursework requirement was an essay of 3,000 words per week for each of them; the sheer amount I had to write gave me the core facility needed for an entire adult working life as a professional writer.
The argumentation was, of course, astonishingly thorough when compared with the meagre “contact hours” most contemporary students are mandated: a full hour vis-à-vis, usually one-to-one, reading out your essay and then picking it apart.”
I learned a new acronym, too: BDDM (bi-directional digital media). Ugly, but not silly.
For many, assessments are a lighthouse in the fog of education—a clear guide by which to make safe decisions. But in reality, assessments create the fog.
Dan Schwartz here
Coursera pilots a new course format
‘Starting today, we will begin piloting a few courses in which all content is available only to learners who have purchased the course, either directly or by applying for and receiving financial aid.’
Stephen Downes comments ‘You will recognize it as “what we had before MOOCs” ‘ Anybody remember the OU? Although at least with the OU you could watch the TV programmes and look at the books in the bookshop.
“I actually believe that we need domain specific online learning environments that cater to the pedagogies appropriate to different disciplines.” Mark Smithers
As compared with the LMS, which is all about management and not much about learning.
‘When people say ludicrous things like “we don’t need to remember things any more because we have Google!” you can assume they haven’t tried to learn anything outside their domain for a long time.’
Yes, facts matter and memory is our intellectual ballast. Here.
I have come across this from Paul Graham before, but I think he is saying something very important about learning in general. He points out that there are lots of skiing instructors, but not many running instructors. When you learn to ski, often your intuition is to lean back. Big mistake. Often you need to lean forward. It takes a while to reprogram the intuition.
I used to think most medical students knew how to learn and acquire expertise efficiently. I no longer think this way. In particular, there is an over-emphasis on rule-based strategies, rather than naturalistic ones. Just as people over-emphasise predicate calculus in thinking about the world, so we, and they, are often prisoners of mistaken theories about how ‘learning works’. Much learning is very unnatural.
It is usually taken for granted that the skills gap is a problem of skills supply, and public concerns often focus on a lack of STEM skills and soft skills. So proposed solutions tend to involve reforming education and worker training programs. The most popular approach has been to reduce tuition fees for selective fields of study, usually STEM majors.
However, I argue that this view is not correct. Research that I and my colleagues have conducted suggests that the skills gap persists mainly because employers are unwilling or unable to pay market price for the skills they require…..
Yet the skills gap remains, because the adjustments that workers and firms make will only eliminate the gap if wages reflect the relative supply and demand for various skills across occupations. But our data shows that this is not happening: Many jobs in industries that generate high profits (retail trade, educational services, mining, and forestry) tend to pay low wages and are therefore unattractive to workers, whereas jobs in industries that pay higher wages (finance, computer and electronics manufacturing, paper and printing) are not very profitable.
Via Stephen Downes
A couple of sentences from this article on the value of interdisciplinary research got me thinking — or at least pulling some memes off my dusty intellectual shelf of clutter. The article is about Ian Goldin, and some ideas I am sure he talks about in his new book, which I haven’t read.
He added that “one of the reasons” for the 2008 financial crisis was that “people lost their ethics, their judgement, and their wisdom” because of disciplinary silos.
I agree. I remember the Economist putting it more harshly: …‘professors fixated on crawling along the frontiers of knowledge with a magnifying glass’ [Economist, December 10th 2011]. Economics, a bit like psychiatry in medicine, is the canary in the mine. Nor would teaching mandatory ethics courses (‘I am certified in ethics A+!’) do very much. Enron’s management were stars at HBS. This is one of the tragedies of many modern universities: so busy edging their way up largely meaningless ranking scales that they are unable to tackle the problems society faces.
Goldin was quoted as saying, ‘[there is] a “real pressure” on universities to be “thinking ahead” and teaching information that will remain relevant when current students “reach their mid-careers”’.
There are two aspects to this. One is that the whole idea of education is a way of hedging against a changing environment. If the world were constant, we could dispense with much (but not all) education — training would suffice. This is just another way of saying that advance comes when sons do not do what their fathers did (‘20th century physics was made by the sons of cobblers’. Substitute your gender, please). But from a teaching perspective there is another facet to think about. We cannot adequately judge how well we educate our students over the short term (alone). Yes, they can pass finals. Yes, they can take a history, etc. But the test of education is how well they behave and think 20 years down the line. This is a large search space that we can only navigate using theories about what makes the world change, and what makes people push at the boundaries: do not cite Cronbach’s alpha at me. But in examinations and certification, like so much else in science and society, we are blinded by the apparent certitude of short-term goals. And the allure of summary measures, rather than the messiness of the real world.
‘The rising prices of textbooks appears to be reaching a tipping point, the professor adds. The latest US Census Bureau statistics show that textbook prices increased more than 800 per cent from 1978 to 2014, more than triple the cost of inflation and more than the rate of increase of college tuition.’
Nice paean to David Attenborough, with the above title. There was a time, it is said, when people like Jacob Bronowski were denied academic advancement in the UK because they had appeared on TV. How the world of impact has changed. But just as many of us get excited by the power of the web to broaden access and change higher ed, we should remember that some of the most potent educational agents of the twentieth century got there before us. I am thinking of cheap paperbacks (Pelican); the BBC; the OU on the BBC; and the movement throughout the 1930s and beyond for prominent academics to reach out to that large part of the population who had been denied access to higher ed. If we want to build large scale resources for higher ed, we could do worse than look at the extraordinary teams the BBC put together. So, please don’t start lectures with those dismal learning outcomes, but just look at how Bond movies grab the attention and focus of their audience. Or just look at The Ascent of Man.
After a spell as a lecturer and reader at the LSE, he returned to his East End roots at the economics department at Queen Mary College, recruiting an impressive roster of academics and students to its venue in a former biscuit factory on Bow Road.
He was known for giving chances to mavericks: if a headteacher warned of a student’s “difficult” nature, Peston would normally take them on.
From an obit of Lord Peston in the FT.
This brought to mind something in Craig Venter’s excellent autobiography, A Life Decoded. He described how he was so busy doing science — and publishing — as a student that he failed some mandatory graduate exams. The faculty had to ‘invent’ an appropriate exam for him — which of course he passed. They obviously didn’t have to deal with the QAA or GMC.
Education is not just about means, but about maintaining intellectual diversity. We have to be concerned about variation, too. It is all too easy to concentrate on minimum standards or pass marks, without considering whether what we are doing harms those whom, as a society, we most need.
Roger Schank has a new web site (not before time), so I have been rereading some of his stuff. Wonderful (he of ‘There are only two things wrong with the education system: (i) what we teach and (ii) how we teach it’). Below are some of his choice phrases, but it really is worth a visit. I think he gets most things right. (FWIW, I have said on more than one occasion that if universities were serious about learning, they would ban PowerPoint text slides.)
“So, my advice. Know what matters to you. Learn that. Temporarily memorize nonsense if you want to graduate but have a proper perspective on it. Nothing you learn in high school will matter in your future life……That having been said, in honor of the coming school year, I have decided to give students some ammunition. Here are most of the subjects you take in high school, listed one by one, with an explanation about why there is no point in taking them.”
“In 1989, I witnessed corporate training for the first time. We had just started the Institute for The Learning Sciences, which was sponsored by Andersen Consulting (now Accenture). I visited their campus in St. Charles, Illinois, and saw many classrooms full of people who were mostly half asleep or in a daze from being talked at about corporate culture, and Andersen’s core values, and client needs, and so on……So, now I want to say something easy to understand about corporate training: STOP IT”
“Content cannot be delivered. If it could, you could hire FedEx to do it.”
“But we keep talking at people. If your training includes talking, then it isn’t working. If it includes Power Point it isn’t working.”
“There are always assessment questions. Why? Because the training department wants to know if the students have “learned the material.” What the training department should really want to know is what the student can do that they couldn’t do before the training.”
What I said was: “e-learning is the same garbage just in a new medium.”