I have a secret admiration for some aspects of surgical training. We all know the bad ones, so I do not need to talk about them. When Lisa was doing her Mohs’ fellowship, it was the following vector: you watch X procedures, you perform Y procedures under supervision and you then perform Z procedures ‘independently’, with help on hand. After that, you keep learning. Sensible, and has the essential character of what has been known about craft apprenticeships for over one thousand years: apprentice; journeyman; master. This BMJ piece by a urologist asks:
If you were applying for a certificate of completion of training (CCT) in urology in 2015 you had to have seen or assisted in at least 20 radical prostatectomies before being signed off as competent. A year later, for no apparent reason, it appears that 10 will do.
He then goes on:
Standing in a theatre, unscrubbed, so you can say you’ve seen a procedure was never a part of surgical training, nor should it be now. It has no value. Unless you are very good at the procedure already and you are learning nuanced techniques from a master surgeon, watching a procedure will never make you a better surgeon.
Now, I despair of this sort of thing even when we ask medical students to do it. Why, is the question. What value is there in watching? That this is considered meaningful at this level of training is even more worrying. And of course the figures will be pushed down, over time. This is the NHS, after all; never let expert judgment get in the way of a political imperative or somebody paid by the government: “we have to revise the speed of light for operational reasons….”
There is a more subtle point which makes thinking about the article even more worthwhile.
Trainees should spend their training doing the things that they’ll be spending their lives doing, not watching procedures they will never perform.
Now, it is clear that the current bull coming out from HEE, NHS, Deans etc is that we don’t need experts anymore, just people to cope with whatever disease is the flavour of the month (that there are demographic changes — pace the lectures I received from John Grimey Evans in 1976— was apparently not obvious to NHS managers or Jeremy Hunt till late 2016). Here is a problem.
When people finish formal training they are not as expert as they will be in 10 or 20 years. I do want an experienced dermatopathologist to be reading the samples I sent him. Wisdom is not the sole preserve of the old, but in many craft or perceptual disciplines I know about, the old guys and women do it better. So, problem one, is that when people come off a training scheme they are not the doctor they want to be. They are not qualified, they are just setting out, able to work without immediate supervision — as they choose and judge. This is the ticket.
The second problem, as the author makes clear, is that the training schemes are wasteful and not geared to excellence. Again, in a world of ‘pull’ (John Seely Brown’s phrase) the NHS is still trapped within the metaphors of the same industrial age that Donald Trump thinks is going to bring all those jobs back.
We have lost our way in much of what is important in medicine. It’s time that we focused on what really makes a surgeon better and stopped the pointless processes that surround training
Amen. But the surgeons have got some things going for them. IMHO many other branches of medicine are much, much worse.
Kenneth Arrow has died. A real economist. I have a very hard time taking most health economists seriously, especially when they think QALYs are anything but an almighty sleight of hand over reason (contract work for the NHS). As one economist pointed out to me, one is tempted to imagine that economists who were not very good at economics become health economists. But that may be a little unkind, and we do need clear thinking about health. Here is a link to his paper on the economics of the health care industry, published in 1963. It is deep, humble, and wonderfully lucid. The econocracy movement should champion it.
People who make predictions of how many doctors, or even what specific type of doctor, we need in (say) 20 years are IMHO generally deluded. Or they are telling fibs. Or selling something. The following is from the NEJM and is about ‘hospitalists’:
Twenty years ago, we described the emergence of a new type of specialist that we called a “hospitalist.” Since then, the number of hospitalists has grown from a few hundred to more than 50,000 — making this new field substantially larger than any subspecialty of internal medicine (the largest of which is cardiology, with 22,000 physicians), about the same size as pediatrics (55,000), and in fact larger than any specialty except general internal medicine (109,000) and family medicine (107,000). Approximately 75% of U.S. hospitals, including all highly ranked academic health centers, now have hospitalists. The field’s rapid growth has both reflected and contributed to the evolution of clinical practice over the past two decades.
The only way you can play ‘make believe’ like the DoH and all the NHS ‘experts’ so keen to trample all over our medical students’ futures is if you think Stalin is still alive and sorting out the tractor numbers.
Doctors seem to diagnose what they know, so find out what they know before you ask them what’s wrong with you.
From an obituary of George Klein in Nature. If you have ever thought about cancer, his thoughts have touched yours.
In 1957, a chair was created for him in tumour biology, a research field that he had helped to establish. The department of tumour biology that ensued was international and influential. Most of today’s leading cancer researchers who are over 50 have had some interaction with George and his department. Seven secretaries wrangled his large correspondence. He invented social media before the technology existed.
A telling phrase:
His last book, Resistance (Albert Bonniers Förlag, 2015; published in Swedish), won the prestigious Gerard Bonnier prize for the best essay collection of that year. It deals with resistance to extremism and to cancer. Throughout his life, George was preoccupied with the thin borders between evil and good, and health and disease.
Remember the jokes about how the only way to run an efficient hospital is to have one without any sick people? Well, just read this, from an editorial by Martin McKee and colleagues in the BMJ. The context is what might be involved in any trade deals with the US and what US corporations would require:
They can be expected to look abroad, making the UK, with a struggling NHS, a tempting target. The UK prime minister, Theresa May, has not excluded the possibility of opening the NHS even further to them. At present, US corporations struggle to make a profit in the NHS. They would be unlikely to agree any deal that limited their ability to press for changes that would generate profits, such as excluding poor and ill people.
It was reading Herb Simon’s ‘Sciences of the Artificial’ that woke me up to what some professional schools had in common. I even wrote a piece in PLoS Medicine arguing that medicine is more engineering than science (‘The problem with academic medicine: engineering our way into and out of the mess’). And I think I called it right. But the parallels between medicine and many other traditional professions are strong. I am thinking of law, architecture, teaching, and engineering. These are all design sciences, or, since I sort of object to this use of the word science, design domains. One of the reasons medical education — and to a lesser extent medicine — is in such a mess is the way that we have failed to grasp this distinction. I wrote last year:
Simon was a genuine — and it is an overused word — polymath, and at that time I was ignorant of his many contributions. His work ranged through business administration, economics (for which he was awarded a ‘Nobel’ prize), cognitive science, computing, and artificial intelligence. But what fascinated me most was the content of his most famous book, ‘Sciences of the Artificial’. In this work Simon set out to unify and provide a common intellectual framework for many human activities that involve creating artefacts that realise a purpose of our choosing. Unlike our dissection of the natural world, whether that be identification of a gene for a disease, or a virus that causes a human disease, Simon was concerned with how humans build artefacts: in particular, how we navigate search spaces that are large, where uncertainty is all around, and where there may be no formal calculus to allow us to fire across boundaries. He was thinking about thinking machines of course, but quite explicitly he was concerned with the professions: architecture, law, and, of great interest to me, medicine and teaching and learning. I was hooked.
One of my favourite quotes is from Simon’s ‘Models of My Life’
More and more, business schools were becoming schools of operations research, engineering schools were becoming schools of applied physics and math, and medical schools were becoming schools of biochemistry and molecular biology. Professional skills were disappearing from the faculties… they did not fit the general norms of what is properly considered academic. As a result, they were gradually squeezed out of professional schools to enhance respectability in the eyes of academic colleagues.
So I warmed to an article titled ‘Building a future for engineering’ in the Times Higher, linking to a Royal Academy of Engineering’s 2014 report, ‘Thinking Like an Engineer – Implications for the Education System’. I have not read all of the latter, but I warm to the phrase in the THE, referring to the report: ‘Even more fundamentally, engineering is a set of habits of mind’. Clinical medicine is more engineering than science.
This is an absolutely terrific article about ‘Choosing a specialty’. It focuses on psychiatry and the particular problems that affect psychiatry, but contains many powerful insights that most medics will recognise even if they have never expressed them. Many will not admit to them, and the medical schools will look the other way.
When you ask me whether you should enter psychiatry, your question also becomes whether I would go into psychiatry once again, knowing what I know now. Most people will tell you to enter their profession for that reason. They are justifying their own decisions. Their reply to you is a means of reassuring themselves.
You should ask yourself: Is your main purpose in choosing this line of work to make a living? If it is, then you should know it is, and don’t put too much effort or care into worrying about the work. It isn’t your main purpose in life. Your main purpose in life could be your marriage, or your children, or your larger family. Or it could be another activity other than your main paid work, such as writing, or art, or music, or faith.
Then it gets into the DSM….
“Trouble is, the intrusions cannot be ignored or wished away. Nor can the coercion. Take the case of Aaron Abrams. He’s a math professor at Washington and Lee University in Virginia. He is covered by Anthem Insurance, which administers a wellness program. To comply with the program, he must accrue 3,250 “HealthPoints.” He gets one point for each “daily log-in” and 1,000 points each for an annual doctor’s visit and an on-campus health screening. He also gets points for filling out a “Health Survey” in which he assigns himself monthly goals, getting more points if he achieves them. If he chooses not to participate in the program, Abrams must pay an extra $50 per month toward his premium.
Abrams was hired to teach math. And now, like millions of other Americans, part of his job is to follow a host of health dictates and to share that data not only with his employer but also with the third-party company that administers the program. He resents it, and he foresees the day when the college will be able to extend its surveillance.”
Cathy O’Neil, ‘Weapons of Math Destruction’, excerpt on Backchannel.
This sort of thing is going to be all over education and our private lives. Big data masquerading as big ideas, or just ‘big money’. It’s just because ‘we care’.
I fully understand the pharmaceutical industry’s frustration with NICE – it stops them making so much money at taxpayers’ expense. What I don’t understand is why we are paying them any attention.
Comment from FinPhil in the FT. Most recent egregious example of pharma here, although this is not the NICE example. 2016 was the year when pharma began to boast about their new ‘we do not need to invent drugs’ financial model. Then they got embarrassed, as commentators like Martin Wolf of the FT pointed out that many people now viewed pharma like they viewed bankers. Financial engineering is much easier than biological engineering.
“Banking regulators failed and now we have no money. Does the NMC make nursing safer? No. Does the GMC make doctors safer? No. Regulators are the dust-carts that follow the Lord Mayor’s Show of life. There is an utter fiction that regulation improves anything. The catastrophes of history are testimony to that.
Health regulation fails because humans, doctors and nurses fail. They may fail because we do not support them, educate them or motivate them, fund their purpose or demand too much. We regulate activities because they are complicated or vital. That is our mistake.”
Go read, The Audit Society: rituals of verification by Michael Power.
From a review in the FT by Steve Silberman of what sounds like a book for our time: ‘How to Survive a Plague’, by David France.
“Jesse Helms, the five-term Republican senator from North Carolina, personally blocked spending on Aids prevention, treatment or research for years, pontificating from the Senate floor, “We’ve got to call a spade a spade, and a perverted human being a perverted human being.”
“Some of the most haunting passages in the book record twists of fate that delayed effective prevention and treatments for years. A young chemist at Merck who suspected early on that a class of drugs called protease inhibitors might yield promising avenues for research was killed on Pan Am Flight 103, downed by Libyan terrorists over Lockerbie, Scotland, in 1988. Years passed before other researchers went down that road again, discovering a drug that became a template for the “cocktails” that have turned HIV infection into a manageable chronic illness rather than a certain death sentence.”
In 1978, the distinguished professor of psychology Hans Eysenck delivered a scathing critique of what was then a new method, that of meta-analysis, which he described as “an exercise in mega-silliness.”
Matthew Page and David Moher, here, in a commentary on an article by the ever ‘troublesome’ John Ioannidis titled “The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses”.
To which some of us would say: this was all predictable when the EBM bandwagon jumped on the idea that collating some information, and ignoring other information, was ‘novel’. Science advances by creating and testing coherent theories of how the world works. Adding together ‘treatment effects’ is messier, and more prone to error. Just because you can enter data in a spreadsheet doesn’t mean you should.
UCAS has just released its data on medical school applications for this year (2017 entry)……A few bits of background information are useful first. Total applications in 2015 were 82,034 for 7,424 places, which leads to the oft-cited ‘one place for eleven applicants’. However these numbers include multiple applications by students. The number of individual applicants was 20,100 and success at gaining a place depended on country of origin. Success rates were 40% for UK applicants, 10% for EU applicants and 20% for non-EU applicants.
Andrew Goddard, here. I am surprised that the acceptance rate is this high, but I guess there is a lot of self-selection. I do not have a good feel for what is influencing career choices, but the baby boomers had it better — once they got in, sadly. I will stick with my default opinion and change as the evidence changes: what happened to teaching will happen to medicine. Politics is a dominant negative mutation, and all buckle before it.
This (‘Meaning and the Nature of Physicians’ Work’, NEJM 275;19:1813-5) is one of the best things I have read in a long time. It is right on some big issues, and is only depressing if you think positive action is not possible. The article speaks to the dehumanisation of the practice of medicine, and what happens when there is a gulf between those who practise, and those who think that a summary of practice is the same as actual practice. Tacit knowledge doesn’t fill cells in Excel spreadsheets. It is, of course, not just medicine that is changing this way, but other professions — look at what has happened to schoolteaching and the bogus politics and ‘common sense’ that masquerades as evidence in education.
There are lots of downsides to technology making it harder for some things to be forgotten. But lots of advantages too. I use a simple diary app that is available on mobile and Mac, Day One.
One of its nice features is that it gives you the option to see comments you have made on the same date in previous years. Now, like many people, I tend to agree with myself — at least over the short term — but it is fun to read earlier musings and wonder if the nuance needs changing, and also to see the same underlying memes appearing again and again. Often I am still in agreement with my earlier comments. Sometimes not. Here is one from three years ago.
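The feature is simple enough to sketch. Below is a toy Python version of the ‘on this day’ logic; the entries and dates are invented, and this is only a guess at how such a filter might work, not Day One’s actual code.

```python
from datetime import date

# Hypothetical journal entries: (date, text). Purely invented for illustration.
entries = [
    (date(2014, 3, 1), "First thoughts on training numbers."),
    (date(2015, 3, 1), "Still worried about training numbers."),
    (date(2015, 7, 9), "Something else entirely."),
    (date(2016, 3, 1), "The same meme, again."),
]

def on_this_day(entries, today):
    """Return past entries written on the same month/day as `today`."""
    return [
        (d, text)
        for d, text in entries
        if (d.month, d.day) == (today.month, today.day) and d.year < today.year
    ]

matches = on_this_day(entries, date(2017, 3, 1))
for d, text in matches:
    print(d.year, text)   # the three 1 March entries, oldest first
```

A one-line date filter, in other words; the pleasure is entirely in what it surfaces.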
On the usually sound principle that there is nothing in UK medicine that can’t be made worse by the involvement of the General Medical Council
Nigel Hawkes in the BMJ, 10 November 2012
This film from 38 Degrees is incredibly moving. And I recognise the places and the hospitals. Worth a critical review, but the abuse of trust is manifest (which is why they are called ‘trusts’). My post title — ‘Forty years ago we would have started a revolution’ — is my favourite line.
The quick retort is, of course, that they are actually treated worse — at least worse than some factory workers. The heading is from an article in BMJ Careers. Jane Dacre is quoted as saying we need more juniors, and ‘more boots on the ground’. By contrast, Catherine Calderwood, Scotland’s CMO, didn’t agree that any increase in junior doctors was necessary:
“I’m not sure that more is definitely better. We’ve had a 38% increase in consultants in Scotland in the past eight years, and almost a doubling of our emergency medicine consultants, and I’m not sure if I walked into an emergency department they would tell me it’s half as much work as it was eight years ago.”
She went on to spread the kool-aid a little more:
“I think sometimes doctors have not embraced others doing some tasks, and I would like us to be much more like a conductor for the orchestra. So (only) the really difficult stuff, the really responsible stuff, and the really clever stuff is what comes to the doctor as the senior leader.”
I think some of what they both say is correct, but I also fear some of what they both say is politics. The UK — including Scotland — is desperately short of doctors. Appointment times are too brief, waiting times out of control, clinical expertise increasingly patchy, and doubts about the adequacy of training, widespread. Demand is rising, and wants and needs increasingly confused. From where I stand, clinical service in some areas is getting worse. Scotland’s NHS is, in some parts, second world standard, as a colleague from mainland Europe once reminded me.
The issue about paramedical staff is important, but we have been here before. I may be misquoting the figures a little — and they are for the US — but around 1900 one in three health care workers was a doctor. Now the figure is around one in 14. This trend in the ratio will and should continue. The discussions about physician assistants and the like have been going on for over half a century, with little evidence of action. The key issue is that if you want to encourage people to move into new roles, you have to create a certification system that rewards and encourages people to do these jobs. That is why we have radiographers, pharmacists and the like. But successive UK government officials hate the idea of certification, and prefer that ‘nurses’ move into roles that they have little formal training for and, instead, end up existing free of any meaningful regulation. Governments cannot face up to the fact that these ‘health care workers’ (apologies: an awful bloody phrase) will need more than subsistence wages, and setting the system up will require upfront funding. I used to laugh at the image of a doctor’s office with all the certificates on the wall behind the desk, but now I advise patients always to ask whether the person providing diagnostic or therapeutic activities has a recognised medical qualification. How many melanomas do you diagnose a year? How many times have you performed this procedure? What exams did you have to pass?
If you ask whether you can train graduates to become physician assistants in dermatology, dermatological surgery and the like, the answer is a clear yes. But to make this a sensible career choice, we need certification — theory and practical exams and so on — and job titles that are transferable wherever somebody works. At present we simply do not have this. The dentists have done this and — key to any debate — practitioners (hygienists etc), whether they are dentists or not, are registered with the General Dental Council. Now, the GDC is not a very popular organisation, but the idea of formal certification — something that means that practitioners can move jobs easily — is a key component of making this system work. Institutions matter, as do incentives.
Of all places, I came across the following in Dylan Wiliam’s most recent book (if you want to understand what teaching feedback really is, read it). After pointing out how some machine learning techniques can outperform some medics in some contexts, he writes:
However, it is important to realize that the key factor in making jobs suitable for automating is not that they are manual or low skill. It is that they are routine. A task can require many years of training for humans to become good at it, but it can still be relatively routine, thus making it relatively straightforward to automate. This is just one example of a much more general principle, which is that many of the things that we thought would be easy to automate turn out to be rather complex, while many of the things that we thought would be hard to automate turn out to be reasonably simple. …. This observation—that high-level reasoning seems to require very little in the way of machine power, while many low-level sensorimotor skills appear to require huge computational resources—is known as Moravec’s paradox, named after Hans Moravec, who pointed out that “it is comparatively easy to make computers exhibit adult-level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility” (Moravec, 1988, p. 15).
Now, of course, I work in a domain that is heavily perceptual and, as yet, AI systems have made few inroads. This may well be because the task is difficult (and the human visual system is powerful), but also (critically) because the available training sets are orders of magnitude too small. This will only change if the clinical workflow is fully digital. We have published some work in this area, and if you visit the Dermofit app on the iOS store you will see an app that uses some machine learning. But there is a long, long way to go, and the humans can still pay the bills. For the moment.
The problem for medical education is that as we (reasonably) concentrate our procedures around high-level processing, the sorts of environments we need to develop perceptual skills are neglected. You can do both.
There is a nice piece by Nassim Nicholas Taleb on Medium. It is from a foreword to a book (I think) on physical / strength training. If you have read Taleb you will know this is not too surprising.
You will never get an idea of the strength of a bridge by driving several hundred cars on it, making sure they are all of different colors and makes, which would correspond to representative traffic. No, an engineer would subject it instead to a few multi-ton vehicles. You may not thus map all the risks, as heavy trucks will not show material fatigue, but you can get a solid picture of the overall safety.
Likewise, to train pilots, we do not make them spend time on the tarmac flirting with flight attendants, then switch the autopilot on and start daydreaming about vacations, thinking about mortgages or meditating about corporate airline intrigues — which represent about the bulk of the life of a pilot. We make pilots learn from storms, difficult landings, and intricate situations — again, from the tails.
In one sense he is saying something that is easy to agree with. But if you delve a little deeper, it is not what we always do in medical education.
The structures we create to enable learning in a clinical discipline are not mirrors of what goes on in the real world. Pace the airline example, we shouldn’t expect teaching time to mirror disease prevalence; we don’t spend most of our time in dermatology teaching students about viral warts, or dandruff, or toxic erythema. When you try to recognise objects, you do not just study those particular objects. Rather, you have to study all the other objects. If you want to be able to ‘call out’ whenever you see a dog, you have to study cats. And chimps, and wolves and so on. This is one of the reasons why just learning about the top ten conditions makes little sense, if acts of recognition are involved. Most things are defined by what they are not. To think in the box, you have to know what is outside the box. This is what makes medical education a hard problem.
There are implications for clinical practice for the expert, too. Everyday practice appears to minimise the role of the statistical tails. Your learning about common conditions may be ‘everyday stuff’ requiring little formal study. But for rare conditions, or odd presentations of common conditions, everyday practice may not be sufficient — simply put, you do not see rare events frequently enough to consolidate and strengthen your memories. Everyday practice rarely provides enough critical mass, you might say. A practical example.
When I was a trainee in Newcastle, if we saw an ‘interesting patient’ or a patient in whom the diagnosis was unclear, we pressed a buzzer. The buzzer and flashing light went off in all the clinic rooms, the laboratories, the professor’s office and the seminar room. What happened then resembled the Stepford Wives. All descended on the particular clinic room, as though under some malign influence. There were times when this was quite funny, although some patients might have told this differently.
This simple tool was just an implementation of another one of Rees’s rules: routine clinical practice is not sufficient to consolidate or acquire the skills you need to provide routine clinical practice. This seems like a paradox, but it isn’t. “A sailor gets to know the sea only after he has waded ashore.” Rather, I always view it as a solution to the forgetting curve that Ebbinghaus described (although I think there may be other justifications).
There is a simple learning point here. The acquisition or maintenance of clinical competence requires much more than seeing patients (and by this, I do not just mean reading research papers). Software, and virtual worlds that we control, might help. But the Rees maxim remains: routine clinical practice is not sufficient to consolidate or acquire the skills you need to provide routine clinical practice
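Ebbinghaus’s curve is usually modelled as exponential decay of retention with time since last exposure. A toy sketch makes the point about rare presentations; the numbers (a ‘stability’ of 30 days, the review intervals) are invented purely for illustration and not fitted to any data.

```python
import math

def retention(days_since_review, stability):
    """Toy Ebbinghaus model: retention decays exponentially with time,
    at a rate set by memory 'stability' (higher = slower forgetting)."""
    return math.exp(-days_since_review / stability)

# A condition re-encountered last week vs one last seen a year ago.
# The stability value of 30 days is invented for illustration only.
common = retention(days_since_review=7, stability=30)
rare = retention(days_since_review=365, stability=30)

print(f"seen last week: {common:.2f}")   # retention still high
print(f"seen last year: {rare:.6f}")     # effectively forgotten
```

On this simple model, whatever the true parameters, the conclusion is the same: memories for rare events decay faster than clinic alone can refresh them, which is what the buzzer quietly compensated for.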
There was a story in the FT a few weeks back (paywall). It concerned the painting ‘Portrait of a Man’, by the Dutch artist Frans Hals. Apparently, the Louvre had wanted to buy the painting some time back, but were unable to raise the funds. However, a few weeks ago, the painting was declared a “modern forgery” by Sotheby’s — trace elements of synthetic 20th-century materials have been discovered in it. The story has a wider resonance, however. The FT writes:
But if anything the fake Hals merely highlights an existing problem in how we determine attribution. In their quest to confirm attributions, dealers and auction houses seek the imprimatur of independent, usually academic, experts. Often that person’s “expertise” is deduced by whether they have published anything on a particular artist. But the skills required to publish a book are different to those needed to recognise whether a painting is genuine. Many academics are also fine connoisseurs. One of the few to doubt the attribution to Parmigianino of the St Jerome allegedly connected to Ruffini was the English scholar, David Ekserdjian. But too often the market values being a published writer over having a good “eye”.
Here is a non-trivial problem: how can we designate expertise, and to what extent can we formalise it? In some domains — research for example — it is easier than in others. But as anybody who reads Nature or the broadsheets knows, research publication is increasingly dysfunctional: partly because of the scale of modern science; partly because ‘personal knowledge’ and community have been exiled; and partly because it has become subjugated to academic accountancy, since the people running universities cannot admit that they do not possess the necessary judgment to predict the future. To use George Steiner’s tidy phrase, there is also the ‘stench of money’.
But the real danger is when the ‘research model’ is used in areas where it not only does not work, but does active harm. I wrote some time back in a paper in PLoS Medicine:
Herbert Simon, the polymath and Nobel laureate in economics, observed many years ago that medical schools resembled schools of molecular biology rather than of medicine. He drew parallels with what had happened to business schools. The art and science of design, be it of companies or health care, or even the type of design that we call engineering, lost out to the kudos of pure science. Producing an economics paper densely laden with mathematical symbols, with its patently mistaken assumptions about rational man, was a more secure way to gain tenure than studying the mess of how real people make decisions.
Many of the important problems that face us cannot be solved using the paradigm that has come to dominate institutional science (or, I fear, the structures of many universities). For many areas (think: teaching or clinical expertise), we need to think in ‘design’ mode. We are concerned more with engineering and practice than is normal in the world of science. I do not know to what extent this expertise can be formalised — it certainly isn’t going to be as easy as whether you published in ‘glossy’ or ‘non-glossy’ cover journals — but reputations existed long before the digital age, and the digital age offers new opportunities. Publishing science is one skill, diagnosing is another, but there is a lot of dark matter linking the two activities. What seems certain to me is that we have got it wrong, and we are accelerating in the wrong direction.
No, it doesn’t: pure clickbait. But how many does it need? The headline was taken from a comment by Eric Schmidt, the former CEO of Google, that the ‘UK needs 10,000 computer science academics’. When I saw the headline, I initially read it as saying the UK needed another 10,000 computer science graduates. Oops. He means staff, not students.
But then I wondered, as I often have, how many academics in medicine we need, and how we might go about working out what the number should be. And I should add, I am sceptical we can know how many doctors we need; only those untouched by reality, like Jeremy Hunt, know answers to questions like that. But there are some numbers that are relevant, even if I cannot match Enrico Fermi’s ability to perform back-of-the-envelope calculations (how many piano teachers are there in New York?).
Depending on how you parse the data, skin disease is said to be the commonest reason to visit a GP in the UK. Estimates suggest there are 15 million visits to GPs with a skin problem each year. In many countries all these patients would go direct to an office dermatologist (this distinction is important, but marginal to my argument here).
Each year about one million people with skin disease are referred from primary care to secondary care. New-to-follow-up ratios are falling — forced down for financial rather than clinical reasons — but assume 1:1.5. In terms of total visits the ratio is much higher, because we have to include surgery and phototherapy: at a guess 1:4, which would mean around 4 million visits. This seems frighteningly high.
There are around 70,000 GPs on the register, and around 600 consultant dermatologists in the UK. GP recruitment problems are well known, and estimates are that close to one third of all dermatologist posts are vacant (‘no suitable candidates’). There are juniors (sic) on top, and other miscellaneous doctors too. In terms of new patients, I see 26 per week, and I am clinically part time, so around 1,000 per year, plus some on-call work, which is light. If we divide the 1,000,000 new referrals by the roughly 400 consultants in post, we get each consultant seeing around 2,500. But if we add in juniors, staff grades and locums, the numbers *feel* about right.
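For what it is worth, the back-of-the-envelope arithmetic above can be laid out explicitly. All the figures below are the rough estimates quoted in the text (not official workforce data), and the vacancy rate is the ‘close to one third’ guess:

```python
# Back-of-envelope sketch of the dermatology staffing arithmetic.
# All inputs are the rough estimates quoted in the text, not official data.

new_referrals = 1_000_000        # annual referrals, primary to secondary care
consultant_posts = 600           # consultant dermatologist posts in the UK
vacancy_rate = 1 / 3             # 'close to one third' of posts vacant

consultants_in_post = round(consultant_posts * (1 - vacancy_rate))
new_per_consultant = new_referrals / consultants_in_post

# At a guessed 1:4 new-to-follow-up ratio (including surgery, phototherapy):
total_followup_visits = new_referrals * 4

print(f"Consultants in post: {consultants_in_post}")          # 400
print(f"New patients per consultant: {new_per_consultant:.0f}")  # 2500
print(f"Follow-up visits per year: {total_followup_visits:,}")   # 4,000,000
```

The gap between the 2,500 new patients per consultant implied by the division and the roughly 1,000 I see myself is what juniors, staff grades and locums have to absorb.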
If we were to look at academic staffing, we have about 30-35 clinical academics in dermatology in the UK. They split their time between clinical practice, research and teaching. Most UK students are taught for most of their time by people who are not ‘academics’, or at least by people without what in most subjects and in most advanced countries would be recognised as an academic apprenticeship. Skin biology or skin science is notable by its almost complete absence in many — possibly the majority — of medical schools. If we argue — and I would — that those who run and organise teaching in higher education need to view this task as a *professional* task, we are running with, say, 15 FTE providing the undergraduate teaching resource that underpins clinical practice and early training / education. Note: my argument is about undergraduate education, and not specialist training; and I believe that teaching is not a ‘bolt-on’ activity at the undergraduate level (if you don’t agree with this view, I suggest you could largely dispense with university medical schools).
There is a simple way to frame any answer to my question. Do you think it is possible to produce and maintain a culture of learning and clinical expertise given the numbers above?
I try to avoid writing on this topic, finding it too depressing — although not as depressing as I once did, as I am closer to the end than the beginning. And there are signs of hope, just not where they once were.
There is an editorial in Nature titled ‘Early-career researchers need fewer burdens and more support’. It makes depressing reading. The contrast is with a talk on YouTube I listened to a few days back, by the legendary computer engineer (and Turing award winner and much else) Alan Kay, in which he points out that things were really much better in the 1960s, and that people at the time knew they were much better. Even within my short career, things were better in 1990 than 2000, and better in 2000 than 2010, and so on. When people ask me whether it is sensible to pursue a career in science, I am nervous about offering advice. Science is great. Academia, in many places, is great. But you can only do most science or academia in a particular environment, and there are few places that I would want to work in if I were starting out. And I might not get into any of them, anyway (Michael Eisen’s comment: ‘never a better time to do science, never a worse time to be a scientist’). I will share a few anecdotes.
Maybe 10-15 years ago I was talking to somebody who — with no exaggeration — I would describe as one of the UK’s leading biologists. This person described how one of their offspring was at university and had, for the first few years, not taken his/her studies too seriously. Then things changed, and they wondered about doing a PhD and following a ‘classical’ scientific career. The senior biologist expressed concern, worried that there was now no sensible career in science, and that much as he/she had enjoyed their own career, he/she could no longer recommend it. There was some guilt, but your children are your children.
The second was a brief conversation with the late physicist John Ziman. I had read some of Ziman’s work — his ‘Real Science’ is for me essential reading for anybody who wants to understand what has happened to the Mertonian norms, and why science is often increasingly dysfunctional — but he shared a bit of his life history with me. When he was appointed as a lecturer in physics at Cambridge, the topic of his lectures was ‘new’ and there were no established books. So he set out to remedy the situation and spent the first two years writing such a book (still available, I think), and after that turned his attention back to physics research, and later much more (‘you have to retire to have the time to do serious work’). He commented that this would simply be impossible now.
With respect to medicine, there have been attempts for most of my life to develop schemes to encourage and support young trainees. I benefited from them, but I question whether they target the real problem. There are a number of issues.
First, the model of training of clinical academics in medicine is unusual. Universities tend to want external funders to support the research training of clinical academics (fellowships), but that is a model with severe limitations. Nurturing talent is a core business of the universities, and they need to devote resource to it. It is their responsibility. Of course, they need to train and support academics, not just researchers. This is what career progression within academia is about: lecturer, reader, professor and so on. What medical schools want to do is to offload the risk onto the person, and then only buy once the goods have been tasted. In a competitive world, where other career options are open, this might not work well. Worst of all, it funnels a large number of institutions — institutions that should show diversity of approaches — into the lowest common denominator of what is likely to be funded by the few central funders. Without independence of mind and action, you cut your chances of changing the world. (Yes, I hear you say, there is not enough money, but most universities need to cut back on ‘volume’.)
The second issue is whether the focus should be on schemes encouraging young people into science. I know I may sound rather curmudgeonly, but I worry that much activity relating to pursuing certain careers is reminiscent of ‘Wonga-like’ business models. I think we should do better. If youngsters look at what life is like at 40, 50 and 60 or beyond, and like it, they might move in that direction. You would not need to encourage them — we are dealing with bright people. A real problem for science funding is that for many individuals it resembles a subsistence society, with little confidence about long-term secure funding, and little resilience against changes in political will. Just look at Brexit. I remember once hearing somebody who had considered a science career telling me that it seemed to him that most academics spent their life writing grants, feeling uncomfortable about replacing what they wanted to do with what might be funded. Conversations about funding occupied more time than serious thinking. I listened nervously.
Finally, I take no pleasure in making the point, but I do not see any reason to imagine that things will get better over a ten or twenty year period. One of my favourite quotes from the economist J.K. Galbraith is to the effect that the denigration of value judgement is one of the ways the scientific establishment maintains its irrelevance. I think there is a lot in that phrase. If we were to ask which is more critical — understanding genetics, or understanding how institutions work — I know where my focus would be. I suspect there is more fun there too; it is just that much of the intellectual work might not be within academia’s walls.
Note: After writing this I worried that people would think that I was opposing schemes to encourage young people, or that I failed to understand that we have to treat those with new ideas differently. That was not my intention. Elsewhere I have quoted Christos Papadimitriou, and he gets my world view, too.
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
“The immune system is unknowable, dynamic, complicated, and it always surprises you.” Stephen Deeks quoted in Science. And yet, useful discoveries are made, and have been made for a long time.
The University believes the Secretary of State should not have the ability to determine, or even to have significant influence over, which subjects universities can and cannot teach.
Written evidence submitted by the University of Cambridge (HERB 17). Higher Education and Research Bill Committee.
Years ago, we would have said that the state trying to control education was a characteristic of societies — Russia and China spring to mind — that paid little attention to individual liberty, or which failed to understand that most expertise — and power — should reside outwith government. No longer. Sadly, medical education is already increasingly subjugated to the state.
“The NHS has developed a widespread culture more of fear and compliance, than of learning, innovation and enthusiastic participation in improvement.” It also said “Virtually everyone in the system is looking up (to satisfy an inspector or manager) rather than looking out (to satisfy patients and families)” and “managers ‘look up, not out.’”
The Institute for Healthcare Improvement (IHI), a US organisation, report on the NHS (quoted by Brian Jarman in the BMJ. BMJ 2012;345:e8239 doi: 10.1136/bmj.e8239, published 19 December 2012).
The NHS could then become a threadbare charity, available to avoid the embarrassment of visible untreated illness. Julian Tudor Hart BMJ 2016:354;i4934
Abraham Verghese, an infectious disease physician, gave a talk here in Edinburgh last week. It was a very mixed audience, but I suspect the many students who were there enjoyed it. I have not read any of his books — nor looked at his TED talk — but his Wikipedia entry gives you a flavour of how interesting he is, and how varied a career can be — when you have courage.
One issue that came up tangentially was the history of diagnosis, and members of the audience ventured some opinions about when diagnosis was historically established. I may have missed key points, but I found it hard to accept that the idea of diagnosis was something you could date except in very broad terms, even less that you could associate it with the 1870s, or with the stethoscope as a key marker of when modern ideas of diagnosis were established. For instance — and since the lecturer was an ID physician — my first thoughts turned to scabies. The scabies mite was identified in the 1690s, and it was recognised as the cause of the disease (I am not quoting primary sources, so let me know if……). So here we have a clear linking of symptoms, signs, causality, a causal agent, and a broader theory about pathogenesis and epidemiology. So it got me thinking about how I view the topic of diagnosis.
Diagnosis is the mapping of one state onto another, with the two states linked by a network of attributes. Diagnosis is a suitcase term: it may contain lots of different tools, tools suited to various purposes, and tools for which we may find different purposes over time. Diagnosis represents an attempt to classify the world into particular states, often with the goal of making some predictions about some other state. Most of the time, we think in terms of prediction: what might happen to that person with or without some intervention. If you see these physical signs (burrows) and the patient describes particular symptoms (itch), then the ‘state’ is scabies. If the diagnosis is correct, you can say something about what causes the state, what might happen, and what effect a particular intervention (permethrin / malathion etc) might have. If you are lucky, you can feel happy with causal arrows linking much of what you say and think. Prediction is important, but it is of course not the only quality we want in a theory. Think of Copernicus: we tend to prefer some theories to others, even when they make similar quantitative predictions.
Our suitcase of diagnostic concepts has changed over time, however. For instance, even in modern medicine, causality is often lacking. We may use proxy or associated factors to define particular states. We may use simple heuristics as our guide to action, even though we have little idea of where the causal arrows are going. Think of much of psychiatry. This does not mean we are powerless, just that we are more ignorant than we would like. We are of course wedded to particular metaphysical systems.
Diagnosis might have been used, in the absence of knowledge about particular interventions, to attribute blame, as an explanation. If a patient behaved in this way or suffered some state, it was a divine punishment for some behaviour. Now, I may not agree with this world view, but this too is diagnosis. The theory may seem wrong, it may seem primitive, but then my ideas of physics are primitive too if they are applied to the world of the very small.
Galen thought in terms of the mean, and treatment by opposites (hot treatments for cold diseases; moist treatments for drying diseases, and so on). This all sounds slightly crazy to modern ears (although dermatologists among you will point out that the latter has definite therapeutic merit within very particular skin states). Or how about the idea of therapeutic ‘signatures’? This is from Ian Hacking:
Syphilis is signed by the market place where it is caught; the planet Mercury has signed the market place; the metal mercury, which bears the same name, is therefore the cure for syphilis.
As Hacking points out, this allowed Paracelsus to kill lots of people simply because he knew that mercury worked. But whatever the metaphysical system linking two states, the idea of diagnosis was firmly established — just as Newton got most things right in his physics, and most of us ignore what came after, except when we use GPS.
Diagnosis was not limited to medicine. Our ancestors spent their lives making diagnoses about what to eat and what not to eat, and about what particular weather states would do to crops. Plumbers make diagnoses, as do any humans trying to make sense of an environment that is not static, and where we value intervention.
What may have been specific to medicine was our hang-up about whether there was something special about humans, and whether the simple rules, experimentation and demonstrations of efficacy that allowed other types of human technological progress — or indeed much of everyday life — applied in the domain of disease. Successful interventions or demonstrations will have had an effect on metaphysical beliefs in the long term. And of course much of this story is tied up with the growth of that particular branch of formal knowledge we call science. 1870 is just a little late.
Hacking I. The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference. Cambridge: Cambridge University Press; 1984.
“Physicists studying sport have established that many fieldsmen are very good at catching balls, but bad at answering the question: “Where in the park will the ball land?” Good players don’t forecast the future, but adapt to it. That is the origin of the saying “keep your eye on the ball”.
As complex systems go, the interaction between the ball in flight and the moving fieldsman is still relatively simple. In principle, most of the knowledge needed to compute trajectories and devise an optimal strategy is available: we just don’t have the instruments or the time for analysis and computation. More often, the relevant information is not even potentially knowable. The skill of the sports player is not the result of superior knowledge of the future, but of an ability to employ and execute good strategies for making decisions in a complex and changing world. The same qualities are characteristic of the successful executive. Managers who know the future are more often dangerous fools than great visionaries.”
I think you could say the same about education and medicine: you can say less than you know.
Donald “D.A.” Henderson, an American epidemiologist who led the international war on smallpox that resulted in its eradication in 1980, has died.
“But it was in the fight on smallpox — perhaps the most lethal disease in history and one that killed an estimated 300 million people in the 20th century alone — that he became known around the world…”
“I think it can be fairly said that the smallpox eradication was the single greatest achievement in the history of medicine,” Richard Preston, the best-selling author of volumes including “The Hot Zone,” about the Ebola virus, and “The Demon in the Freezer,” about smallpox, said in an interview. He described Dr. Henderson as a “Sherman tank of a human being — he simply rolled over bureaucrats who got in his way.”