A few months back, I was walking past the entrance of the old Edinburgh Medical School, founded in 1726. A not-so-crazy thought came into my head, one that I could not dismiss: we need to move on from the idea that a Medical School must be situated within a University (and of course, it wasn’t always, anyway). The founding set of ideas that we have struggled with ever since Flexner should now be recast for a very different world. We need to create something new, something that makes sense in terms of a university and something that puts professional training within a professional context. At present, we fail on both of these counts. Rather than integrate we should fracture. We need to search out our own new world.
Specialisation and the division of labour is as old as humanity, and of course it goes way back further when we are talking biology. Adam Smith may have formalised why and how it was important economically but he did not invent it. Most specialisation relies on expertise, at least it used to until Crapita and the like started mining the seams of government ignorance.
The quote below is from an article in the Economist in May this year. It is about Public Health England (PHE) and how since they only possessed 290 contact tracers, they needed to call on those wonderful experts in everything, Serco, to help them out. Of course, expertise in such tasks always used to reside with Local Government, not PHE, but Boris and his bunch of Maoists, when they are not having their eyes tested in the fast lane, have decreed that Local Government — along with the opposition, the judges, the education sector and more — are enemies of the people. Given this mindset, we are left with those whose main area of expertise is commercialising ignorance.
Firms such as Serco, a big contractor, are in talks with the government to provide the workforce. It should be possible to train new recruits fairly quickly—the requirements of the job are similar to those of 111 operators, for whom the training time is just four hours. They will work from a script that guides them through the various stages of an interview [emphasis added].
A while back, I ended up corresponding with somebody in the Scottish government about how misleading their self-help pages on skin disease were: they contained factual errors, and would mislead people seeking medical help. The content had clearly not been written by a medical practitioner — defined as somebody with domain clinical expertise and who might have actually dealt with patients by shaking hands with them. Asking for validation studies or some sort of empirical evidence to support the content was unhelpful, as the content was supplied by another agency and was commercially ‘confidential’. I didn’t follow up because the person I corresponded with clearly knew that his own position was both untenable and uncomfortable. It’s just business: you know, ‘new ways of working’, ‘direction of travel’, and all those other vacuous suitcase terms that just mark a space where reason or domain expertise used to reside.
Rather than making clever machines, or allowing humans to do what only humans can do1, it seems we are content to make humans behave as stupidly as Excel spreadsheets. 111 is not for BoJo et al.; 111 is for poor people waiting to be levelled up, even if the best way to do that is to go straight to A&E. 2
I read about the QALY (quality adjusted life year) during my intercalated degree in 1980-81, when we were exposed to some health economics. It was considered new and interesting at the time. It took me about 10 minutes to sense that it was nonsense, even if I couldn’t quite put my feelings into words that quickly1. The goal was fine, but the methodology was metaphysical in nature, rather than grounded in the world that you could touch with your fingers. At least not if you look at the world through the prism of the natural sciences.
Economists have a disturbing habit of confusing how the world works with their own (strange) ideas of rationality. If only the world could be said to work in a way that was amenable to their methods. When physicists wanted to estimate the speed of light they recognised that they had to create some theory and some technology in order to obtain the correct answer. Embarrassingly — at least from the economists’ point of view — they had to do some experiments and see if their answer made sense when applied to new observations in the external world. Until they had done this, they stayed shtum.
Not so for our economists. Their solution is effectively to agree some conventions, and then define what the speed of light should be. Whether their theory explains the way the world really works is neither here nor there. So QALYs became a make-believe that suited both economists and the technocrats in government. The former, because the need for QALYs became a job creation scheme for health economists (just as evidence-based medicine (EBM) became a lifeline for all those epidemiologists who belatedly realised that much of their subject was methodologically deeply flawed). Technocratic governments liked what the economists brought them because it exiled judgement (and hence blame), allowing human suffering to be traded in arbitrage markets from which they could metaphorically wash their hands — ‘just following the science’, ‘just following the science’ (ring any bells?). Many politicians don’t want to do politics, but they do want to stay in power. As do economists2, who appear pathologically obsessed with rank and status3. The Economist had a nice line earlier this year germane to my doubts:
But unlike poets, economists prefer to quantify their analogies—to measure whether thou art 15% or 20% more lovely and more temperate.
But if you think that artificial models that cannot predict the world are still useful — useful in the way the philosophers’ trolley problems are — then the quote below should indeed make you sit up and stare.
If we’re willing to pay $150,000 for each quality-adjusted extra year of life (a commonly used estimate), then we ought to view a 10% increase in spending per capita as a good investment if it extended average life expectancy by 2.5 days. That number may give readers pause — hence the importance of clarifying our spending priorities and focusing on care that produces real value for patients. With such a focus, we could feel more confident that higher health care spending was worth it.
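The arithmetic behind that claim is easy to check with a back-of-envelope sketch. The per-capita spending figure below is my own assumption (roughly the US level at the time), not a number from the quote:

```python
# Sanity check of the quote's arithmetic.
# Assumption (not from the quote): US per-capita health spending
# of roughly $10,500 per year.
VALUE_PER_QALY = 150_000      # dollars per quality-adjusted life year
EXTRA_DAYS = 2.5              # gain in average life expectancy
SPENDING_PER_CAPITA = 10_500  # assumed per-capita spending, dollars

value_of_gain = VALUE_PER_QALY * EXTRA_DAYS / 365   # ~ $1,027 per head
cost_of_gain = 0.10 * SPENDING_PER_CAPITA           # ~ $1,050 per head

print(f"value of 2.5 extra days: ${value_of_gain:,.0f}")
print(f"cost of a 10% spending rise: ${cost_of_gain:,.0f}")
```

On these assumed figures the two sums are indeed of the same order, which is the whole of the argument: the ‘good investment’ verdict rests entirely on the conventional $150,000 valuation, not on anything measured.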
(Image of Notgeld (emergency money) at top of page from here)
“Doctors need three qualifications: to be able to lie and not get caught; to pretend to be honest; and to cause death without guilt.” So wrote Jean Froissart, a diarist of the Middle Ages, after an outbreak of bubonic plague in the 14th century. Fake news then meant rumours that the plague could be cured by sitting in a sewer, eating decade-old treacle or ingesting arsenic.
Someone in your family has fallen ill with a respiratory infection that has already killed large numbers. Your small house means that you do not have enough room to quarantine them. You have little money, and the hospitals are full. You contact the local public health authority.
Not to worry, you are told: A crew will be by shortly to set up a sturdy, well-ventilated, portable, tiny house in your yard. Once installed, your family member will be free to convalesce in comfort. You can deliver home-cooked meals to their door and communicate through open windows — and a trained nurse will be by for regular examinations. And no, there will be no charge for the house.
A fascinating story by Naomi Klein in the Intercept. Seemingly from a time when government knew what government was for.
This is not a dispatch from some future functional United States, one with a government capable of caring for its people in the midst of spiraling economic carnage and a public health emergency. It’s a dispatch from this country’s past, a time eight decades ago when it similarly found itself in the two-fisted grip of an even deeper economic crisis (the Great Depression), and a surging contagious respiratory illness (tuberculosis).
Whenever I have looked at the CVs of many young doctors or medical students I have often felt saddened at what I take to be the hurdles that many of them have had to jump through to get into medical school. I don’t mean the exams — although there is lots of empty signalling there too — but the enforced attempts to demonstrate that you are a caring person, or one committed to the NHS or the charity sector. I had none of that; nor do I believe it counts for much when you actually become a doctor1. I think it enforces a certain conformity and limits the social breadth of intake to medical school.
However, I did work outside school before going to university, in a variety of jobs from the age of 14 upwards: a greengrocer’s shop on Saturdays, a chip shop (4-11pm on Sundays), a pub (living in for a while 😃), a few weeks on a pig-farm (awful) and my favourite, working at a couple of petrol stations (7am-10pm). These jobs were a great introduction to the black economy and how wonderfully inventive humanity — criminal humanity — can be. Naturally, I was not tempted 😇. Those in the know would even tell you about other types of fraud in different industries, and that people actually got awarded PhDs by studying and documenting the sociology of these structures (‘Is that why you are going to uni?’, I was once asked).
On the theme of that newest of crime genres — cybercrime — there is a wonderful podcast reminding you that if much capitalism is criminal, there is criminal and there is criminal. But many of the iconic structures of modern capitalism — specialisation, outsourcing and the importance of the boundaries between firm and non-firm — are there. Well worth a listen.
I think there is a danger in exaggerating the role of caring and compassion in medicine. I am not saying you do not need them, but rather that I think they are less important than the technical (or professional) skills that are essential for modern medical practice. I want to be treated by people who know how to assess a situation and who can judge with cold reason the results of administering or withholding an intervention. If doctors were once labelled priests with stethoscopes, I want less of the priest bit. Where I think there are faults is in the idea that you can contribute most to humanity by ‘just caring’. The Economist a while back reported on an initiative from the Centre for Effective Altruism in Oxford. The project, labelled the 80,000 Hours initiative, advises people on which careers they should choose in order to maximise their impact on the world. Impact should be judged not on how much a particular profession does, but on how much a person can do as an individual. Here is a quote relating to medicine:
Medicine is another obvious profession for do-gooders. It is not one, however, on which 80,000 Hours is very keen. Rich countries have plenty of doctors, and even the best clinicians can see only one patient at a time. So the impact that a single doctor will have is minimal. Gregory Lewis, a public-health researcher, estimates that adding an additional doctor to America’s labour supply would yield health benefits equivalent to only around four lives saved.
The typical medical student, however, should expect to save closer to no lives at all. Entrance to medical school is competitive. So a student who is accepted would not increase a given country’s total stock of doctors. Instead, she would merely be taking the place of someone who is slightly less qualified. Doctors, though, do make good money, especially in America. A plastic surgeon who donates half of her earnings to charity will probably have much bigger social impact on the margin than an emergency-room doctor who donates none.
Yes, the ‘slightly less qualified’ makes me nervous.
Henry Miller died a few months before I started medical school in Newcastle in 1976. At the time of his death he was VC of the university, having been Dean of Medicine and Professor of Neurology. By today’s standards he was a larger-than-life figure. I like reading what he said about medical education, although with hindsight I think he was wrong about many if not most things. But there was a freshness and sense of spirited independence of mind in his writing that we no longer see in those who run our universities (with some notable exceptions such as Louise Richardson). In the time of COVID we should remember the costs of conformity and patronage.
It would be naive to express surprise at the equanimity with which successive governments have regarded the deteriorating hospital service, since it is in the nature of governments to ignore inconvenient situations until they become scandalous enough to excite powerful public pressure. Nor, perhaps, should one expect patients to be more demanding: their uncomplaining stoicism springs from ignorance and fear rather than fortitude; they are mostly grateful for what they receive and do not know how far it falls short of what is possible. It is less easy to forgive ourselves… Indeed election as president of a college, a vice chancellor, or a member of the University Grants committee usually spells an inevitable preoccupation with the politically practicable, an insidious identification with central authority, and a change of role from informed critic to uncomfortable apologist.
Originally published in the Lancet 1966;2:647-54. (This version from ‘Remembering Henry’, edited by Stephen Lock and Heather Windle).
The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.
The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.
I can go along with both, but it is in the details that the daemons feast. It appears to me that the ‘norms of science’ argument is itself problematic, reminding me of those silly things you learn at school about the scientific method 1. The historical origin of the concept of the scientific method owed more to attempts to brand certain activities in the eyes of those who were not practising scientists 2. As a rough approximation, the people who talk about the scientific method tend not to do science. Of course, in more recent times, the use of the term ‘science’ itself has been a flag for obtaining funding, status or approval. Dermatology is now dermatological sciences; pharmacology is now pharmacological sciences. Even more absurd, in the medical literature I see the term delivery science (and I don’t mean Amazon), or reproducibility science. The demarcation of science from non-science is a hard philosophical problem going back way before Popper; I will not solve it. The danger is that we might end up exiling all those meaningful areas of human rationality that we once — rightly — considered outwith science, but still valued. There is indeed a subject that we might reasonably call medical science(s). It is just not synonymous with the principles and practice of medicine. It is also why political economy is a more useful subject than economics (or worse still, economic sciences).
As with HIV, “an epidemic reveals the fault lines in society. The big one this epidemic has revealed is how we treat the elderly. We often park them in pre-mortuary type institutions and give a bit of money and hope it is OK”.
When the tide goes out you see who is not wearing bathing costumes…
Once there was General Practice, medicine in the image of the late and great Julian Tudor-Hart. Then there was Primary Care. The following article from Pulse made me sit up and wonder whether we have got it right.
Under the five-year contract announced last year, networks were to receive 70% of the funding to employ a pharmacist, a paramedic, a physiotherapist and a physician associate, and 100% of the funding for hiring a social prescriber, by 2023/24… Six more roles will now be added to the scheme from April ‘at the request of PCN clinical directors’ – pharmacy technicians, care co-ordinators, health coaches, dietitians, podiatrists and occupational therapists…PCNs can choose to recruit from the expanded list to ‘make up the workforce they need’…The document added that mental health professionals, including Improving Access to Psychological Therapy (IAPT) therapists, will be added from April 2021 following current pilots…NHS England will also explore the feasibility of adding advanced nurse practitioners (ANPs) to the scheme [emphasis added].
Adam Smith among others pointed out the advantages of specialisation. We owe virtually all of the modern capitalist world to the power of this insight. But we also know that there are opposing forces — and not just those of the Luddites. Just think back to Ronald Coase and the Theory of the Firm. Why do companies not outsource everything? Why are there companies at all? Simply because under some circumstances transaction costs and the formalisation of roles and contracts limit outsourcing1. Contra the English approach is that of Buurtzorg (links here, here and here) in the Netherlands, where it is explicit that many of the tasks undertaken by highly skilled staff do not require high level skills. But — so the argument goes — the approach is more successful, robust and rewarding for both patients and staff. This is closer to the Tudor-Hart model. It really does depend on what sort of widgets you are dealing with, and whether fragmentation of activity improves outcomes, or merely diminishes costs in situations where outcomes are hard to define in an Excel spreadsheet.
From the Financial Times a few months back, this is a story about Facebook and the life of its content moderators (as in, the people who watch those videos of obscene or violent acts).
The document was distributed to all moderators at the European facility in early January via email, asking them to sign it immediately. It stated: “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).”
The two-page form also outlines Accenture’s WeCare programme, which provides employees with access to “wellness coaches” from whom they can receive mental health support. The company says, however, that “the wellness coach is not a medical doctor and cannot diagnose or treat mental disorders”.
What caught my eye was a phrase that you see more and more: ‘X is not a medical doctor and cannot diagnose or treat mental disorders’. ‘X’ can be a person or simply some text on a web page.
Too much of modern hypercapitalism is about arbitrage between honesty or morality, and the law as it is codified. The business model is dishonesty, or worse. It would be impossible to act this way if the contact was human rather than digital. This is the very feature (not a bug!) that allows the push-button extinguishing of civilian lives in far-away parts of the world by people who drive to work and pick up the kids on the way home. As for medicine, corporations will insist on exploiting similar fault lines. There was once a time when ‘medicine’ was a small part of the economy, and when it could play by an insular set of rules that both society and the practitioners recognised, if not agreed upon. Nowadays there is simply too much money to be made.
A comment from Icarus Fallen on this article:
In the new social network gig economy, your mental health has an hourly price. Try not to sell it.
From an article in the LRB by the historian of science, Steven Shapin. The book under review is The Mosquito: A Human History of Our Deadliest Predator by Timothy Winegard. The story — if you can call it that — is malaria.
There’s a pub quiz question: ‘What’s the deadliest animal?’ Lots of people guess sharks (just four deaths a year), lions (a hundred), or crocodiles (a thousand). The animal that causes the second highest number of human deaths is other humans (475,000), but the answer is the mosquito, at 750,000 deaths, many of them caused by diseases other than malaria.
The subsequent destruction of the Pontine hydraulic works was also an act of war. On the advice of German malariologists, the Wehrmacht, retreating from southern Italy in the winter of 1943-44, flooded the Pontine Marshes with seawater to bring back mosquitoes – and malaria – as an obstacle to the Allied forces who were landing at Anzio, south of Rome, as well as to punish the Italians, who had just switched sides. The outcome of the Battle of Anzio wasn’t much affected by the Nazis’ act of biological warfare – both sides suffered – but it had a marked effect on Italian civilians: in 1939, there were 614 cases of malaria in the area; in 1944, there were 54,929.
The wretched of the earth suffer from underdevelopment, which is both a cause of their malarial afflictions and an effect of malaria. And they suffer from political indifference, as the jobs of prevention and cure have increasingly been off-loaded onto charitable foundations: the Rockefeller Foundation in the early part of the 20th century, then the Gates Foundation, which now spends more on global health than the World Health Organisation. Bill Gates has pointed out repeatedly that more money goes into curing male baldness than into research on the prevention and cure of malaria [emphasis added]. Capitalism is ‘flawed’, he says, and the persistence of malaria is a failure of the marketplace.
The political swamp breeds the inequality and poverty on which malaria thrives; the physical swamp breeds its insect vector. Drain the swamps.
Facts, dear boy. Facts.
Just because your doctor has a name for your condition, doesn’t mean he knows what it is — Franz Kafka.
I hadn’t come across this quote by Franz Kafka before. It is of course true, but the converse is even more worrying. I like Sam Shuster’s aphorism better: the worst thing you can do is make a diagnosis (because it stops you thinking about what really is going on).
Even those who liked it at the beginning are becoming wary of the creeping clapping fascism.
Why is digital healthcare full of promises that don’t deliver? Why is interoperability such a big problem? Why are health IT systems so unreliable and hard to use, and so much worse than consumer devices?
I put digital health’s problems down to ‘cat thinking’. Cat Thinking, which is fine for consumer products, promises simple, exciting health IT solutions, regardless of evidence and hard science. Unfortunately Cat Thinking misdirects politicians, funders and referees – as well as doctors and hospitals – into thinking that digital health is an easy win.
There is lots about covid-19 that I do not understand — the biology and all that. But the NHS and government’s responses are something else. I find it hard not to assume that every statement has an ulterior motive: they are, it seems, strangers to the truth. Here is Bruce Schneier (the security guru as the Economist once called him).
“My problem with contact tracing apps is that they have absolutely no value,” Bruce Schneier, a privacy expert and fellow at the Berkman Klein Center for Internet & Society at Harvard University, told BuzzFeed News. “I’m not even talking about the privacy concerns, I mean the efficacy. Does anybody think this will do something useful? … This is just something governments want to do for the hell of it. To me, it’s just techies doing techie things because they don’t know what else to do.”
I haven’t blogged about this because I thought it was obvious. But from the tweets and emails I have received, it seems not.
It has nothing to do with privacy concerns. The idea that contact tracing can be done with an app, and not human health professionals is just plain dumb.
Testing, testing and more testing, please.
But, as R.H. Tawney once observed, shifts to collective provision are only realised after demonstrations that ‘high individual incomes will not purchase the mass of mankind immunity from cholera, typhus and ignorance’: many elements of the coming future ought to be favourable to the left, though only if they are shaped politically, and if blame – always elusive in the UK’s diffuse system of responsibility – is correctly apportioned.
One of the pleasures of retirement from medical practice is not being on the General Medical Council (GMC) register. If you were able to listen in on many doctors’ private conversations, and run some Google word analytics, the word you might find in closest proximity to the term General Medical Council (GMC) would be loathe. There would be other less polite words, too. As the BMJ once wrote: there is very little in British medicine that the GMC cannot make worse. It is a legalised extortion racket that fails to protect the public, messes up medical education and makes many doctors’ lives miserable.
The following are quotes from the Lancet and the FT. They are about the horrendous crimes perpetrated by a surgeon, Ian Paterson. The full Independent Inquiry report can be found here. I am not surprised by anything I have read in the investigation into these crimes and the attacks on those who attempted to draw attention to them.
Health-care workers reporting concerns often come under substantial pressure from health-care management, and sometimes have to justify their own practice and reasons for speaking out. Four of the health-care professionals who did report Paterson were subject to fitness to practise scrutiny by the GMC during the later investigation because they had worked alongside him.
The FT draws up some lessons. Here is number four:
The fourth lesson is that those who speak up are likely to suffer. Some of Paterson’s colleagues were worried about his practices. When six doctors raised concerns with the chief executive of the NHS trust where Paterson worked, four were themselves investigated by the General Medical Council because they had worked with him.
Maybe after clapping this Thursday evening people need to take a long hard look at the culture of NHS governance and its proxies in the UK. Pandemics just open up the cracks of incompetence that are hidden in plain sight.
A query with the catchy expression “global pandemic” or “global pandemic preparedness” in scientific databases, restricted to a 2009–19 range, will return more than 1400 results in JAMA (Journal of the American Medical Association), 30 in-depth papers in ArXiv (Cornell University), and a stunning 17,000 results in Google Scholar, which aggregates multiple repositories. As for the general public, it had the choice between no less than 98 TED Talks on the matter.
We had no excuses.
Just before the H1N1 episode in 2009, France had accumulated an inventory of 1 billion high protection masks (N95 equivalent). It was the consequence of the SARS epidemic. In the same way, the government had stored 20 million doses of vaccine. Later, the Health Ministry responsible for this precaution was blasted for this “excessive” stockpile — which was eventually destroyed as it decayed.
Frederic Filloux makes (and has for a while been making) an argument about journalism and journalism schools that I have not seen advanced by anybody else. The changing economics of the press mean that the modern Fourth Estate lacks expertise across many domains of modern life. He suggests that journalism schools need to regroup, change how they work, and take advantage of the fact that most expertise will reside with those who did not go to journalism school in their 20s. Rather, the press will need to rely on those with professional skills gained in particular domains. He writes:
The shortage of experts is also rooted in a priority shift that plagues major news organizations. All of them became obsessed with not being left in the dust by digital-native organizations riding the wave of social networks. As a consequence, newsroom managers, supported by bean-counters, found it clever to hire bunches of expendable digital “content” serfs who were mandated to keep up with the social frenzy. It was seen as a better investment than keeping a former doctor turned medical correspondent, even if he or she was loaded with decades of expertise, able to lean on a reliable network of practitioners, surgeons, epidemiologists, public health officials, etc. A pure cost vs. benefit choice, and ultimately a bad one.
I do not think there will be any shortage of candidates who possess medical degrees and medical experience.
We are living in dark times, and since I have been sifting through the ashes of a career, it is no surprise that failures signal through like radioactive tracers. Below is one.
Through most of my career I have been interested in the relation between science and medicine. In truth, if what matters is what you think about in the shower, I have been more interested in the relation between science and medicine than I have been interested in either activity in isolation. If I were to use a phrase to describe my focus, although it is a term that I would not have used then, I am interested in the epistemological foundations of medical practice. Pompous, I agree. I could use another phrase: what makes medicine and doctors useful? Thinking about statistical inference is a part of this topic, but there is much more to explore.
These issues became closer to my consciousness soon after I moved to Edinburgh. My ideas about what was going on were not shared by many locally, and I was nervous about going public in person rather than in print at a Symposium hosted by the Royal College of Physicians of Edinburgh. My nervousness was well founded: whilst I liked my abstract, my talk went down badly, not least because it was truly dreadful (and the evident failure still rankles). Jan Vandenbroucke, one of the other speakers and somebody whose work I greatly admire (his paper ‘Homoeopathy trials: going nowhere’, Lancet 1997;350:824, was to me the most important paper published in the Lancet in the 1990s), said some kind words to me afterwards, muttering that I had tried to say far too much to an audience that was ill prepared for my speculations. All true, but he was just being kind. It was worse than that.
Anyway, some tidying up deep in my hard drive surfaced the abstract. I still like it, but it is a shame that at the appropriate time I was unable to explain why.
JAMES LIND SYMPOSIUM: From scurvy to systematic reviews and clinical guidelines: how can clinical research lead to better patient care? (31-10-2003, RCPE Edinburgh)
There are three great branches of science: theory, experiment, and computation. (Nick Trefethen)
Advance in the mid-third of the twentieth century, the golden age of medical research, was predicated on earlier discoveries in the nineteenth century in both physiology and medicinal chemistry (1). Genetics dominated biology in the latter third of the twentieth century and many believe changes in medical practice will owe much to genetics over the next third of the century (1). I disagree, and I will give an alternative view more credence: in 30 years’ time we will look back more to von Neumann and Morgenstern than we will to Watson and Crick. What the Nobel laureate Herbert Simon referred to as The Sciences of the Artificial (2), subjects which have largely been peripheral to medicine, will become central.
Over the last 20 years we have seen the first (largely inadequate, I would add) attempts to explicitly demarcate methods of obtaining and promulgating knowledge about clinical practice (3,4). This has usually taken the form of proselytising a particular set of terms – systematic reviews, evidence-based practice, guidelines and the like – terms that have little rigour and little to commend them. What is interesting, however, is that they reflect a long overdue renaissance of interest in the practice of medicine and medical epistemology.
The change of emphasis from the natural to the artificial is being driven by a number of forces, mostly extraneous to biomedicine: the increasing instrumental role of science in medicine and society; the increase in corporatisation of knowledge, whether by private corporations or monopsonistic institutions like the NHS (5); the rising costs of healthcare; and a remaining inability to frame questions with broad support about how to choose between alternative disease states at the level of society (6,7).
I will try to illustrate some of these issues by the use of three examples. First, the widespread use of a mode of statistical inference largely ill-suited to medicine, namely Neyman-Pearson hypothesis testing (decision-making), and the way in which this paradigm has been used to undermine expert opinion (8). Second, I will argue that we need to think much harder about clinical practice and fashion a more appropriate theoretical underpinning for clinical behaviour. Third, I will suggest how UK medical schools, in so far as they remain interested in clinical practice, should look to alternative models, perhaps business and law schools, for ideas of how they should operate (2).
Afterword. The symposium used structured abstracts, a habit that might have a place somewhere in this galaxy, but out of choice I would prefer to live in another one. Anyway, in the published version, it reads:
A fair cop.
Alfred North Whitehead: “Some of the major disasters of mankind have been produced by the narrowness of men with a good methodology” (The Function of Reason).
Comments that seem germane to some of our current day covid-19 debates.
People are always demanding that medical students must learn this or that (obesity, psychiatry, dermatology, ID, eating disorders). The result is curriculum overload, a default in favour of rote learning by many students, and the inhibition of curiosity. It was not meant to be like this, but the GMC, the NHS, and others have pushed a vision of university medical education that shortchanges both the students and medical practice over the long term. Short-termism rules. Instead of producing graduates who are ready to learn clinical medicine in an area of their choice, we expect them to somehow come out oven-ready at graduation. I do not believe it is possible to do this to a level of safety that many other professions demand, nor is this the primary job of a university. Sadly, universities have given up on arguing, intimidated by the government and their regulatory commissars, and nervous of losing their monopoly on producing doctors.
But I will make a plea that one area really does deserve more attention within a university: the history of how medical advance occurs. No, I do not mean MCQs asking for the date of birth of Robert Koch or Lord Lister, but a feel for the historical interplay of convention and novelty. Without this our students and our graduates are almost confined to living in the present, unaware of the past, and unable to imagine how different the future will be. Below is one example.
“In 1938 Albert Hofmann, a chemist at the Sandoz Laboratories in Basel, created a series of new compounds from lysergic acid. One of them, later marketed as Hydergine, showed great potential for the treatment of cerebral arteriosclerosis. Another salt, the diethylamide (LSD), he put to one side, but he had “a peculiar presentiment,” as he put it in his memoir LSD: My Problem Child (1980), “that this substance could possess properties other than those established in the first investigations.”
In 1943 he prepared a fresh batch of LSD. In the final process of its crystallization, he started to experience strange sensations. He described his first inadvertent “trip” in a letter to his supervisor:
At home I lay down and sank into a not unpleasant, intoxicated-like condition, characterized by extremely stimulated imagination. In a dream-like state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.
After eliminating chloroform fumes as a possible cause, he concluded that a tiny quantity of LSD absorbed through the skin of his fingertips must have been responsible. Three days later he began a program of unsanctioned research and deliberately ingested 250 micrograms of LSD at 4:20 PM. Forty minutes later, he wrote in his lab journal, “Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.” He set off home on his bicycle, accompanied by his laboratory assistant. This formal trial of what Hofmann considered a minute dose of LSD had more distressing effects than his first chance exposure:
Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa…. I was taken to another world, another place, another time.
A doctor was summoned but found nothing amiss apart from a marked dilation of his pupils. A fear of impending death gradually faded as the drug’s effect lessened, and after some hours Hofmann was seeing surreal colors and enjoying the play of shapes before his eyes.
Many editors of learned medical journals now automatically turn down publications describing the sort of scientific investigation that Albert Hofmann carried out on himself. Institutional review boards are often scathing in their criticism of self-experimentation, despite its hallowed tradition in medicine, because they consider it subjective and biased. But the human desire to alter consciousness and enrich self-awareness shows no sign of receding, and someone must always go first. As long as care and diligence accompany the sort of personal research conducted by Pollan and Lin, it has the potential to be as revealing and informative as any work on psychedelic drugs conducted within the rigid confines of universities.
Richard Horton in the Lancet writes:
Imagine if the entire edifice of knowledge in medicine was built upon a falsehood. Systematic reviews are said to be the highest standard of evidence-based health care. Regularly updated to ensure that treatment decisions are built on the most up-to-date and reliable science, systematic reviews and meta-analyses are widely used to inform clinical guidelines and decision making. Powerful organisations have emerged to construct a knowledge base in medicine underpinned by the results of systematic reviews. One such organisation is Cochrane, with 11 000 members in over 130 countries. This extraordinary movement of people is justifiably passionate about the idea that it is contributing to better health outcomes for everyone, everywhere. The industry that drives the production of systematic reviews today is financed by some of the most influential agencies in medical research. Cochrane, for example, points to three funders providing over £1 million each—the UK’s National Institute for Health Research (NIHR), the US National Institutes of Health (NIH), and Australia’s National Health and Medical Research Council (NHMRC).
Well, it really is a bit late for all this soul searching. See my earlier post here ‘Mega-silliness’ (commenting on what others had already pointed out); or my Evidence Based Medicine: the Epistemology That Isn’t, written over 20 years ago; and my contribution to the wake (even if I didn’t put my hand in my pocket), Why we should let “evidence-based medicine” rest in peace. The genesis of EBM was as a cult whose foundational myth was that P values could act as a truth machine. Those followers who had originally hoped for a place in the promised afterlife soon settled for paying the bills, and EBM morphed into a career opportunity for those who found accountancy too daring. So, pace John Mayall on Jazz Blues Fusion, don’t come here to listen to an old record. I promise.
Dr Chris Day writes:
Two weeks ago, I swabbed my first positive Covid-19 patient during an A&E Locum shift. I must say back then, I hadn’t fully taken in what we as a country will have to face over the coming months. The reports from colleagues in Italy and China are beyond belief.
The UK has been left to fight Covid-19 with half the Intensive Care beds per capita of Italy. Back in 2014, the trigger for my whistleblowing case was my attempt to try and secure more ICU resources for South East London (see Private Eye).
Instead of spending 5 years and £700k fighting/smearing me and damaging whistleblowing law, the NHS could have just fixed the problem. There has never been a more important time for the public and the politicians to understand Intensive Care resourcing and what is decided on their behalf by NHS leaders.
Two letters in Lancet Oncology. This bloody story never ends. We have not invented truth machines: judgement has never been exiled from discovery.
Since he shared every passing observation online, it was not surprising that on December 30th he put up a post about an odd cluster of pneumonia cases at the hospital. They were unexplained, but the patients were in quarantine, and they had all worked in the same place, the pungent litter-strewn warren of stalls that made up the local seafood market. Immediately this looked like person-to-person transmission to him, even if it might have come initially from bats, or some other delicacy. Immediately, too, it raised the spectre of the SARS virus of 2002-03 which had killed more than 700 people. He therefore decided to warn his private WeChat group, all fellow alumni from Wuhan University, to take precautions. He headed the post: “Seven cases of SARS in the Huanan Wholesale Seafood Market”. That was his mistake.
The trouble was that he did not know whether it was actually SARS. He had posted it too fast. In an hour he corrected it, explaining that although it was a coronavirus, like SARS, it had not been identified yet. But to his horror he was too late: his first post had already gone viral, with his name and occupation undeleted, so that in the middle of the night he was called in for a dressing down at the hospital, and on January 3rd he was summoned to the police station.
But most of Case and Deaton’s ire focuses on the health care industry, which not only underperforms but is also wrecking the US economy. We [USA] spend twice per capita what France spends on health care, but our life expectancy is four years shorter, our rates of maternal and infant death are almost twice as high, and, unlike the French, we leave 30 million people uninsured. The amount Americans spend unnecessarily on health care weighs more heavily on our economy, Case and Deaton write, than the Versailles treaty reparations did on Germany’s in the 1920s. If, decades ago, we’d built a health system like Switzerland’s, which costs 30 percent less per capita than ours does, we’d now have an extra trillion dollars a year to spend, for example, on replacing the pipes in the nearly four thousand US counties where lead levels in drinking water exceed those of Flint, Michigan, and on rebuilding America’s bridges, railroads, and highways—now so rundown that FedEx replaces delivery van tires twice as often as it did twenty years ago.
In the US, health insurance accounts for 60 percent of the cost of hiring a low-wage worker. Many employers opt instead to hire contract workers with no benefits, or illegal immigrants with no rights at all.
Terrific article on Covid-19 (SARS-CoV-2) in the LRB by Rupert Beale. He says it was written in haste, but it doesn’t read that way. It contains some memorable lines.
As the US health secretary Michael Leavitt put it in 2006, ‘anything we say in advance of a pandemic happening is alarmist; anything we say afterwards is inadequate.’
And how do you think hard about research funding for the long term? (I am old enough to remember when stroke and dementia were virtually non-subjects as far as ‘good research funding’ was concerned.)
Virologists need more than clever tricks: we also need cash. Twenty years ago, funding wasn’t available to study coronaviruses. In 1999, avian infectious bronchitis virus was the one known truly nasty coronavirus pathogen. Only poultry farmers really cared about it, as it kills chickens but doesn’t infect people. In humans there are a number of fairly innocuous coronaviruses, such as OC43 and HKU1, which cause the ‘common cold’. Doctors don’t usually bother testing for them – you have a runny nose, so what?
And note the conditional tense:
The global case fatality rate is above 3 per cent at the moment, and if – reasonable worst case scenario – 30-70 per cent of the 7.8 billion people on earth are infected, that means between 70 and 165 million deaths. It would be the worst disaster in human history in terms of total lives lost. Nobody expects this, because everyone expects that people will comply with efficient public health measures put in place by responsible governments.
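The quoted range is easy to verify. A back-of-the-envelope sketch (my own check, not part of Beale’s article), multiplying the world population by the assumed attack rate and the 3 per cent case fatality rate:

```python
# Sanity check of the quoted worst-case range:
# deaths = population * attack_rate * case_fatality_rate
population = 7.8e9  # people on earth
cfr = 0.03          # "above 3 per cent at the moment"

low = population * 0.30 * cfr   # 30% of the population infected
high = population * 0.70 * cfr  # 70% of the population infected

print(f"{low / 1e6:.0f} to {high / 1e6:.0f} million deaths")
```

This gives roughly 70 to 164 million, consistent with the “between 70 and 165 million deaths” in the article.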
And to repeat my own mantra (stolen from elsewhere): the opposite of science is not art, but politics:
The situation isn’t helped by a president [Trump] who keeps suggesting that the virus isn’t that bad, it’s a bit like flu, we will have a vaccine soon: stopping flights from China was enough. Tony Fauci, the director of the National Institute of Allergy and Infectious Disease, deftly cut across Trump at a White House press briefing. No, it isn’t only as bad as flu, it’s far more dangerous. Yes, public health measures will have to be put in place and maintained for many months. No, a vaccine isn’t just around the corner, it will take at least 18 months. Fauci was then ordered to clear all his press briefings on Covid-19 with Mike Pence in advance: the vice president’s office is leading the US response to the virus. ‘You don’t want to go to war with a president,’ Fauci remarked.
And Beale ends by quoting an ID colleague.
This is not business as usual. This will be different from what anyone living has ever experienced. The closest comparator is 1918 influenza.
Caution: pace the author, ‘This is a fast-moving situation, and the numbers are constantly changing – certainly the ones I have given here will be out of date by the time you read this.’
Link. (London Review of Books: Vol. 42 No. 5, 5 March 2020: “Wash your Hands”: Rupert Beale)
I titled a recent post musing over my career as ‘The Thrill is Gone’. But I ended on an optimistic note:
‘The baton gets handed on. The thrill goes on. And on’
But there are good reasons to think otherwise. Below is a quote from a recent letter in the Lancet by Gaurab Bhatnagar. You can argue all you like about definitions of ‘burnout’, but good young people are leaving medicine. The numbers who leave for ever may not be large, but I think some of the best are going. What worries me as much are those who stay behind.
The consequences of physician burnout have been clearly observed in the English National Health Service (NHS). F2 doctors (those who are in their second foundation year after medical school) can traditionally go on to apply to higher specialist training. Recent years have seen an astounding drop in F2 doctors willing to continue NHS training, with just over a third (37·7%) of F2 doctors applying to continue training in 2018, a decrease from 71·3% in 2011. Those taking a career break from medicine increased almost 3-fold from 4·6% to 14·6%. With the NHS already 10 000 doctors short, the consequences of not recruiting and retaining our junior workforce will be devastating.