As with HIV, “an epidemic reveals the fault lines in society. The big one this epidemic has revealed is how we treat the elderly. We often park them in pre-mortuary type institutions and give a bit of money and hope it is OK”.
When the tide goes out you see who is not wearing bathing costumes…
Once there was General Practice, medicine in the image of the late and great Julian Tudor-Hart. Then there was Primary Care. The following article from Pulse made me sit up and wonder whether we have got it right.
Under the five-year contract announced last year, networks were to receive 70% of the funding to employ a pharmacist, a paramedic, a physiotherapist and a physician associate, and 100% of the funding for hiring a social prescriber, by 2023/24… Six more roles will now be added to the scheme from April ‘at the request of PCN clinical directors’ – pharmacy technicians, care co-ordinators, health coaches, dietitians, podiatrists and occupational therapists… PCNs can choose to recruit from the expanded list to ‘make up the workforce they need’… The document added that mental health professionals, including Improving Access to Psychological Therapy (IAPT) therapists, will be added from April 2021 following current pilots… NHS England will also explore the feasibility of adding advanced nurse practitioners (ANPs) to the scheme [emphasis added].
Adam Smith among others pointed out the advantages of specialisation. We owe virtually all of the modern capitalist world to the power of this insight. But we also know that there are opposing forces — and not just those of the Luddites. Just think back to Ronald Coase and ‘The Nature of the Firm’. Why do companies not outsource everything? Why are there companies at all? Simply because under some circumstances transaction costs and the formalisation of roles and contracts limit outsourcing. Contrast the English approach with that of Buurtzorg (links here, here and here) in the Netherlands, where it is explicit that many of the tasks undertaken by highly skilled staff do not require high-level skills. But — so the argument goes — the approach is more successful, robust and rewarding for both patients and staff. This is closer to the Tudor-Hart model. It really does depend on what sort of widgets you are dealing with, and whether fragmentation of activity improves outcomes, or merely diminishes costs in situations where outcomes are hard to define in an Excel spreadsheet.
From the Financial Times a few months back, this is a story about Facebook and the life of its content moderators (as in, the people who watch those videos of obscene or violent acts).
The document was distributed to all moderators at the European facility in early January via email, asking them to sign it immediately. It stated: “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).”
The two-page form also outlines Accenture’s WeCare programme, which provides employees with access to “wellness coaches” from whom they can receive mental health support. The company says, however, that “the wellness coach is not a medical doctor and cannot diagnose or treat mental disorders”.
What caught my eye was a phrase that you see more and more: ‘X is not a medical doctor and cannot diagnose or treat mental disorders’. ‘X’ can be a person or simply some text on a web page.
Too much of modern hypercapitalism is about arbitrage between honesty or morality, and the law as it is codified. The business model is dishonesty, or worse. It would be impossible to act this way if the contact were human rather than digital. This is the very feature (not a bug!) that allows the push-button extinguishing of civilian lives in far-away parts of the world by people who drive to work and pick up the kids on the way home. As for medicine, corporations will insist on exploiting similar fault lines. There was once a time when ‘medicine’ was a small part of the economy, and when it could play by an insular set of rules that both society and the practitioners recognised, if not agreed upon. Nowadays there is simply too much money to be made.
A comment from Icarus Fallen on this article:
In the new social network gig economy, your mental health has an hourly price. Try not to sell it.
From an article in the LRB by the historian of science, Steven Shapin. The book under review is The Mosquito: A Human History of Our Deadliest Predator by Timothy Winegard. The story — if you can call it that — is malaria.
There’s a pub quiz question: ‘What’s the deadliest animal?’ Lots of people guess sharks (just four deaths a year), lions (a hundred), or crocodiles (a thousand). The animal that causes the second highest number of human deaths is other humans (475,000), but the answer is the mosquito, at 750,000 deaths, many of them caused by diseases other than malaria.
The subsequent destruction of the Pontine hydraulic works was also an act of war. On the advice of German malariologists, the Wehrmacht, retreating from southern Italy in the winter of 1943-44, flooded the Pontine Marshes with seawater to bring back mosquitoes – and malaria – as an obstacle to the Allied forces who were landing at Anzio, south of Rome, as well as to punish the Italians, who had just switched sides. The outcome of the Battle of Anzio wasn’t much affected by the Nazis’ act of biological warfare – both sides suffered – but it had a marked effect on Italian civilians: in 1939, there were 614 cases of malaria in the area; in 1944, there were 54,929.
The wretched of the earth suffer from underdevelopment, which is both a cause of their malarial afflictions and an effect of malaria. And they suffer from political indifference, as the jobs of prevention and cure have increasingly been off-loaded onto charitable foundations: the Rockefeller Foundation in the early part of the 20th century, then the Gates Foundation, which now spends more on global health than the World Health Organisation. Bill Gates has pointed out repeatedly that more money goes into curing male baldness than into research on the prevention and cure of malaria [emphasis added]. Capitalism is ‘flawed’, he says, and the persistence of malaria is a failure of the marketplace.
The political swamp breeds the inequality and poverty on which malaria thrives; the physical swamp breeds its insect vector. Drain the swamps.
Facts, dear boy. Facts.
Just because your doctor has a name for your condition, doesn’t mean he knows what it is — Franz Kafka.
I hadn’t come across this quote by Franz Kafka before. It is of course true, but the converse is even more worrying. I like Sam Shuster’s aphorism better: the worst thing you can do is make a diagnosis (because it stops you thinking about what really is going on).
Even those who liked it at the beginning are becoming wary of the creeping clapping fascism.
Why is digital healthcare full of promises that don’t deliver? Why is interoperability such a big problem? Why are health IT systems so unreliable and hard to use, and so much worse than consumer devices?
I put digital health’s problems down to ‘cat thinking’. Cat Thinking, which is fine for consumer products, promises simple, exciting health IT solutions, regardless of evidence and hard science. Unfortunately Cat Thinking misdirects politicians, funders and referees – as well as doctors and hospitals – into thinking that digital health is an easy win.
There is lots about covid-19 that I do not understand — the biology and all that. But the NHS and government’s responses are something else. I find it hard not to assume that every statement has an ulterior motive: they are, it seems, strangers to the truth. Here is Bruce Schneier (the security guru as the Economist once called him).
“My problem with contact tracing apps is that they have absolutely no value,” Bruce Schneier, a privacy expert and fellow at the Berkman Klein Center for Internet & Society at Harvard University, told BuzzFeed News. “I’m not even talking about the privacy concerns, I mean the efficacy. Does anybody think this will do something useful? … This is just something governments want to do for the hell of it. To me, it’s just techies doing techie things because they don’t know what else to do.”
I haven’t blogged about this because I thought it was obvious. But from the tweets and emails I have received, it seems not.
It has nothing to do with privacy concerns. The idea that contact tracing can be done with an app, and not human health professionals is just plain dumb.
Testing, testing and more testing, please.
But, as R.H. Tawney once observed, shifts to collective provision are only realised after demonstrations that ‘high individual incomes will not purchase the mass of mankind immunity from cholera, typhus and ignorance’: many elements of the coming future ought to be favourable to the left, though only if they are shaped politically, and if blame – always elusive in the UK’s diffuse system of responsibility – is correctly apportioned.
One of the pleasures of retirement from medical practice is not being on the General Medical Council (GMC) register. If you were able to listen in on many doctors’ private conversations, and run some Google word analytics, the word you might find in closest proximity to ‘General Medical Council’ would be ‘loathe’. There would be other, less polite words, too. As the BMJ once wrote: there is very little in British medicine that the GMC cannot make worse. It is a legalised extortion racket that fails to protect the public, messes up medical education and makes many doctors’ lives miserable.
The following are quotes from the Lancet and the FT. They are about the horrendous crimes perpetrated by a surgeon, Ian Paterson. The full Independent Inquiry report can be found here. I am not surprised by anything I have read in the investigation into these crimes and the attacks on those who attempted to draw attention to them.
Health-care workers reporting concerns often come under substantial pressure from health-care management, and sometimes have to justify their own practice and reasons for speaking out. Four of the health-care professionals who did report Paterson were subject to fitness to practise scrutiny by the GMC during the later investigation because they had worked alongside him.
The FT draws up some lessons. Here is number four:
The fourth lesson is that those who speak up are likely to suffer. Some of Paterson’s colleagues were worried about his practices. When six doctors raised concerns with the chief executive of the NHS trust where Paterson worked, four were themselves investigated by the General Medical Council because they had worked with him.
Maybe after clapping this Thursday evening people need to take a long hard look at the culture of NHS governance and its proxies in the UK. Pandemics just open up the cracks of incompetence that are hidden in plain sight.
A query with the catchy expression “global pandemic” or “global pandemic preparedness” in scientific databases, restricted to a 2009–19 range, will return more than 1400 results in JAMA (Journal of the American Medical Association), 30 in-depth papers in ArXiv (Cornell University), and a stunning 17,000 results in Google Scholar, which aggregates multiple repositories. As for the general public, it had the choice of no fewer than 98 TED Talks on the matter.
We had no excuses.
Just before the H1N1 episode in 2009, France had accumulated an inventory of 1 billion high protection masks (N95 equivalent). It was the consequence of the SARS epidemic. In the same way, the government had stored 20 million doses of vaccine. Later, the Health Ministry responsible for this precaution was blasted for this “excessive” stockpile — which was eventually destroyed as it decayed.
Frédéric Filloux makes (and has for a while been making) an argument about journalism and journalism schools that I have not seen advanced by anybody else. The changing economics of the press mean that the modern Fourth Estate lacks expertise across many domains of modern life. He suggests that journalism schools need to regroup, change how they work, and take advantage of the fact that most expertise will reside with those who did not go to journalism school in their 20s. Rather, the press will need to rely on those with professional skills gained in particular domains. He writes:
The shortage of experts is also rooted in a priority shift that plagues major news organizations. All of them became obsessed with not being left in the dust by digital-native organizations riding the wave of social networks. As a consequence, newsroom managers, supported by bean-counters, found it clever to hire bunches of expendable digital “content” serfs who were mandated to keep up with the social frenzy. It was seen as a better investment than keeping a former doctor turned medical correspondent, even if he or she was loaded with decades of expertise, able to lean on a reliable network of practitioners, surgeons, epidemiologists, public health officials, etc. A pure cost vs. benefit choice, and ultimately a bad one.
I do not think there will be any shortage of candidates who possess medical degrees and medical experience.
We are living in dark times, and since I have been sifting through the ashes of a career, it is no surprise that failures show up like radioactive tracers. Below is one.
Through most of my career I have been interested in the relation between science and medicine. In truth, if what matters is what you think about in the shower, I have been more interested in the relation between science and medicine than in either activity in isolation. If I were to use a phrase to describe my focus (although it is a term I would not have used then), it would be the epistemological foundations of medical practice. Pompous, I agree. I could use another phrase: what makes medicine and doctors useful? Thinking about statistical inference is a part of this topic, but there is much more to explore.
These issues became closer to my consciousness soon after I moved to Edinburgh. My ideas about what was going on were not shared by many locally, and I was nervous about going public in person rather than in print at a symposium hosted by the Royal College of Physicians of Edinburgh. My nervousness was well founded: whilst I liked my abstract, my talk went down badly, not least because it was truly dreadful (and the evident failure still rankles). Jan Vandenbroucke, one of the other speakers and somebody whose work I greatly admire (his paper ‘Homoeopathy trials: going nowhere’, Lancet 1997;350:824, was to me the most important paper published in the Lancet in the 1990s), said some kind words to me afterwards, muttering that I had tried to say far too much to an audience that was ill prepared for my speculations. All true, but he was just being kind. It was worse than that.
Anyway, some tidying up deep in my hard drive surfaced the abstract. I still like it, but it is a shame that at the appropriate time I was unable to explain why.
JAMES LIND SYMPOSIUM: From scurvy to systematic reviews and clinical guidelines: how can clinical research lead to better patient care? (31-10-2003, RCPE Edinburgh)
There are three great branches of science: theory, experiment, and computation. (Nick Trefethen)
Advance in the mid-third of the twentieth century, the golden age of medical research, was predicated on earlier discoveries in the nineteenth century in both physiology and medicinal chemistry (1). Genetics dominated biology in the latter third of the twentieth century and many believe changes in medical practice will owe much to genetics over the next third of a century (1). I disagree, and I will give an alternative view more credence: in 30 years’ time we will look back more to von Neumann and Morgenstern than we will to Watson and Crick. What the Nobel laureate Herbert Simon referred to as The Sciences of the Artificial (2), subjects which have largely been peripheral to medicine, will become central.
Over the last 20 years we have seen the first (largely inadequate, I would add) attempts to explicitly demarcate methods of obtaining and promulgating knowledge about clinical practice (3,4). This has usually taken the form of proselytising a particular set of terms – systematic reviews, evidence-based practice, guidelines and the like – terms that have little rigour and little to commend them. What is interesting, however, is that they reflect a long overdue renaissance of interest in the practice of medicine and medical epistemology.
The change of emphasis from the natural to the artificial is being driven by a number of forces, mostly extraneous to biomedicine: the increasing instrumental role of science in medicine and society; the increase in corporatisation of knowledge, whether by private corporations or monopsonistic institutions like the NHS (5); the rising costs of healthcare; and a remaining inability to frame questions with broad support about how to choose between alternative disease states at the level of society (6,7).
I will try to illustrate some of these issues by the use of three examples. First, the widespread use of a mode of statistical inference largely ill-suited to medicine, namely Neyman-Pearson hypothesis testing (decision-making), and the way in which this paradigm has been used to undermine expert opinion (8). Second, I will argue that we need to think much harder about clinical practice and fashion a more appropriate theoretical underpinning for clinical behaviour. Third, I will suggest how UK medical schools, in so far as they remain interested in clinical practice, should look to alternative models, perhaps business and law schools, for ideas of how they should operate (2).
Afterword. The symposium used structured abstracts, a habit that might have a place somewhere in this galaxy, but out of choice I would prefer to live in another one. Anyway, in the published version, it reads:
A fair cop.
Alfred North Whitehead: “Some of the major disasters of mankind have been produced by the narrowness of men with a good methodology” (The Function of Reason).
Comments that seem germane to some of our current day covid-19 debates.
People are always demanding that medical students must learn this or that (obesity, psychiatry, dermatology, ID, eating disorders). The result is curriculum overload, a default in favour of rote learning by many students, and the inhibition of curiosity. It was not meant to be like this, but the GMC, the NHS, and others have pushed a vision of university medical education that shortchanges both the students and medical practice over the long term. Short-termism rules. Instead of producing graduates who are ready to learn clinical medicine in an area of their choice, we expect them to somehow come out oven-ready at graduation. I do not believe it is possible to do this to the level of safety that many other professions demand, nor is this the primary job of a university. Sadly, universities have given up on arguing, intimidated by the government and their regulatory commissars, and nervous of losing their monopoly on producing doctors.
But I will make a plea that one area really does deserve more attention within a university: the history of how medical advance occurs. No, I do not mean MCQs asking for the date of birth of Robert Koch or Lord Lister, but a feel for the historical interplay of convention and novelty. Without this our students and our graduates are almost confined to living in the present, unaware of the past, and unable to imagine how different the future will be. Below is one example.
In 1938 Albert Hofmann, a chemist at the Sandoz Laboratories in Basel, created a series of new compounds from lysergic acid. One of them, later marketed as Hydergine, showed great potential for the treatment of cerebral arteriosclerosis. Another salt, the diethylamide (LSD), he put to one side, but he had “a peculiar presentiment,” as he put it in his memoir LSD: My Problem Child (1980), “that this substance could possess properties other than those established in the first investigations.”
In 1943 he prepared a fresh batch of LSD. In the final process of its crystallization, he started to experience strange sensations. He described his first inadvertent “trip” in a letter to his supervisor:
At home I lay down and sank into a not unpleasant, intoxicated-like condition, characterized by extremely stimulated imagination. In a dream-like state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.
After eliminating chloroform fumes as a possible cause, he concluded that a tiny quantity of LSD absorbed through the skin of his fingertips must have been responsible. Three days later he began a program of unsanctioned research and deliberately ingested 250 micrograms of LSD at 4:20 PM. Forty minutes later, he wrote in his lab journal, “Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.” He set off home on his bicycle, accompanied by his laboratory assistant. This formal trial of what Hofmann considered a minute dose of LSD had more distressing effects than his first chance exposure:
Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa…. I was taken to another world, another place, another time.
A doctor was summoned but found nothing amiss apart from a marked dilation of his pupils. A fear of impending death gradually faded as the drug’s effect lessened, and after some hours Hofmann was seeing surreal colors and enjoying the play of shapes before his eyes.
Many editors of learned medical journals now automatically turn down publications describing the sort of scientific investigation that Albert Hofmann carried out on himself. Institutional review boards are often scathing in their criticism of self-experimentation, despite its hallowed tradition in medicine, because they consider it subjective and biased. But the human desire to alter consciousness and enrich self-awareness shows no sign of receding, and someone must always go first. As long as care and diligence accompany the sort of personal research conducted by Pollan and Lin, it has the potential to be as revealing and informative as any work on psychedelic drugs conducted within the rigid confines of universities.
Richard Horton in the Lancet writes:
Imagine if the entire edifice of knowledge in medicine was built upon a falsehood. Systematic reviews are said to be the highest standard of evidence-based health care. Regularly updated to ensure that treatment decisions are built on the most up-to-date and reliable science, systematic reviews and meta-analyses are widely used to inform clinical guidelines and decision making. Powerful organisations have emerged to construct a knowledge base in medicine underpinned by the results of systematic reviews. One such organisation is Cochrane, with 11 000 members in over 130 countries. This extraordinary movement of people is justifiably passionate about the idea that it is contributing to better health outcomes for everyone, everywhere. The industry that drives the production of systematic reviews today is financed by some of the most influential agencies in medical research. Cochrane, for example, points to three funders providing over £1 million each—the UK’s National Institute for Health Research (NIHR), the US National Institutes of Health (NIH), and Australia’s National Health and Medical Research Council (NHMRC).
Well, it really is a bit late for all this soul searching. See my earlier post here ‘Mega-silliness’ (commenting on what others had already pointed out); or my Evidence Based Medicine: the Epistemology That Isn’t, written over 20 years ago; and my contribution to the wake (even if I didn’t put my hand in my pocket), Why we should let “evidence-based medicine” rest in peace. The genesis of EBM was as a cult whose foundational myth was that P values could act as a truth machine. Those followers who had originally hoped for a place in the promised afterlife soon settled for paying the bills, and EBM morphed into a career opportunity for those who found accountancy too daring. So, pace John Mayall on Jazz Blues Fusion, don’t come here to listen to an old record. I promise.
Dr Chris Day writes:
Two weeks ago, I swabbed my first positive Covid-19 patient during an A&E Locum shift. I must say back then, I hadn’t fully taken in what we as a country will have to face over the coming months. The reports from colleagues in Italy and China are beyond belief.
The UK has been left to fight Covid-19 with half the Intensive Care beds per capita of Italy. Back in 2014, the trigger for my whistleblowing case was my attempt to try and secure more ICU resources for South East London (see Private Eye).
Instead of spending 5 years and £700k fighting/smearing me and damaging whistleblowing law, the NHS could have just fixed the problem. There has never been a more important time for the public and the politicians to understand Intensive Care resourcing and what is decided on their behalf by NHS leaders.
Two letters in Lancet Oncology. This bloody story never ends. We have not invented truth machines: judgement has never been exiled from discovery.
Since he shared every passing observation online, it was not surprising that on December 30th he put up a post about an odd cluster of pneumonia cases at the hospital. They were unexplained, but the patients were in quarantine, and they had all worked in the same place, the pungent litter-strewn warren of stalls that made up the local seafood market. Immediately this looked like person-to-person transmission to him, even if it might have come initially from bats, or some other delicacy. Immediately, too, it raised the spectre of the SARS virus of 2002-03, which had killed more than 700 people. He therefore decided to warn his private WeChat group, all fellow alumni from Wuhan University, to take precautions. He headed the post: “Seven cases of SARS in the Huanan Wholesale Seafood Market”. That was his mistake.
The trouble was that he did not know whether it was actually SARS. He had posted it too fast. Within an hour he corrected it, explaining that although it was a coronavirus, like SARS, it had not been identified yet. But to his horror he was too late: his first post had already gone viral, with his name and occupation undeleted, so that in the middle of the night he was called in for a dressing down at the hospital, and on January 3rd he was summoned to the police station.
But most of Case and Deaton’s ire focuses on the health care industry, which not only underperforms but is also wrecking the US economy. We [USA] spend twice per capita what France spends on health care, but our life expectancy is four years shorter, our rates of maternal and infant death are almost twice as high, and, unlike the French, we leave 30 million people uninsured. The amount Americans spend unnecessarily on health care weighs more heavily on our economy, Case and Deaton write, than the Versailles treaty reparations did on Germany’s in the 1920s. If, decades ago, we’d built a health system like Switzerland’s, which costs 30 percent less per capita than ours does, we’d now have an extra trillion dollars a year to spend, for example, on replacing the pipes in the nearly four thousand US counties where lead levels in drinking water exceed those of Flint, Michigan, and on rebuilding America’s bridges, railroads, and highways—now so rundown that FedEx replaces delivery van tires twice as often as it did twenty years ago.
In the US, health insurance accounts for 60 percent of the cost of hiring a low-wage worker. Many employers opt instead to hire contract workers with no benefits, or illegal immigrants with no rights at all.
Terrific article on Covid-19 (Sars-CoV-2) in the LRB by Rupert Beale. He says it was written in haste, but it doesn’t read that way. It contains some memorable lines.
As the US health secretary Michael Leavitt put it in 2006, ‘anything we say in advance of a pandemic happening is alarmist; anything we say afterwards is inadequate.’
And how do you think hard about research funding for the long term (I am old enough to remember when stroke and dementia were virtually non-subjects as far as ‘good research funding’ was concerned).
Virologists need more than clever tricks: we also need cash. Twenty years ago, funding wasn’t available to study coronaviruses. In 1999, avian infectious bronchitis virus was the one known truly nasty coronavirus pathogen. Only poultry farmers really cared about it, as it kills chickens but doesn’t infect people. In humans there are a number of fairly innocuous coronaviruses, such as OC43 and HKU1, which cause the ‘common cold’. Doctors don’t usually bother testing for them – you have a runny nose, so what?
And note the conditional tense:
The global case fatality rate is above 3 per cent at the moment, and if – reasonable worst case scenario – 30-70 per cent of the 7.8 billion people on earth are infected, that means between 70 and 165 million deaths. It would be the worst disaster in human history in terms of total lives lost. Nobody expects this, because everyone expects that people will comply with efficient public health measures put in place by responsible governments.
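Beale’s figures do check out. A minimal back-of-the-envelope sketch (my own, not from the article — assuming a flat 3 per cent case fatality rate applied to his 30–70 per cent infection range):

```python
# Reasonable-worst-case arithmetic from the Beale quote above:
# ~3% case fatality rate applied to 30-70% of 7.8 billion people.
population = 7.8e9
cfr = 0.03                # global case fatality rate "above 3 per cent"
low, high = 0.30, 0.70    # assumed fraction of the population infected

deaths_low = population * low * cfr    # ~70 million
deaths_high = population * high * cfr  # ~164 million
print(f"{deaths_low / 1e6:.0f}-{deaths_high / 1e6:.0f} million deaths")
```

A flat 3 per cent gives roughly 70–164 million, in line with the article’s “between 70 and 165 million”; the small discrepancy at the top end presumably reflects a rate slightly above 3 per cent.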
And to repeat my own mantra (stolen from elsewhere): the opposite of science is not art, but politics:
The situation isn’t helped by a president [Trump] who keeps suggesting that the virus isn’t that bad, it’s a bit like flu, we will have a vaccine soon: stopping flights from China was enough. Tony Fauci, the director of the National Institute of Allergy and Infectious Disease, deftly cut across Trump at a White House press briefing. No, it isn’t only as bad as flu, it’s far more dangerous. Yes, public health measures will have to be put in place and maintained for many months. No, a vaccine isn’t just around the corner, it will take at least 18 months. Fauci was then ordered to clear all his press briefings on Covid-19 with Mike Pence in advance: the vice president’s office is leading the US response to the virus. ‘You don’t want to go to war with a president,’ Fauci remarked.
And Beale ends by quoting an ID colleague.
This is not business as usual. This will be different from what anyone living has ever experienced. The closest comparator is 1918 influenza.
Caution: pace the author, ‘This is a fast-moving situation, and the numbers are constantly changing – certainly the ones I have given here will be out of date by the time you read this.’
Link. (London Review of Books: Vol. 42 No. 5, 5 March 2020: “Wash your Hands”: Rupert Beale)
I titled a recent post musing over my career as ‘The Thrill is Gone’. But I ended on an optimistic note:
‘The baton gets handed on. The thrill goes on. And on’
But there are good reasons to think otherwise. Below is a quote from a recent letter in the Lancet by Gagab Bhatnaga. You can argue all you like about definitions of ‘burnout’, but good young people are leaving medicine. The numbers who leave for ever may not be large, but I think some of the best are going. What worries me as much is those who stay behind.
The consequences of physician burnout have been clearly observed in the English National Health Service (NHS). F2 doctors (those who are in their second foundation year after medical school) can traditionally go on to apply to higher specialist training. Recent years have seen an astounding drop in F2 doctors willing to continue NHS training, with just over a third (37·7%) of F2 doctors applying to continue training in 2018, a decrease from 71·3% in 2011. Those taking a career break from medicine increased almost 3-fold from 4·6% to 14·6%. With the NHS already 10 000 doctors short, the consequences of not recruiting and retaining our junior workforce will be devastating.
Henry characterised the less attractive teaching rounds as examples of ‘shifting dullness’.
Henry Miller (apologies, a medic joke)
My earliest conscious memory of disease and doctors was in the management of my atopic dermatitis. Here is Sam Shuster writing poetically about atopic dermatitis in ‘World Medicine’ in 1983.
A dozen years of agony; years of sleeplessness for child and parents, years of weeping, itching, scaling skin, the look and feel of which is detested.
The poverty of our treatments is made all the worse by the unfair raising of expectations: I don’t mean by obvious folk remedies; I mean medical folk remedies like the recent pseudoscientific dietary treatments which eliminate irrelevant allergens. There neither is nor ever was good evidence for a dietary mechanism. And as for cows’ milk, I would willingly drown its proponents in it. We have nothing fundamental for the misery of atopic eczema and that’s why I would like to see a real treatment—not one of those media breakthroughs, and not another of those hope raising nonsenses like breast-feeding: I mean a real and monstrously effective treatment. Not one of your P<.05 drugs the effect of which can only be seen if you keep your eyes firmly double-blind, I mean a straightforward here today and gone tomorrow job, an Aladdin’s salve—put it on and you have a new skin for old.
Nothing would please me more in the practice of clinical dermatology than never again to see a child tearing its skin to shreds and not knowing how long it will be before it all stops, if indeed it does.
Things are indeed better now, but not as much as we need: we still don’t understand the itch, nor can we easily block the neural pathways involved. Nor has anything replaced the untimely murder of ‘World Medicine’. A glass of milk has never looked the same since, either.
There is an interesting review in the Economist of ‘The Great Pretender: The Undercover Mission that Changed our Understanding of Madness’, written by Susannah Cahalan. The book is the story of the American psychologist David Rosenhan, who “recruited seven volunteers to join him in feigning mental illness, to expose what he called the ‘undoubtedly counter-therapeutic’ culture of his country’s psychiatry”.
Rosenhan’s studies are well known and were influential, and some might argue that they had a beneficial effect on subsequent patient care. The question is whether they were true. The review states:
“in the end Rosenhan emerges as an unpalatable symptom of a wider academic malaise”.
As for the ‘malaise’, the reviewer goes on:
Many of psychology’s most famous experiments have recently been discredited or devalued, the author notes. Immense significance has been attached to Stanley Milgram’s shock tests and Philip Zimbardo’s Stanford prison experiment, yet later re-runs have failed to reproduce their findings. As Ms Cahalan laments, the feverish reports on the undermining of such theories are a gift to people who would like to discredit science itself.
I have a few disjointed thoughts on this. There are plenty of other considered critiques of the excesses of modern medical psychiatry. Anthony Clare’s ‘Psychiatry in Dissent’ was for me the best introduction to psychiatry. And Stuart Sutherland’s ‘Breakdown’ was a blistering and highly readable attack on medical (in)competence as much as on the subject itself (Sutherland was a leading experimental psychologist, and his account is autobiographical). And might the cross-country diagnostic criteria studies not have happened without Rosenhan’s work?
As for undermining science (see the quote above), I think unreliable medical science is widespread, and possibly there is more of it than in many past periods. Simple repetition of experiments is important but not sufficient, and betrays a lack of understanding of why some science is so powerful.
Science owes its success to its social organisation: conjectures and refutations, to use Popper’s terms, within a community. Just repeating an experiment under identical conditions is not sufficient. Rather, you need to use the results of one experiment to inform the next and, with the accumulation of new results, to build a larger and larger edifice which, whilst having greater explanatory power, is more and more intolerant of errors at any level. Building large structures out of Lego only works because of the precision engineering of each of the component bricks. But any errors only become apparent when you add brick on brick. When a single investigator or group of investigators has skin in the game during this process — and where experimentation is possible — science is at its strongest (the critiques can of course come from anywhere).
An alternative process is when the results of a series of experiments are so precise and robust that everyday life confirms them: the lights go on when I click the switch. This harks back to the reporting of science as ‘demonstrations’.
By these two standards much medical science may be unreliable. First, because the fragmentation of enquiry discourages the creation of broad explanatory theories or tests of the underlying hypotheses. The ‘testing’ is more whether a publishable unit can be achieved rather than nature understood. Second, in many RCTs or technology assessments there is little theoretical framework on which to challenge nature. Nor can everyday practice act as the necessary feedback loop in the way the tight temporal relationship between flipping the switch and seeing the light turn on can.
Perhaps, perhaps not. But when and where is even more important.
Hailed as a maths prodigy at school, Shields accepted a junior position at Merrill Lynch after studying engineering, economics and management at Oxford University because the trading room floor offered him a thrilling, dynamic environment. He was not alone: of 120 engineers in his year group at university, Shields added, only five went into engineering.
I think we should be much more cautious in attempting to direct young people’s choices beyond providing them with an education. We should feel proud of their independence of mind, remembering that supply side factors will likely win out over central planning. It is the supply side that we need to deal with, not least Putt’s Law. The same applies to medicine.
This personal story is worth a read for other lessons, too.
The government has instructed Health Education England to consult patients and the public on what they need from “21st century” medical graduates
It won’t end well.
One-third of everyone employed in London, 1.6 million people, work at night.
In 2018, at least 8,855 people slept rough on the streets of London, a 140% increase over the past decade, with similar trends globally.
“If biology is difficult, it is because of the bewildering number and variety of things one must hold in one’s head”.
John Maynard Smith (1977).
Leo Szilard recalled that when he did physics he could lounge in the bath for hours and hours, just thinking. Once he moved into biology, things were never the same: he was always having to get out to check some annoying fact. Dermatology is worse, trust me.