It is hard not to be moved, or angered, on reading the editorial in this week’s Lancet, written by three members of the Covid-19 Bereaved Families for Justice group.
The UK Prime Minister Boris Johnson has previously suggested that an immediate public inquiry into the government’s handling of COVID-19 would be a distraction or diversion of resources in the fight against COVID-19. We have long proposed that quite the opposite is true: an effective rapid review phase would be an essential element in combating COVID-19.
An independent and judge-led statutory public inquiry with a swift interim review would yield lessons that can be applied immediately and help prevent deaths in this tough winter period in the UK. Such a rapid review would help to minimise further loss of life now and in the event of future pandemics. In the wake of the Hillsborough football stadium disaster on April 15, 1989, for example, the Inquiry of Lord Justice Taylor delivered interim findings within 11 weeks, allowing life-saving measures to be introduced in stadiums ahead of the next football season.
I will quote Max Hastings, a former editor of the Daily Telegraph and Evening Standard, and a distinguished military historian, writing in the Guardian many years ago. He was describing how he had overruled some of his own journalists who had suspected Peter Mandelson of telling lies.
I say this with regret. I am more instinctively supportive of institutions, less iconoclastic, than most of the people who write for the Guardian, never mind read it. I am a small “c” conservative, who started out as a newspaper editor 18 years ago much influenced by a remark Robin Day once made to me: “Even when I am giving politicians a hard time on camera,” he said, “I try to remember that they are trying to do something very difficult – govern the country.” Yet over the years that followed, I came to believe that for working journalists the late Nicholas Tomalin’s words, offered before I took off for Vietnam for the first time back in 1970, are more relevant: “they lie”, he said. “Never forget that they lie, they lie, they lie.” Max Hastings
Two of Hastings’s journalists at the Evening Standard were investigating the funds Peter Mandelson had used to purchase a house.
One morning, Peter Mandelson rang me at the Evening Standard. “Some of your journalists are investigating my house purchase,” he said. “It really is nonsense. There’s no story about where I got the funds. I’m buying the house with family money.”
I knew nothing about any of this, but went out on the newsroom floor and asked some questions. Two of our writers were indeed probing Mandelson’s house purchase. Forget it, I said. Mandelson assures me there is no story. Our journalists remonstrated: I was mad to believe a word Mandelson said. I responded: “Any politician who makes a private call to an editor has a right to be believed until he is proved a liar.” We dropped the story.
Several months later
…when the Mandelson story hit the headlines, I faced a reproachful morning editorial conference. A few minutes later, the secretary of state for industry called. “What do I have to do to convince you I’m not a crook?” he said.
I answered: “Your problem, Peter, is not to convince me that you are not a crook, but that you are not a liar.”
The default, and most sensible course of action, is to assume that the government and many of those who answer directly to the government have lied and will continue to lie.
An article discussing Canadian health care with echoes of the UK’s own parochial attitude to health care (and don’t mention Holland, Germany, France, Switzerland…).
How do such gaps and problems persist? Part of the problem, ironically, is the system’s high approval ratings: with such enthusiasm for the existing system, and with responsibility for it shared between federal and provincial or territorial governments, it’s easy for officials to avoid making necessary changes. Picard sees our narrowness of perspective as a big obstacle to reform: “Canadians are also incredibly tolerant of mediocrity because they fear that the alternative to what we have is the evil US system.” Philpott agrees that Canadians’ tendency to judge our system solely against that of the United States can be counterproductive. “If you always compare yourself to the people who pay the most per capita and get some of the worst outcomes,” she told me in a recent Zoom call, “then you’re not looking at the fact that there are a dozen other countries that pay less per capita and have far better outcomes than we do.”
The Holy See is thus viewed as the central government of the Catholic Church. The Catholic Church, in turn, is the largest non-government provider of education and health care in the world. The diplomatic status of the Holy See facilitates the access of its vast international network of charities.[emphasis added]
There is a famous quote (I don’t have a primary source) by the great Rudolf Virchow
“Medicine is a social science, and politics is nothing more than medicine on a large scale.”
I know what Virchow was getting at, but if only.
I think the quip was from the series Cardiac Arrest: the ITU used to be called the ICU (intensive care unit) until they realised nobody did.
In March, 2019, a doctor informed 78-year-old Ernest Quintana, an inpatient at a hospital in California, USA, that he was going to die. His ravaged lungs could not survive his latest exacerbation of chronic obstructive pulmonary disease, so he would be placed on a morphine drip until, in the next few days, he would inevitably perish. There was a twist. A robot had delivered the bombshell. There, on a portable machine bearing a video screen, crackled the pixelated image of a distant practitioner who had just used cutting-edge technology to give, of all things, a terminal diagnosis. The hospital insisted that earlier conversations with medical staff had occurred in person, but as Mr Quintana’s daughter put it: “I just don’t think that critically ill patients should see a screen. It should be a human being with compassion.”
According to a helpful app on my phone that I like to think acts as a brake on my sloth, I retired 313 days ago. One of the reasons I retired was so that I could get some serious work done; I increasingly felt that professional academic life was incompatible with the sort of academic life I signed up for. If you read my previous post, you will see this was not the only reason, but since I have always been more of an academic than clinician, my argument still stands.
Over twenty years ago, my friend and former colleague, Bruce Charlton, observed wryly that academics felt embarrassed — as though they had been caught taking a sly drag round the back of the respiratory ward — if they were surprised in their office and found only to be reading. No grant applications open; no Gantt charts being followed; no QA assessments being written. Whatever next.
I thought about retirement from two frames of reference. The first was about finding reasons to leave. After all, until I was about 50, I never imagined that I would want to retire. I should therefore be thrilled that I need not be forced out at the old mandatory age of 65. The second was about finding reasons to stay, or better still, ‘why keep going to work?’. Imagine you had a modest private income (aka a pension): what would belonging to an institution as a paid employee offer beyond that achievable as a private scholar or an emeritus professor? Forget sunk cost, why bother to move from my study?
Many answers straddle both frames of reference, and will be familiar to those within the universities as well as to others outwith them. Indeed, there is a whole new genre of blogging about the problems of academia, and employment prospects within it (see alt-ac or quit-lit for examples). Sadly, many posts are from those who are desperate to the point of infatuation to enter the academy, but where the love is not reciprocated. There are plenty more fish in the sea, as my late mother always advised. But looking back, I cannot help but feel some sadness at the changing wheels of fortune for those who seek the cloister. I think it is an honourable profession.
Many, if not most, universities are very different places to work in from those of the 1980s when I started work within the quad. They are much larger, they are more corporatised and hierarchical and, in a really profound sense, they are no longer communities of scholars or places that cherish scholarly reason. I began to feel much more like an employee than I ever used to, and yes, that bloody term, line manager, got ever more common. I began to find it harder and harder to characterise universities as academic institutions, although from my limited knowledge, in the UK at least, Oxbridge still manage better than most 1. Yes, universities deliver teaching (just as Amazon or DHL deliver content), and yes, some great research is undertaken in universities (easy KPIs, there), but their modus operandi is not that of a corpus of scholars and students, but rather increasingly bends to the ethos of many modern corporations that self-evidently are failing society. Succinctly put, universities have lost their faith in the primacy of reason and truth, and failed to wrestle sufficiently with the constraints such a faith places on action — and on the bottom line.
Derek Bok, one of Harvard’s most successful recent Presidents, wrote words to the effect that universities appear to always choose institutional survival over morality. There is an externality to this, which society ends up paying. Wissenschaft als Beruf is no longer in the job descriptions or the mission statements2.
A few years back, via a circuitous friendship, I attended a graduation ceremony at what is widely considered one of the UK’s finest city universities3. This friend’s son was graduating with a Masters. All the pomp was rolled out and I, and the others present, were given an example of hawking worthy of an East End barrow boy (‘world-beating’ blah blah…). Pure selling, with the market being overseas students: please spread the word. I felt ashamed for the Pro Vice Chancellor, who knew much of what he said was untrue. There is an adage that being an intellectual presupposes a certain attitude to the idea of truth, rather than a contract of employment; that intellectuals should aspire to be protectors of integrity. It is not possible to choose one belief system one day, and act on another, the next.
The charge sheet is long. Universities have fed off cheap money — tax subsidised student loans — with promises about social mobility that their own academics have shown to be untrue. The Russell Group, in particular, traducing what Humboldt said about the relation between teaching and research, has sought to diminish teaching in order to subsidise research, or, alternatively, claimed a phoney relation between the two. As for the “student experience”, as one seller of bespoke essays argued4, his business model depended on the fact that in many universities no member of staff could recognise the essay style of a particular student. Compare that with tuition in the sixth form. Universities have grown more and more impersonal, and yet claimed a model of enlightenment that depends on personal tuition. Humboldt did indeed say something about this:
“[the] goals of science and scholarship are worked towards most effectively through the synthesis of the teacher’s and the students’ dispositions”.
As the years have passed by, it has seemed to me that universities are playing intellectual whack-a-mole, rather than re-examining their foundational beliefs in the light of what they offer and what others may offer better. In the age of Trump and mini-Trump, more than ever, we need that which universities once nurtured and protected. It’s just that they don’t need to do everything, nor are they for everybody, nor are they suited to solving all of humankind’s problems. As has been said before, ask any bloody question and the universal answer is ‘education, education, education’. It isn’t.
That is a longer (and more cathartic) answer to my questions than I had intended. I have chosen not to describe the awful position that most UK universities have found themselves in at the hands of hostile politicians, nor the general cultural assault by the media and others on learning, rigour and nuance. The stench of money is the accelerant of what seeks to destroy our once-modern world. And for the record, I have never had any interest in, or facility for, management beyond that required to run a small research group, and teaching in my own discipline. I don’t doubt that if I had been in charge the situation would have been far worse.
Sydney Brenner, one of the handful of scientists who made the revolution in biology of the second half of the 20th century, once said words to the effect that scientists no longer read papers; they just Xerox them. The problem he was alluding to was the ever-increasing size of the scientific literature. I was fairly disciplined in the age of photocopying, but with the world of online PDFs I too began to sink. Year after year, this reading debt has increased, and not just with ‘papers’ but with monographs and books too. Many years ago, in parallel with what occupied much of my time — skin cancer biology and the genetics of pigmentation, and computerised skin cancer diagnostic systems — I had started to write about topics related to science and medicine that gradually bugged me more and more. It was an itch I felt compelled to scratch. I wrote a paper in the Lancet on the nature of patents in clinical medicine and the effect intellectual property rights had on the patterns of clinical discovery; several papers on the nature of clinical discovery and the relations between biology and medicine in Science and elsewhere. I also wrote about why you cannot use “spreadsheets to measure suffering” and why there is no universal calculus of suffering or dis-ease for skin disease (here and here); and several papers on the misuse of statistics and evidence by the evidence-based-medicine cult (here and here). Finally, I ventured some thoughts on the industrialisation of medicine, and the relation between teaching and learning, industry, and clinical practice (here), as well as the nature of clinical medicine and clinical academia (here and here). I got invited to the NIH and to a couple of AAAS meetings to talk about some of these topics. But there was no interest on this side of the pond. It is fair to say that the world was not overwhelmed with my efforts.
At one level, most academic careers end in failure, or at least they should if we are doing things right. Some colleagues thought I was losing my marbles, some viewed me as a closet philosopher who was now out, and partying wildly, and some, I suspect, expressed pity for my state. Closer to home — with one notable exception — the work was treated with what I call the petit-mal phenomenon: there is a brief pause or ‘silence’ in the conversation, before normal life returns after this ‘absence’, with no apparent memory of the offending event. After all, nobody would enter such papers for the RAE/REF — they weren’t science with data and results, and since of course they weren’t supported by external funding, they were considered worthless. Pace Brenner, in terms of research assessment you don’t really need to read papers, just look at the impact factor and the amount and source of funding: sexy, or not?5
You have to continually check-in with your own personal lodestar; dead-reckoning over the course of a career is not wise. I thought there was some merit in what I had written, but I didn’t think I had gone deep enough into the problems I kept seeing all around me (an occupational hazard of a skin biologist, you might say). Lack of time was one issue, another was that I had little experience of the sorts of research methods I needed. The two problems are not totally unrelated; the day-job kept getting in the way.
He was nearer seventy than sixty, and not from one of Edinburgh’s more salubrious neighbourhoods. He sat on the examination couch unsure what to do next. His right trouser leg was rolled up, exposing a soiled bandage crusted with blood that had clearly been there for more than a few days. He nodded as I walked into the clinic room and I introduced myself with a shake of his hand. This was pre-covid.
I knew his name because that was typed on the clinic list alongside the code that said he was a ‘new’1 patient, but not much else. Not much else because his clinical folder contained sticky labels giving his name, address, date of birth and health care number only. That was it. As has become increasingly the norm in the clinic room, you ask the patient if they know why they are here.
He had phoned the hospital four days earlier, he said, and he was very grateful that he had been given an appointment to see me. He thanked me as though I was his saviour. If true, I didn’t know from what or from whom. If he was a new patient he would have seen his GP and there should be a letter from his GP in his notes. But no, he hadn’t seen his GP for over a year. Had I seen him before? No, he confirmed, but he had seen another doctor in the very same department about eighteen months previously. I enquired further. He said he had something on his leg — at the site of the distinctly un-fresh bandage — that they had done something to. It had now started to bleed spontaneously. He had phoned up on several occasions, left messages and, at least once, spoken to somebody who said they would check what had happened and get back to him. ‘Get back to you’ is often an intention rather than an action in the NHS, so I was not surprised when he said that he had heard nothing back. His leg was now bleeding and staining his trousers and bed clothes, hence the bandage. He thought that whatever it had been had come back.
Finally, four days before this appointment day, after he relayed his story one more time over the phone, he had been given this appointment. He told me again how grateful he was to me for seeing him. And no, he didn’t know what diagnosis had been made in the past. I asked him had he received any letters from the hospital. No, he replied. Could he remember the name of any of the doctors he had seen over one year previously? Sadly, not. Had he been given an appointment card with a consultant’s name on? No.
There was a time when nursing and medicine were complementary professions. At one time the assistant who ushered him into the clinic room would have removed the bandage from his leg. In my clinical practice, those days ended long ago. I asked him if he would unwrap the bandage while I went in search of our admin staff to see if they knew more than me about why he was here.
He had been seen before, just as he had said, around eighteen months earlier. He had seen an ‘external provider’, one of a group of doctors employed via commercial agencies who are contracted to cope with all the patients that the regular staff employed by the hospital are unable to see. That demand exceeds supply is the one feature of the NHS that all agree on, whatever their politics. It outlives all reorganisations. Most of these external provider doctors travel up for weekends, staying in a hotel for one or more nights, and then fly back home. They get paid more than the local doctors (per clinic), and the agency takes a substantial arrangement fee in addition. This had been the norm for over ten years, and of course makes little clinical or financial sense — except if the name of the game is to be able to shape waiting lists with electoral or political cycles, turning the tap on and off. Usually more off, than on.
The doctors who undertake this weekend work are a mixed bunch. Most of them are very good, but of course they don’t normally work in Scotland, and medicine varies across the UK and Europe, and even between regions within one country. It is not so much the medicine itself that differs, but the way that different components of care fit together organisationally. This hints at one fault line.
That the external doctors are more than just competent is important for another reason. The clinic lists of the visiting doctors are much busier than those of the local doctors, and are full of new patients rather than patients brought back for review. The NHS and the government consider review appointments as wasteful, and that is why all the targets relate to ‘new’ patients. It’s a numbers game: stack them high, don’t let the patients sit down for too long, and process them. Meet those government targets and move in phase with the next election cycle. Consequently, the external provider doctors are being asked to provide episodic care under time pressure; speed dating rather than maintaining a relationship. For most of the time, nobody who actually works in Edinburgh knows what is going on with the patient. But the patients do live in Edinburgh.
Old timers like me know that one of the reasons why review appointments are necessary is that they are a security net, a back up system. In modern business parlance, they add resilience. Like stocks of PPE. In the case of my man, a return appointment would have provided the opportunity to tell him what the hell was going on and to ensure that all that had actually been planned had been carried out. There is supposed to be a beginning, a middle and an end. There wasn’t.
An earlier letter from an external provider doctor was found. It was a well-written summary of the consultation. The patient had a lesion on his leg that was thought clinically to be pre-malignant. The letter stated that if a diagnostic biopsy confirmed this clinical diagnosis — it did — then the patient would require definitive treatment, most likely, surgical. The problem was that in this informal episodic model, the original physician was not there to act on the result; nor to observe that the definitive surgical treatment had not taken place because review appointments are invisible in terms of targets. They are wasteful.
Even before returning to the clinic room, without sight of anything but the blood stained bandage, I knew what was going on. His pre-malignant lesion had, over the period of ‘wasteful’ time, transformed into full-blown cancer. He now had a squamous cell carcinoma. His mortality risk had gone from effectively zero to maybe 5%.
I went back to the clinic room, apologised, explained what had gone on and what needed to happen now, and apologised again. The patient picked up on my mixture of frustration, shame and anger, and it embarrasses me to admit that I had somehow allowed him — mistakenly — to imagine that my emotions were a response to something he had said or done. I apologised again. And then he did say something that fired my anger. I cannot remember the whole sentence but a phrase within it stuck: ‘not for the likes of me’. His response to the gross inadequacy of his care was that it was all people like him could expect.
He was not literally the last patient in dermatology I saw, but his story was the one that told me I had to get out. When a pilot or an airline engineer says that an aircraft is safe to fly there is an unspoken bond between passengers and those who dispense a professional judgement. But this promise is also made by one human to another human. I call it the handshake test, which is why I always shook hands when I introduced myself to patients. This judgement that is both professional and personal has to be compartmentalised away from the likes of sales and marketing, the share price — and government targets or propaganda. This is no longer true of the NHS. The NHS is no longer a clinically led organisation, rather, it is a vehicle for ensuring one political gang or another gains ascendancy over the other at the next election. It is not so much about money, as about control. True, if doctors went down with the plane, in this metaphor, there would be a much better alignment of incentives. Doctors might be yet more awkward. Better still, we might think about where we seat the politicians and their NHS commissars.
Most doctors keep a shortlist of other doctors who they think of as exceptional. These are the ones they would visit themselves or recommend to family. If I had to rank my private shortlist, I know who would come number one. She is not a dermatologist, but a physician of a different sort, and she works far away from Edinburgh. She has been as loyal and tolerant of the NHS as anybody I know — much more than me. Yet she retired before me, and her reasoning and justification were as insightful and practical as her medical abilities. Simply put, she could no longer admit her patients and feel able to reassure them that the care they would receive would be safe. It’s the handshake test.
I don’t shake hands with patients any more.
It hasn’t happened to me often — maybe on only a handful of occasions — but often enough to recognise it, and dread it. I am talking to a patient, trying to second guess the future — how likely is it that their melanoma might stay away for ever, for instance — and I find myself mouthing words that a voice in my head is warning me I will regret saying. And the voice is not so much following my words but anticipating them, so I cannot cite ignorance as an excuse, nor is it a whisper or unclear in any way, and yet I still charge on. A moment later, regret will set in, and this regret I could share with you at that very moment if you were there with me.
The patient was a young man in his early twenties, who lived with his mother, just the two of them at home. He had dark curly hair, was of average height, and he lived for running. This was Newcastle, in the time of Brendan Foster and Steve Cram. He had been admitted with pyrexia, chest pains and a cough. He had bacterial pneumonia, and although he seemed pretty sick, none of us were worried for him.
After a few days, he seemed no better, and we switched antibiotics. Medics reading this will know why. He started to improve within a day or so, and we felt we were in charge, pleased with, and confident of, our decisions. This was when I spoke with his mother, updating her on his progress. Yes, he had been very ill; yes, we were certain about his diagnosis; and yes, the change of antibiotics and his response was not unexpected. I then said more. Trying to reassure her, I said that young fit people don’t die from pneumonia any more. That was it. All the demons shuddered.
At this time I was a medical registrar and I supervised a (junior) house officer (HO), and a senior house officer (SHO). In turn, my boss was a consultant physician who looked after ‘general medical’ patients, but his main focus was clinical haematology. In those days the norm was for all of a consultant’s patients to be managed on their own team ward. On our ward, maybe half the patients were general medical, and the others had haematological diseases. Since I was not a haematologist, I was solely tasked with looking after the general medical patients, and mostly acted without the need for close supervision (in a way that was entirely appropriate).
One weekend I was doing a brief ‘business’ ward round on a Sunday morning. Our young man with pneumonia was doing well, his temperature had dropped, and he was laughing and joking. We would have been making plans to let him home soon. The only thing of note was that the houseman reported that the patient had complained of some pain in one calf. I had a look and although the signs were at best minimal I wondered whether he could have had a deep vein thrombosis (DVT). Confirmatory investigations for DVTs in those days were unsatisfactory and not without iatrogenic risk, whilst the risks from anticoagulation in a previously fit young man with no co-morbidities are minimal. We started him on heparin.
A few days later he was reviewed on the consultant’s ward round. I knew that the decision to anti-coagulate would (rightly) come under review. The physical signs once subtle were now non-existent, and the anticoagulation was stopped. A reasonable decision I knew, but one that I disagreed with, perhaps more because of my touchy ego than deep clinical judgement.
Every seven to ten days or so I would be the ‘resident medical officer’ (RMO), meaning I would be on call for unselected medical emergencies. Patients might be referred directly to us by their general practitioner, or as ‘walk-ins’ via casualty (ER). In those days we would usually admit between 10 and 15 patients over a 24-hour period; and we might also see a further handful of patients who we judged did not require hospital admission. Finally, since we were resident, we continued to provide emergency medical care to the whole hospital, including our own preexisting patients.
It was just after 8.30am. The night had been quiet, and I was in high spirits as this was the last time I would act as an RMO. In fact, this was to be the last day of me being a ‘medical registrar’. Shortly after, I would leave Newcastle for Vienna and start a career as an academic dermatologist, a career path that had been planned many years before.
The clinical presentation approaches that of a cliché. A patient with or without various risk factors, but who has been ill from one of a myriad of different conditions, goes to the toilet to move their bowels. They collapse, breathless, and go into shock. CPR may or may not help. A clot from their legs has broken free, and blocked the pulmonary trunk. Sufficient blood can no longer circulate from the right side of the heart to the left. The lungs and heart are torn asunder.
When the call went out, as RMO, I was in charge. Nothing we did worked. There is a time to stop, and I ignored it. One of my colleagues took the decision. Often with cardiac arrests, you do not know the patient. That helps. Often the call is about a patient who is old and with multiple preexisting co-morbidities. That is easier, too. But I knew this man or boy; and his mother.
That was the last patient I ever saw in general medicine.
When I was a medical registrar I did GP locums for a single-handed female GP in Newcastle. Doing them was great fun, and the money — she insisted on BMA rates — was always welcome. Nowadays, without specific training in general practice, you can’t act as a locum as I did then. This is probably for the best but, as ever, regulations always come with externalities, one of which is sometimes a reduction in overall job satisfaction.
I worked as a locum over a three-year period, usually for one week at a time, once or twice a year, covering some of the GP’s annual leave. Weekdays were made up of a morning surgery (8.30 to 10.30 or later), followed by house-calls through lunchtime to early afternoon, and then an evening surgery from 4.30 to around 6.30. I also ran a short Saturday morning surgery. Within the working day I could usually nip home for an hour or so.
From 7pm till the following morning, the Doctors Deputising Service (DDS) took over for emergency calls. They also covered the weekends. The DDS employed other GPs or full-time freelancers. Junior hospital doctors often referred to the DDS as the Dangerous Doctor Service. Whether this moniker was deserved, I cannot say, but seeing patients you don’t know in unfamiliar surroundings is often tricky. Read on.
Normally, the GP would cover the nights herself, effectively being on call 24 hours per day, week in, week out. Before she took leave, she used to proactively manage her patients, letting some of her surgery ‘specials’ or ‘regulars’ know she would be away, and that they might therefore be better served by waiting for her to return. Because she normally did her own night-calls, she was aware of how a small group of patients might request night visits that might be judged to be unnecessary. I think the fee the DDS charged to her was dependent on how often a visit was requested, so, as far as was reasonable, she tried to ensure her patients knew that when she was away they would only get a visit from a ‘stranger’ — home night-time call-outs should be for real emergencies. I got the strong impression that her patients were very fond of her, and she of them. Without exception, they were always very welcoming to me, and I loved the work. Yes, I got paid, but it was fun medicine, and it offered a freedom that you didn’t feel in hospital medicine as a junior (or senior) doctor.
The last occasion I undertook the locum was eventful. I knew that this was going to be the last occasion, as that summer I was moving on from internal medicine to start training in dermatology — leaving for Vienna in early August. A request for a house-call, from a forty-year-old man with a headache, came in just as the Friday evening surgery was finishing, a short while after 6.30pm. My penultimate day. I had been hoping to get off sharpish, knowing I would be doing the Saturday morning surgery, but contractually I was covering until 7pm, so my plan was to call at the patient’s house on the way home.
I took his clinical paper notes with me. There was virtually nothing in them, a fact that doctors recognise as a salient observation. He lived, as did most of the surgery’s patients, on a very respectable council estate that literally encircled the surgery. I could have walked, but chose to drive, knowing that since I had locked up the surgery, I could go straight home afterwards.
When I got to his house, his wife was standing outside, waiting for me. She was most apologetic, informing me that her husband was not at home, but had slipped out to take his dog for a walk. I silently wondered why, if that was the case, he couldn’t have brought the dog with him to the surgery, saving me a trip. No matter. Grumbling about patient behaviour is not unnatural, but it is often the parent of emotions that can cloud clinical judgement. There lie dragons.
The patient’s wife ran to the local park to find her husband, who came running back to the house at a fair pace a few minutes later, wife and dog in tow. The story was of a headache on one side of his head, posterior to the temple, that had started a few hours earlier. The headache was not severe, he told me, and he felt well; he didn’t think he had flu. His concern was simply that he didn’t normally get headaches. There was nothing else remarkable in his history; he was not on any medication, and had no preexisting complaints or diseases beyond the occasional cold. Nor did the headache itself provide any diagnostic clues. On clinical examination he was apyrexial, with a normal pulse and blood pressure, and a thorough neuro exam (the sort performed by somebody who had recently done a neuro job) was normal. There was no neck stiffness or photophobia, and the fundi were visualised and clear. The best I could do was wonder about a hint of erythema on his tympanic membrane on the side of the headache, but there was no local tenderness there. I worried I was making the signs fit the story.
I told him I couldn’t find a good explanation for his headache, and that my clinical examination of him was essentially normal. There was a remote possibility that he had a middle ear infection, although I said that since he had no history of previous ear infections, this seemed unlikely. I opted to give him some amoxycillin (from my bag) and said that while night-time cover would be provided by the DDS, I would be holding a surgery on the Saturday morning, in just over 12 hours’ time. Should he not feel right, he could pop in to see me, or I could visit him again. He and his wife thanked me for coming round, I went home and, as far as I knew, that was the end of the story of my penultimate day as a locum GP. He did not come to my Saturday morning surgery.
Several weeks later, when I was back doing internal medicine and on call for urgent GP referrals, the same GP phoned me up about another of her patients who she thought merited hospital assessment. This was easily sorted, and I then asked her about some of the patients of hers I had seen when I was her locum. There was one in particular, with abdominal pain, whom I had sent into hospital, and I wanted to know what had happened to him. She then told me that the patient had meningitis. There was a moment of confusion: we were not talking about the same patient.
The story of the man with the headache was as follows. I had seen him just before 7pm: apyrexial, fully conscious, with a normal pulse and blood pressure, and no neuro signs. By 8pm his headache was much more severe, and his wife put a call in to the DDS, who saw him before 9pm but could not find anything abnormal. By 10.30pm he was barely conscious, and his wife called the DDS again, who were going to be delayed. Soon after, she dialled 999. He was admitted, diagnosed with, and treated for bacterial meningitis. The GP told me he had made a prompt and complete recovery.
That was the last patient I ever saw in general practice.
“I have been this close to buying a nursing school.” This is not a sentence you expect to hear from a startup founder. Nursing seems a world away from the high-tech whizziness of Silicon Valley. And, to use a venture-capital cliché, it does not scale easily.
This was from an article in the Economist a while back. As ever, there is a mixture of craziness and novelty. The gist of the article is that Lambda School is a company that matches ‘fast’ training with labour-force shortages (hence the nursing angle). When I first read it, I thought they had already opened a nursing school, but that is not so. Nonetheless, there are aspects that interest me.
We learn that
The Economist chimes in with the standard “Too often students are treated as cash cows to be milked for research funding.” Too true, but to solve this issue we need to massively increase research costings, have meaningful conversations with charities and government (including the NHS) about the way students are forced to involuntarily subsidise research, and cut out a lot of research in universities that is the D of R&D.
But this is not a sensible model for a university. On the other hand it is increasingly evident to me that universities are not suitable places to learn many vocational skills. The obvious immediate problem for Lambda is finding and funding a suitable clinical environment. That is exactly the problem that medical (or dental) schools face. A better model is a sequential one, one which ironically mimics the implicit English model of old: university study, followed by practical hospital clerkships. Just tweak the funding model to allow it.
I have rich memories of general practice, and I mean general practice rather than primary care 1. My earliest memories centre around a single-handed GP, who looked after my family until we left Wales in the early 1970s. His practice was in his house, just off Whitchurch village in Cardiff. You entered by what once may have been the back gate or tradesman’s entrance. Around the corner and a few steps up, you found the waiting room. Originally, I guess, it might have been a washroom or utility room for a maid or housekeeper. By the standards of the Rees abode the house was large.
The external door to the waiting room was opposite the door into the main part of the doctor’s house, and along the adjacent sides were two long benches. They were fun for a little boy to sit on because, since your legs couldn’t touch the floor, you could shuffle along as spaces became available. When you did this, adults tended to smile at you; I now know why. If you were immobile for too long your thighs might stick to the faux-leather surface; pulling them away fast resulted in a fart-like noise, although in those days I was too polite to think out loud.
Once you were called — whether by the doctor or his wife I cannot remember — you entered his ‘rooms’. The consulting room was, by my preferred unit of measure — how far I could kick a ball — large, with higher ceilings than we had at home. The floorboards creaked and the carpet was limited to the centre of the room. If there was a need for privacy, there was what seemed like a fairly inadequate freestanding curtained frame. For little boys, obviously, no such cover was deemed necessary.
I can remember many home visits: two stand out in particular, mumps, and an episode of heavily infected eczema where my body was covered in thousands of pustules, and where I remember pulling off sheets of skin that had stuck to the bedclothes. The sick-role was respected in our home: if you were ill and off school you were in bed. Well, almost. Certainly, no kicking the ball against the wall.
Naturally, the same GP would look after any visitors to my home. Although my memories are influenced by what my mother told me, on one occasion my Irish grandmother’s valvular heart failure decompensated when she was staying with us (her home was in Dublin). More precisely, I was turfed out of my bed so she could occupy it. The GP phoned the Cardiff Royal Infirmary explaining that the patient needed admission, and would they oblige? The GP, however, took ten years or so off her true age. Once he was off the phone, my mother corrected him. He knew better: “If I had told them the truth they would have refused to admit her,” he said. (This was general practice, not state medicine, after all.) The memory of this event stuck with me when I was a medical student on a geriatrics attachment in Sunderland circa 1981. Only those under 60 with an MI were deemed suitable for admission to the CCU, with the rest left in a large Nightingale ward with no cardiac monitoring 2. I thought of my father, who was then close to 60.
I was lucky enough to be able to recognise this type of general practice — albeit with many much needed changes — as a medical student in Newcastle, and to be taught by some wonderful GPs, and even do some GP locums when I was a medical registrar. And although I had never met the late and great Julian Tudor-Hart face-to-face, we are linked by a couple of mutual Welsh friends, and we exchanged odd emails over the years.
So, why do I recall all of this? Nostalgia? Yes, I own up to that. But more out of anger that what was unique about UK general practice has been replaced by primary care and “population medicine”, and many patients are worse off because of this shift. Worse still, it now seems all is viewed not through the lens of vocation, but through the egregious ‘it’s just business’. Continuity of care and “personal doctoring” have been lost.
I write after being provoked by a comment in the London Review of Books. Responding to a terrific article by John Furse on the NHS, Helen Buckingham of the Nuffield Trust states — as many do — that “The reality is that almost all GP practices are already private businesses, and have been since the founding of the NHS.” (LRB 5/12/2019 page 4).
Well, for me, this is pure sophistry. There are businesses and businesses. If you wish, you might call the Catholic Church a business, or Edinburgh University a business, or even the army a business. You might even refer to each of them as a corporation. But to do so misses all those human motivations that make up civil society — particularly the ability to look people in the eye and not feel grimy. There is no way on earth that the GP who looked after me would have called what he did a business. Nor was he part of any corporation. And the reason is simple: like many think tanks, many modern corporations — especially the larger ones — have no sense of morality beyond the dollar of the bottom line3, often spending their undoubted skills wilfully arbitraging the imperfections of regulation and honest motivation. It does not have to be this way.
I have previously commented on Abraham Flexner on this site. The Flexner report is the most influential review of US medical education ever published, although some would argue that the changes it recommended were already working their way through the system. For a long time I was unaware of another project of his, an article with the title The Usefulness of Useless Knowledge 1. For me, there are echoes of Bertrand Russell’s In Praise of Idleness, and the fact that Flexner’s essay was published at the onset of World War 2 adds another dimension to the topic.
As for medical education, the ever-growing pressure is to teach so much that many students don’t have time to learn anything. I wish some of Flexner’s other comments opened every GMC dictum on what a university medical education should be about.
“Now I sometimes wonder,” he wrote, “whether there would be sufficient opportunity for a full life if the world were emptied of some of the useless things that give it spiritual significance; in other words, whether our conception of what is useful may not have become too narrow to be adequate to the roaming and capricious possibilities of the human spirit.”
The history of science is the history of rejected ideas (and manuscripts). One example I always come back to is the original work of John Wennberg and colleagues on spatial differences in ‘medical procedures’, and the idea that it is not so much medical need that dictates the number of procedures as the supply of medical services. Simply put: the more surgeons there are, the more procedures are carried out1. The deeper implication is that many of these procedures are not medically required — it is just the billing that is needed: surgeons have mortgages and tuition loans to pay off. Wennberg and colleagues at Dartmouth have subsequently shown that a large proportion of the medical procedures and treatments that doctors undertake are unnecessary2.
Wennberg’s original manuscript was rejected by the New England Journal of Medicine (NEJM) but subsequently published in Science. Many of us would rate Science above the NEJM, but there is a lesson here about signal and noise, and how many medical journals in particular obsess over procedure and status at the expense of nurturing originality.
Angus Deaton and Anne Case, two economists, the former with a Nobel Prize to his name, tell a similar story. Their recent work has been on the so-called Deaths of Despair — where mortality rates for subgroups of the US population have increased3. They relate this to educational levels (the effects are largely on those without a college degree) and other social factors. The observation is striking for an advanced economy (although Russia had historically seen increased mortality rates after the collapse of communism).
Coming back to my opening statement, Deaton is quoted in the THE:
The work on “deaths of despair” was so important to them that they [Deaton and Case] joined forces again as research collaborators. However, despite their huge excitement about it, their initial paper, sent to medical journals because of its health focus, met with rejections — a tale to warm the heart of any academic whose most cherished research has been knocked back.
When the paper was first submitted it was rejected so quickly that “I thought I had put the wrong email address. You get this ping right back…‘Your paper has been rejected’.” The paper was eventually published in Proceedings of the National Academy of Sciences, to a glowing reception. The editor of the first journal to reject the paper subsequently “took us for a very nice lunch”, adds Deaton.
Another medical journal rejected it within three days with the following justification:
The editor, he says, told them: “You’re clearly intrigued by this finding. But you have no causal story for it. And without a causal story this journal has no interest whatsoever.”
(‘no interest whatsoever’ — the arrogance of some editors).
Deaton points out that this is a problem not just for medical journals but in economics journals, too; he thinks the top five economics journals would have rejected the work for the same reason.
“That’s the sort of thing you get in economics all the time,” Deaton goes on, “this sort of causal fetish… I’ve compared that to calling out the fire brigade and saying ‘Our house is on fire, send an engine.’ And they say, ‘Well, what caused the fire? We’re not sending an engine unless you know what caused the fire.’”
It is not difficult to see the reasons for the fetish on causality. Science is not just a loose-leaf book of facts about the natural or unnatural world, nor is it just about A/B testing or theory-free RCTs, or even just ‘estimation of effect sizes’. Science is about constructing models of how things work. But sometimes the facts are indeed so bizarre in the light of previous knowledge that you cannot ignore them, because without these ‘new facts’ you can’t build subsequent theories. Darwin and much of natural history stand as examples here, but my personal favourite is the one provided by the great biochemist Erwin Chargaff in the late 1940s. Wikipedia describes the first of his ‘rules’:
The first parity rule was that in DNA the number of guanine units is equal to the number of cytosine units, and the number of adenine units is equal to the number of thymine units.
Now, in one sense a simple observation (C=G and A=T), with no causal theory. But run the clock on to Watson and Crick (and others), and see how this ‘fact’ gestated an idea that changed the world.
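Chargaff’s first parity rule is simple enough to state in code. Here is a minimal sketch (the sequence is invented, purely illustrative): pair an arbitrary strand with its Watson–Crick complement, and the parity of the base counts holds by construction, which is exactly the structural story that Watson and Crick later supplied.

```python
from collections import Counter

def complement(strand: str) -> str:
    """Watson-Crick complement of a single DNA strand (A<->T, G<->C)."""
    return strand.translate(str.maketrans("ATGC", "TACG"))

def chargaff_holds(duplex: str) -> bool:
    """Chargaff's first parity rule: in duplex DNA, #G == #C and #A == #T."""
    counts = Counter(duplex.upper())
    return counts["G"] == counts["C"] and counts["A"] == counts["T"]

# An arbitrary (invented) strand plus its complement: the rule holds
# by construction for any double-stranded molecule.
strand = "ATGCGCTTA"
duplex = strand + complement(strand)
```

A single strand on its own (say `"AAAT"`) need not satisfy the rule; the observation was about duplex DNA, and the base-pairing model is what turned the bare fact into a theory.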
There was a touching obituary of Peter Sleight in the Lancet. Sleight was a Professor of Cardiovascular Medicine at Oxford and the obituary highlighted both his academic prowess and his clinical skills. Hard modalities of knowledge to combine in one person.
Throughout all this, at Oxford’s Radcliffe Infirmary and John Radcliffe Hospital, Sleight remained an expert bedside clinician, who revelled in distinguishing the subtleties of cardiac murmurs and timing the delays of opening snaps.
And then we learn
An avid traveller, Sleight was a visiting professor in several universities; the Oxford medical students’ Christmas pantomime portrayed him as the British Airways Professor of Cardiology. [emphasis added]
This theme must run and run, and student humour is often insightful (and on occasion, much worse). I worked somewhere where the nickname for the local airport was that of a fellow Gold Card professor. We often wondered what his tax status was.
The background is the observation that babies born by Caesarian have different gut flora than those born vaginally. The interest in gut flora is because many believe it relates causally to some diseases. How do you go about investigating such a problem?
Collectively, these seven women gave birth to five girls and two boys, all healthy. Each of the newborns was syringe-fed a dose of breast milk immediately after birth—a dose that had been inoculated with a few grams of faeces collected three weeks earlier from its mother. None of the babies showed any adverse reactions to this procedure. All then had their faeces analysed regularly during the following weeks. For comparison, the researchers collected faecal samples from 47 other infants, 29 of which had been born normally and 18 by Caesarean section. [emphasis added]
I cannot see the future, but like many, I have private models that I use to order the world, and for which I often have very little data. For instance, I think it obvious that the traditional middle-class professions (medicine, law, veterinary medicine, architecture, dentistry, academia) are increasingly unattractive as careers1. I am not complaining about my choices — far from it; I benefited from the tailwinds of the dramatic social change that wars and other calamities bring. But my take on what has happened to school teachers and teaching is the model for what will happen to many others. I say this with no pleasure: there are few jobs more important. But the tragedy of schoolteaching — which is our tragedy — will continue to unfold as successive gangs of politicians, armed with nothing more than some borrowed bullet points, play to the gallery. Similarly, in higher education, within a timescale of almost 40 years I have seen at first hand changes that make me argue not only that the days of Donnish Dominion (to use Halsey’s phrase2) are well and truly over, but that most UK universities will be unable to recruit the brightest to their cause. I think we see that in clinical academia already — and not just in the UK. Amidst all those shiny new buildings moulded for the student experience (and don’t forget the wellness centres…), the ennui of corporate mediocrity beckons. The bottom line is the mission statement.
As for medicine, a few quotes below from an FT article from late last year. I assume that without revolutionary change, we will see more and more medical students, and more and more doctors leaving mid-career. If you keep running to stand still, the motivation goes. And that is without all the non-COVID-19 effects of COVID-19.
One of the major factors for doctors is the electronic record system. It takes a physician 15 clicks to order a flu shot for a patient, says Tait. And instead of addressing this problem, healthcare companies end up offering physicians mindfulness sessions and healthy food options in the cafeteria, which only frustrates them further…[emphasis added]
Over the past few years, efforts have been made to increase the number of medical schools in the US to ensure that there is no shortage of doctors. “When you think about how much we’ve invested to create, roughly, 10 to 12 new medical schools in the last decade, at hundreds of millions of dollars per school, just to increase the pipeline of physicians being trained, we also need to think at the far end of the physicians who are leaving medicine because of burnout,” says Sinsky.
Take the case of a final-year resident doctor in New York, who spends a considerable part of his shift negotiating with insurance companies to justify why his patient needs the medicines he prescribed. “When I signed up to be a doctor, the goal was to treat patients, not negotiate with insurance providers,” he says.
According to Tait, 80 per cent of the challenge faced by doctors is down to the organisation where they work, and only 20 per cent could be attributed to personal resilience.
Re the final quote, 80:20 is being generous to the organisations.
Many years ago I was expressing exasperation at what I took to be the layers and layers of foolishness that meant that others couldn’t see the obvious — as defined by yours truly, of course. Did all those wise people in the year 2000 think that gene therapy for cancer was just around the corner, or that advance in genetics was synonymous with advance in medicine, or that the study of complex genetics would, by the force of some inchoate logic, lead to cures for psoriasis and eczema? How could any society function when so many of its parts were just free-riding on error, I asked? Worse still, these intellectual zombies starved the new young shoots of the necessary light of reason. How indeed!
William Bains, he of what I still think of as one of the most beautiful papers I have ever read1, put me right. William understood the world much better than me — or at least he understood the world I was blindly walking into much better. He explained to me that it was quite possible to make money (both ‘real’ and in terms of ‘professional wealth’) out of ideas that you believed to be wrong, as long as two linked conditions were met. First, do not tell other people you believe them to be wrong; on the contrary, talk about them as the next new thing. Second, find others who are behind the curve, and who are willing to buy from you at a price greater than you paid (technical term: fools). At the time, I did not even understand how pensions worked. Finally, William chided me for my sketchy knowledge of biology: he reminded me that in many ecosystems parasites account for much, if not most, of the biomass. He was right; and although my intellectual tastes have changed, the sermon still echoes.
The reason is that corporate tax burdens vary widely depending on where those profits are officially earned. These variations have been exploited by creative problem-solvers at accountancy firms and within large corporations. People who in previous eras might have written symphonies or designed cathedrals have instead saved companies hundreds of billions of dollars in taxes by shifting trillions of dollars of intangible assets across the world over the past two decades. One consequence is that many companies avoid paying any tax on their foreign sales. Another is that many countries’ trade figures are now unusable. [emphasis added].
Trade Wars Are Class Wars: How Rising Inequality Distorts the Global Economy and Threatens International Peace, by Matthew C. Klein & Michael Pettis.
But after completing medical training, Sacks fled the homophobic confines of his nation and family—his mother had called him “an abomination.” Paul Theroux tells Burns that Sacks’s “great luck” was ending up in Los Angeles in 1960, where he found ample “guys, weights, drugs, and hospitals.”
Advance requires those who can imagine new spaces, and medicine is even more hostile today than it was all those years ago. We pretend otherwise, thinking those tick-box courses will suffice, but real diversity of intellect is the touchstone of our future.
My experience is limited, but everything I know suggests that much IT in healthcare diminishes medical care. It may serve certain administrative functions (who is attending what clinic and when etc), and, of course, there are certain particular use cases — such as repeat prescription control in primary care — but as a tool to support the active process of managing patients and improving medical decision making, healthcare has no Photoshop.
In the US it is said that an ER physician will click their mouse over 4,000 times per shift, with frustration with IT being a major cause of physician burnout. Published data show that the ratio of patient-facing time to admin time has halved since the introduction of electronic medical records (i.e. things are getting less efficient). We suffer slower and worse care: research shows that once you put a computer in the room, eye contact between patient and physician drops by 20–30%. And that is to ignore the crazy extremes: like the hospital that created PDFs of the old legacy paper notes, but then — wait for it — ordered them online not as a time-sequential series but randomly, expecting the doctor to search each one. A new meaning for the term RAM.
There are many proximate reasons for this mess. There is little competition in the industry and a high degree of lock-in because of a failure to use open standards. Then there is the old AT&T problem of not allowing users to adapt and extend the software (AT&T famously refused to allow users to add answering machines to their handsets). But the ultimate causes are that reducing admin and support staff salaries is viewed as more important than allowing patients meaningful time with their doctor; and that those purchasing IT have no sympathy or insight into how doctors work.
As far as UI is concerned — I think this is what personal/interactive computing is about, and so I always start with how the synergies between the human and the system would go best. And this includes inventing/designing a programming language or any other kind of facility. i.e. the first word in “Personal Computing” is “Person”. Then I work my way back through everything that is needed, until I get to the power supply. Trying to tack on a UI to “something functional” pretty much doesn’t work well — it shares this with another prime mistake so many computer people make: trying to tack on security after the fact …[emphasis added]
I will say that I lost every large issue on which I had a firm opinion.
That “scientific management” bungled the algorithm for children’s exam results, verifies a maxim attributed to J.R. Searle, an American philosopher: if you have to add “scientific” to a field, it probably ain’t.
A.D. Pellegrini, in a letter to the Economist.
I have written elsewhere about this in medicine and science. We used to have physiology, but now some say physiological sciences; we used to have pharmacology, but now often see pharmacological sciences1. And as for medicine, neurology and neurosurgery used to be just fine, but then the PR and money grabbing started so we now have ‘clinical neuroscience’ — except it isn’t. As Herb Simon pointed out many years ago, the professions and professional practice always lose out in the academy.
The following is from Scott Galloway at NYU Stern. He shoots from the hip, and sometimes only thinks afterwards. But he is interesting, brave, and more often right than most. I think I would have hated what he said when I was ready (sic) to go to university. But now, I think I wasn’t, and for medicine in particular, allowing 17-year-olds to fall into the clutches of the GMC and their ilk should be a crime against….
Gap years should be the norm, not the exception. An increasingly ugly secret of campus life is that a mix of helicopter parenting and social media has rendered many 18-year-olds unfit for college. Parents drop them off at school, where university administrators have become mental health counselors. The structure of the Corona Corps would give kids (and let’s be honest, they are still kids) a chance to marinate and mature. The data supports this. 90% of kids who defer and take a gap year return to college and are more likely to graduate, with better grades. The Corps should be an option for non-college-bound youth as well.
“We’re going through a Copernican revolution of healthcare, where the patient is going to be at the centre. The gateway to healthcare is not going to be the physician. It’s going to be the smartphone.”…
“Christofer Toumazou, chief scientist at the Institute of Biomedical Engineering at Imperial College London, says there are “megabucks” to be saved by using technology and data to shift the focus of healthcare towards prevention.”
Ahem. I have been reading Seamus O’Mahony’s excellent Can Medicine Be Cured?, in which he does a great job of following up on the crazy hype of big genetics from 20 years ago (and many other areas of sales masquerading as science). The above quotes are from only seven years ago. Still crazy after all these years, sings Paul Simon. Healthcare excels at adding tech as a new layer of complexity rather than replacing existing actors. And when will people start realising that prevention — which may indeed reduce suffering — will often increase costs? Life is a race against an army of exponential functions.
In the FT
A few months back, I was walking past the entrance of the old Edinburgh Medical School, founded in 1726. A not-so-crazy thought came into my head, one that I could not dismiss: we need to move on from the idea that a Medical School must be situated within a University (and of course, it wasn’t always, anyway). The founding set of ideas that we have struggled with ever since Flexner, we should now recast for a very different world. We need to create something new, something that makes sense in terms of a university and something that puts professional training within a professional context. At present, we fail on both of these accounts. Rather than integrate we should fracture. We need to search out our own new world.
Specialisation and the division of labour are as old as humanity, and of course they go back much further when we are talking biology. Adam Smith may have formalised why and how it was important economically, but he did not invent it. Most specialisation relies on expertise; at least it used to, until Crapita and the like started mining the seams of government ignorance.
The quote below is from an article in the Economist in May this year. It is about Public Health England (PHE) and how since they only possessed 290 contact tracers, they needed to call on those wonderful experts in everything, Serco, to help them out. Of course, expertise in such tasks always used to reside with Local Government, not PHE, but Boris and his bunch of Maoists, when they are not having their eyes tested in the fast lane, have decreed that Local Government — along with the opposition, the judges, the education sector and more — are enemies of the people. Given this mindset, we are left with those whose main area of expertise is commercialising ignorance.
Firms such as Serco, a big contractor, are in talks with the government to provide the workforce. It should be possible to train new recruits fairly quickly—the requirements of the job are similar to those of 111 operators, for whom the training time is just four hours. They will work from a script that guides them through the various stages of an interview [emphasis added].
A while back, I ended up corresponding with somebody in the Scottish government about how misleading their self-help pages on skin disease were: they contained factual errors, and would mislead people seeking medical help. The content had clearly not been written by a medical practitioner — defined as somebody with domain clinical expertise who might have actually dealt with patients by shaking hands with them. Asking for validation studies or some sort of empirical evidence to support the content was unhelpful, as the content was supplied by another agency and was commercially ‘confidential’. I didn’t follow up because the person I corresponded with clearly knew that his own position was both untenable and uncomfortable. It’s just business: you know, ‘new ways of working’, ‘direction of travel’, and all those other vacuous suitcase terms that just mark a space where reason or domain expertise used to reside.
Rather than making clever machines, or allowing humans to do what only humans can do1, it seems we are content to make humans behave as stupidly as Excel spreadsheets. 111 is not for BoJo et al.; 111 is for poor people waiting to be levelled up, even if the best way to do that is to go straight to A&E. 2
I read about the QALY (quality adjusted life year) during my intercalated degree in 1980-81, when we were exposed to some health economics. It was considered new and interesting at the time. It took me about 10 minutes to sense that it was nonsense, even if I couldn’t quite put my feelings into words that quickly1. The goal was fine, but the methodology was metaphysical in nature rather than grounded in a world that you could touch with your fingers — at least not if you look at that world through the prism of the natural sciences.
Economists have a disturbing habit of confusing how the world works with their own (strange) ideas of rationality. If only the world could be said to work in a way that was amenable to their methods. When physicists wanted to estimate the speed of light they recognised that they had to create some theory and some technology in order to obtain the correct answer. Embarrassingly — at least from the economists’ point of view — they had to do some experiments and see if their answer made sense when applied to new observations in the external world. Until they had done this, they stayed shtum.
Not so, for our economists. Their solution is effectively to agree some conventions, and then define what the speed of light should be. Whether their theory explains the way the world really works is neither here nor there. So QALYs became a make-believe that suited both economists and the technocrats in government. The former, because the need for QALYs became a job creation scheme for health economists (just as evidence-based medicine (EBM) became a lifeline for all those epidemiologists who belatedly realised that much of their subject was methodologically deeply flawed). Technocratic governments liked what the economists brought them because it exiled judgement (and hence blame), allowing human suffering to be traded in arbitrage markets from which they could metaphorically wash their hands — ‘just following the science’, ‘just following the science’ (ring any bells?). Many politicians don’t want to do politics, but they do want to stay in power. As do economists2, who appear pathologically obsessed with rank and status3. The Economist had a nice line earlier this year germane to my doubts:
But unlike poets, economists prefer to quantify their analogies—to measure whether thou art 15% or 20% more lovely and more temperate.
But if you think that artificial models that cannot predict the world are still useful — useful in the way philosophers’ trolley problems are — then the quote below should indeed make you sit up and stare.
If we’re willing to pay $150,000 for each quality-adjusted extra year of life (a commonly used estimate), then we ought to view a 10% increase in spending per capita as a good investment if it extended average life expectancy by 2.5 days. That number may give readers pause — hence the importance of clarifying our spending priorities and focusing on care that produces real value for patients. With such a focus, we could feel more confident that higher health care spending was worth it.
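The arithmetic behind that quote is easy to check. A minimal back-of-envelope sketch — the $150,000-per-QALY figure comes from the quote, while the roughly $10,000-a-year per-capita US health spending figure is my own assumption for illustration, not stated in the text:

```python
# Back-of-envelope check of the arithmetic in the quote above.
VALUE_PER_QALY = 150_000    # dollars per quality-adjusted life year (from the quote)
PER_CAPITA_SPEND = 10_000   # dollars per person per year (assumed, for illustration)

extra_days = 2.5
value_of_gain = VALUE_PER_QALY * extra_days / 365   # dollar value of 2.5 extra days
cost_of_10pct = 0.10 * PER_CAPITA_SPEND             # cost of a 10% spending rise

print(f"value of 2.5 extra days of life: ${value_of_gain:,.0f} per person")
print(f"cost of a 10% spending increase: ${cost_of_10pct:,.0f} per person")
```

At these assumed figures the value of 2.5 extra days (about $1,027 per person) roughly matches the cost of a 10% spending rise ($1,000 per person) — which is the break-even logic that makes the quoted claim come out as a ‘good investment’.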
(Image of Notgeld (emergency money) at top of page from here)
“Doctors need three qualifications: to be able to lie and not get caught; to pretend to be honest; and to cause death without guilt.” So wrote Jean Froissart, a diarist of the Middle Ages, after an outbreak of bubonic plague in the 14th century. Fake news then meant rumours that the plague could be cured by sitting in a sewer, eating decade-old treacle or ingesting arsenic.
Someone in your family has fallen ill with a respiratory infection that has already killed large numbers. Your small house means that you do not have enough room to quarantine them. You have little money, and the hospitals are full. You contact the local public health authority.
Not to worry, you are told: A crew will be by shortly to set up a sturdy, well-ventilated, portable, tiny house in your yard. Once installed, your family member will be free to convalesce in comfort. You can deliver home-cooked meals to their door and communicate through open windows — and a trained nurse will be by for regular examinations. And no, there will be no charge for the house.
A fascinating story by Naomi Klein in the Intercept. Seemingly from a time when government knew what government was for.
This is not a dispatch from some future functional United States, one with a government capable of caring for its people in the midst of spiraling economic carnage and a public health emergency. It’s a dispatch from this country’s past, a time eight decades ago when it similarly found itself in the two-fisted grip of an even deeper economic crisis (the Great Depression), and a surging contagious respiratory illness (tuberculosis).
Whenever I have looked at the CVs of many young doctors or medical students, I have often felt saddened at what I take to be the hurdles that many of them have had to jump through to get into medical school. I don’t mean the exams — although there is lots of empty signalling there too — but the enforced attempts to demonstrate that you are a caring person, committed to the NHS or charity sector. I had none of that; nor do I believe it counts for much when you actually become a doctor1. I think it enforces a certain conformity and limits the social breadth of the intake to medical school.
However, I did work outside school before going to university, in a variety of jobs from the age of 14 upwards: a greengrocer’s shop on Saturdays, a chip shop (4-11pm on Sundays), a pub (living in for a while 😃), a few weeks on a pig-farm (awful) and my favourite, working at a couple of petrol stations (7am-10pm). These jobs were a great introduction to the black economy and to how wonderfully inventive humanity — criminal humanity — can be. Naturally, I was not tempted 😇. Those in the know would even tell you about other types of fraud in different industries, and that people actually got awarded PhDs by studying and documenting the sociology of these structures (Is that why you are going to uni? I was once asked).
On the theme of that newest of crime genres — cybercrime — there is a wonderful podcast reminding you that if much capitalism is criminal, there is criminal and there is criminal. But many of the iconic structures of modern capitalism — specialisation, outsourcing and the importance of the boundaries between firm and non-firm — are there. Well worth a listen.
I think there is a danger in exaggerating the role of caring and compassion in medicine. I am not saying you do not need them, but rather that I think they are less important than the technical (or professional) skills that are essential for modern medical practice. I want to be treated by people who know how to assess a situation and who can judge with cold reason the results of administering or withholding an intervention. If doctors were once labelled priests with stethoscopes, I want less of the priest bit. Where I think there are faults is in the idea that you can contribute most to humanity by ‘just caring’. The Economist a while back reported on an initiative from the Centre for Effective Altruism in Oxford. The project, labelled the 80,000 Hours initiative, advises people on which careers they should choose in order to maximise their impact on the world. Impact should be judged not on how much a particular profession does, but on how much a person can do as an individual. Here is a quote relating to medicine:
Medicine is another obvious profession for do-gooders. It is not one, however, on which 80,000 Hours is very keen. Rich countries have plenty of doctors, and even the best clinicians can see only one patient at a time. So the impact that a single doctor will have is minimal. Gregory Lewis, a public-health researcher, estimates that adding an additional doctor to America’s labour supply would yield health benefits equivalent to only around four lives saved.
The typical medical student, however, should expect to save closer to no lives at all. Entrance to medical school is competitive. So a student who is accepted would not increase a given country’s total stock of doctors. Instead, she would merely be taking the place of someone who is slightly less qualified. Doctors, though, do make good money, especially in America. A plastic surgeon who donates half of her earnings to charity will probably have much bigger social impact on the margin than an emergency-room doctor who donates none.
Yes, the ‘slightly less qualified’ makes me nervous.
Henry Miller died a few months before I started medical school in Newcastle in 1976. At the time of his death he was VC of the university, having been Dean of Medicine and Professor of Neurology. By today’s standards he was a larger-than-life figure. I like reading what he said about medical education, although with hindsight I think he was wrong about many if not most things. But there was a freshness and a sense of spirited independence of mind in his writing that we no longer see in those who run our universities (with some notable exceptions such as Louise Richardson). In the time of COVID we should remember the costs of conformity and patronage.
It would be naive to express surprise at the equanimity with which successive governments have regarded the deteriorating hospital service, since it is in the nature of governments to ignore inconvenient situations until they become scandalous enough to excite powerful public pressure. Nor, perhaps, should one expect patients to be more demanding: their uncomplaining stoicism springs from ignorance and fear rather than fortitude; they are mostly grateful for what they receive and do not know how far it falls short of what is possible. It is less easy to forgive ourselves… Indeed election as president of a college, a vice chancellor, or a member of the University Grants Committee usually spells an inevitable preoccupation with the politically practicable, an insidious identification with central authority, and a change of role from informed critic to uncomfortable apologist.
Originally published in the Lancet, 1966, 2, 647–54. (This version from ‘Remembering Henry’, edited by Stephen Lock and Heather Windle.)
The alternative to science is academic politics, where persistent disagreement is encouraged as a way to create distinctive sub-group identities.
The usual way to protect a scientific discussion from the factionalism of academic politics is to exclude people who opt out of the norms of science. The challenge lies in knowing how to identify them.
I can go along with both, but it is in the details that the daemons feast. It appears to me that the ‘norms of science’ argument is itself problematic, reminding me of those silly things you learn at school about the scientific method 1. The historical origin of the concept of the scientific method owed more to attempts to brand certain activities in the eyes of those who were not practising scientists 2. As a rough approximation, the people who talk about the scientific method tend not to do science. Of course, in more recent times, the use of the term ‘science’ itself has been a flag for obtaining funding, status or approval. Dermatology is now dermatological sciences; pharmacology is now pharmacological sciences. Even more absurd, in the medical literature I see the term delivery science (and I don’t mean Amazon), or reproducibility science. The demarcation of science from non-science is a hard philosophical problem going back way before Popper; I will not solve it. The danger is that we might end up exiling all those meaningful areas of human rationality that we once — rightly — considered outwith science, but still valued. There is indeed a subject that we might reasonably call medical science(s). It is just not synonymous with the principles and practice of medicine. It is also why political economy is a more useful subject than economics (or worse still, economic sciences).
As with HIV, “an epidemic reveals the fault lines in society. The big one this epidemic has revealed is how we treat the elderly. We often park them in pre-mortuary type institutions and give a bit of money and hope it is OK”.
When the tide goes out you see who is not wearing bathing costumes…
Once there was General Practice, medicine in the image of the late and great Julian Tudor-Hart. Then there was Primary Care. The following article from Pulse made me sit up and wonder whether we have got it right.
Under the five-year contract announced last year, networks were to receive 70% of the funding to employ a pharmacist, a paramedic, a physiotherapist and a physician associate, and 100% of the funding for hiring a social prescriber, by 2023/24… Six more roles will now be added to the scheme from April ‘at the request of PCN clinical directors’ – pharmacy technicians, care co-ordinators, health coaches, dietitians, podiatrists and occupational therapists… PCNs can choose to recruit from the expanded list to ‘make up the workforce they need’… The document added that mental health professionals, including Improving Access to Psychological Therapy (IAPT) therapists, will be added from April 2021 following current pilots… NHS England will also explore the feasibility of adding advanced nurse practitioners (ANPs) to the scheme [emphasis added].
Adam Smith among others pointed out the advantages of specialisation. We owe virtually all of the modern capitalist world to the power of this insight. But we also know that there are opposing forces — and not just those of the Luddites. Just think back to Ronald Coase and the theory of the firm. Why do companies not outsource everything? Why are there companies at all? Simply because under some circumstances transaction costs and the formalisation of roles and contracts limit outsourcing 1. Contra the English approach is that of Buurtzorg (links here, here and here) in the Netherlands, where it is explicit that many of the tasks undertaken by highly skilled staff do not require high-level skills. But — so the argument goes — the approach is more successful, robust and rewarding for both patients and staff. This is closer to the Tudor-Hart model. It really does depend on what sort of widgets you are dealing with, and whether fragmentation of activity improves outcomes, or merely diminishes costs in situations where outcomes are hard to define in an Excel spreadsheet.