Bruce Charlton pointed me to this entry in Wikipedia on Seymour Cray
“One story has it that when Cray was asked by management to provide detailed one-year and five-year plans for his next machine, he simply wrote, “Five-year goal: Build the biggest computer in the world. One year goal: One-fifth of the above.” And another time, when expected to write a multi-page detailed status report for the company executives, Cray’s two-sentence report read: “Activity is progressing satisfactorily as outlined under the June plan. There have been no significant changes or deviations from the June plan.”
Which brings to mind Sydney Brenner’s comment that requests for research grant funding will eventually resemble flow diagrams recording who reports to whom.
We are living in dark times, and since I have been sifting through the ashes of a career, it is no surprise that failures signal through like radioactive tracers. Below is one.
Through most of my career I have been interested in the relation between science and medicine. In truth, if what matters is what you think about in the shower, I have been more interested in the relation between science and medicine than in either activity in isolation. If I were to pick a phrase to describe my focus (although it is a term I would not have used then), it would be the epistemological foundations of medical practice. Pompous, I agree. I could use another phrase: what makes medicine and doctors useful? Thinking about statistical inference is a part of this topic, but there is much more to explore.
These issues became closer to my consciousness soon after I moved to Edinburgh. My ideas about what was going on were not shared by many locally, and I was nervous about going public in person rather than in print at a Symposium hosted by the Royal College of Physicians of Edinburgh. My nervousness was well founded: whilst I liked my abstract, my talk went down badly. Not least because it was truly dreadful (and the evident failure still rankles). Jan Vandenbroucke, one of the other speakers and somebody whose work I greatly admire (his paper in the Lancet, Homoeopathy trials: Going nowhere. [Lancet.1997;350:824], was to me the most important paper published in the Lancet in the 1990s), said some kind words to me afterwards, muttering that I had tried to say far too much to an audience that was ill prepared for my speculations. All true, but he was just being kind. It was worse than that.
Anyway, some tidying up deep in my hard drive surfaced the abstract. I still like it, but it is a shame that at the appropriate time I was unable to explain why.
JAMES LIND SYMPOSIUM: From scurvy to systematic reviews and clinical guidelines: how can clinical research lead to better patient care? (31-10-2003, RCPE Edinburgh)
There are three great branches of science: theory, experiment, and computation. (Nick Trefethen)
Advance in the mid-third of the twentieth century, the golden age of medical research, was predicated on earlier discoveries in the nineteenth century in both physiology and medicinal chemistry (1). Genetics dominated biology in the latter third of the twentieth century and many believe changes in medical practice will owe much to genetics over the next third-century (1). I disagree, and I will give an alternative view more credence: in 30 years’ time we will look back more to von Neumann and Morgenstern than we will to Watson and Crick. What the Nobel laureate Herbert Simon referred to as The Sciences of the Artificial (2), subjects which have largely been peripheral to medicine, will become central.
Over the last 20 years we have seen the first (largely inadequate, I would add) attempts to explicitly demarcate methods of obtaining and promulgating knowledge about clinical practice (3,4). This has usually taken the form of proselytising a particular set of terms – systematic reviews, evidence-based practice, guidelines and the like – terms that have little rigour or much else to commend them. What is interesting, however, is that they reflect a long overdue renaissance of interest in the practice of medicine and medical epistemology.
The change of emphasis from the natural to the artificial is being driven by a number of forces, mostly extraneous to biomedicine: the increasing instrumental role of science in medicine and society; the increase in corporatisation of knowledge, whether by private corporations or monopsonistic institutions like the NHS (5); the rising costs of healthcare; and a remaining inability to frame questions with broad support about how to choose between alternative disease states at the level of society (6,7).
I will try to illustrate some of these issues by the use of three examples. First, the widespread use of a mode of statistical inference largely ill-suited to medicine, namely Neyman-Pearson hypothesis testing (decision-making), and the way in which this paradigm has been used to undermine expert opinion (8). Second, I will argue that we need to think much harder about clinical practice and fashion a more appropriate theoretical underpinning for clinical behaviour. Third, I will suggest how UK medical schools, in so far as they remain interested in clinical practice, should look to alternative models, perhaps business and law schools, for ideas of how they should operate (2).
Afterword. The symposium used structured abstracts, a habit that might have a place somewhere in this galaxy, but out of choice I would prefer to live in another one. Anyway, in the published version, it reads:
A fair cop.
Alfred North Whitehead: “Some of the major disasters of mankind have been produced by the narrowness of men with a good methodology” (The Function of Reason).
Comments that seem germane to some of our current day covid-19 debates.
People are always demanding that medical students must learn this or that (obesity, psychiatry, dermatology, ID, eating disorders). The result is curriculum overload, a default in favour of rote learning by many students, and the inhibition of curiosity. It was not meant to be like this, but the GMC, the NHS, and others have pushed a vision of university medical education that shortchanges both the students and medical practice over the long term. Short-termism rules. Instead of producing graduates who are ready to learn clinical medicine in an area of their choice, we expect them to somehow come out oven-ready at graduation. I do not believe it is possible to do this to a level of safety that many other professions demand, nor is this the primary job of a university. Sadly, universities have given up on arguing, intimidated by the government and their regulatory commissars, and nervous of losing their monopoly on producing doctors.
But I will make a plea that one area really does deserve more attention within a university: the history of how medical advance occurs. No, I do not mean MCQs asking for the date of birth of Robert Koch or Lord Lister, but a feel for the historical interplay of convention and novelty. Without this our students and our graduates are almost confined to living in the present, unaware of the past, and unable to imagine how different the future will be. Below is one example.
“In 1938 Albert Hofmann, a chemist at the Sandoz Laboratories in Basel, created a series of new compounds from lysergic acid. One of them, later marketed as Hydergine, showed great potential for the treatment of cerebral arteriosclerosis. Another salt, the diethylamide (LSD), he put to one side, but he had “a peculiar presentiment,” as he put it in his memoir LSD: My Problem Child (1980), “that this substance could possess properties other than those established in the first investigations.”
In 1943 he prepared a fresh batch of LSD. In the final process of its crystallization, he started to experience strange sensations. He described his first inadvertent “trip” in a letter to his supervisor:
At home I lay down and sank into a not unpleasant, intoxicated-like condition, characterized by extremely stimulated imagination. In a dream-like state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.
After eliminating chloroform fumes as a possible cause, he concluded that a tiny quantity of LSD absorbed through the skin of his fingertips must have been responsible. Three days later he began a program of unsanctioned research and deliberately ingested 250 micrograms of LSD at 4:20 PM. Forty minutes later, he wrote in his lab journal, “Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.” He set off home on his bicycle, accompanied by his laboratory assistant. This formal trial of what Hofmann considered a minute dose of LSD had more distressing effects than his first chance exposure:
Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa…. I was taken to another world, another place, another time.
A doctor was summoned but found nothing amiss apart from a marked dilation of his pupils. A fear of impending death gradually faded as the drug’s effect lessened, and after some hours Hofmann was seeing surreal colors and enjoying the play of shapes before his eyes.
Many editors of learned medical journals now automatically turn down publications describing the sort of scientific investigation that Albert Hofmann carried out on himself. Institutional review boards are often scathing in their criticism of self-experimentation, despite its hallowed tradition in medicine, because they consider it subjective and biased. But the human desire to alter consciousness and enrich self-awareness shows no sign of receding, and someone must always go first. As long as care and diligence accompany the sort of personal research conducted by Pollan and Lin, it has the potential to be as revealing and informative as any work on psychedelic drugs conducted within the rigid confines of universities.
Richard Horton in the Lancet writes:
Imagine if the entire edifice of knowledge in medicine was built upon a falsehood. Systematic reviews are said to be the highest standard of evidence-based health care. Regularly updated to ensure that treatment decisions are built on the most up-to-date and reliable science, systematic reviews and meta-analyses are widely used to inform clinical guidelines and decision making. Powerful organisations have emerged to construct a knowledge base in medicine underpinned by the results of systematic reviews. One such organisation is Cochrane, with 11 000 members in over 130 countries. This extraordinary movement of people is justifiably passionate about the idea that it is contributing to better health outcomes for everyone, everywhere. The industry that drives the production of systematic reviews today is financed by some of the most influential agencies in medical research. Cochrane, for example, points to three funders providing over £1 million each—the UK’s National Institute for Health Research (NIHR), the US National Institutes of Health (NIH), and Australia’s National Health and Medical Research Council (NHMRC).
Well, it really is a bit late for all this soul-searching. See my earlier post here ‘Mega-silliness’ (commenting on what others had already pointed out); or my Evidence Based Medicine: the Epistemology That Isn’t, written over 20 years ago; and my contribution to the wake (even if I didn’t put my hand in my pocket), Why we should let “evidence-based medicine” rest in peace. The genesis of EBM was as a cult whose foundational myth was that P values could act as a truth machine. Those followers who had originally hoped for a place in the promised afterlife soon settled for paying the bills, and EBM morphed into a career opportunity for those who found accountancy too daring. So, pace John Mayall on Jazz Blues Fusion, don’t come here to listen to an old record. I promise.
Two letters in Lancet Oncology. This bloody story never ends. We have not invented truth machines: judgement has never been exiled from discovery.
Stanley Cohen has died. A special place for those of us hooked on the ectoderm. Some nice comments about him in the Lancet from Geoff Watts.
A May, 1962, issue of the Journal of Biological Chemistry included a deceptively arcane study on the isolation of a protein that could accelerate incisor eruption and eyelid opening in newborn mice. The author, Stanley Cohen, later to become Professor Emeritus of Biochemistry at Vanderbilt University School of Medicine (VUSM) in Nashville, TN, USA, had named his protein “tooth-lid factor”. Cohen’s subsequent studies would not only lead him to rename the protein epidermal growth factor (EGF), but also mark him out as one of the founders of a new area of biology and eventually win him a Nobel Prize.
[says Lawrence Marnett], “When he came here he began studying some growth factors in animal cell extracts. One was of mouse submaxillary gland…It had peptides in it, and when he injected them into newborn mice their teeth broke through earlier than normal, and their eyelids opened sooner.” Cohen’s subsequent studies revealed that his extract worked by stimulating the growth of epidermal cells. Having consequently renamed the material EGF, he devoted the rest of his career to studying it. “He went on to identify the EGF receptor and define target cells that would respond to EGF”, recalls Graham Carpenter, Emeritus Professor of Biochemistry at VUSM, who joined Cohen’s lab in 1973 and worked with him on EGF as a postdoctoral fellow. The EGF receptor proved to be a useful target for drugs, and Cohen’s discoveries opened the door to research on diseases ranging from dementia to cancer. “He understood EGF’s biological importance”, says Carpenter. “But we did not have any idea that this would extend to cancer biology in a major way.”
And as for that most successful of all biology labs, the style of exploration is familiar.
[Graham Carpenter] “In contrast to today, his research group was very small, seldom more than four people—himself, two technicians, and a postdoc…He was central to whatever was going on in the lab.” [Lawrence] Marnett also recalls that determination: “He was one of those guys that was just driven by his desire to understand how things work…It was a classic example of making an observation and then drilling down to try to understand it, not knowing what you’re going to find.” And at that time there was plenty to be found. Cohen, as Marnett puts it, was basically “mining gold”.
Terrific article on Covid-19 (SARS-CoV-2) in the LRB by Rupert Beale. He says it was written in haste, but it doesn’t read that way. It contains some memorable lines.
As the US health secretary Michael Leavitt put it in 2006, ‘anything we say in advance of a pandemic happening is alarmist; anything we say afterwards is inadequate.’
And how do you think hard about research funding for the long term? (I am old enough to remember when stroke and dementia were virtually non-subjects as far as ‘good research funding’ was concerned.)
Virologists need more than clever tricks: we also need cash. Twenty years ago, funding wasn’t available to study coronaviruses. In 1999, avian infectious bronchitis virus was the one known truly nasty coronavirus pathogen. Only poultry farmers really cared about it, as it kills chickens but doesn’t infect people. In humans there are a number of fairly innocuous coronaviruses, such as OC43 and HKU1, which cause the ‘common cold’. Doctors don’t usually bother testing for them – you have a runny nose, so what?
And note the conditional tense:
The global case fatality rate is above 3 per cent at the moment, and if – reasonable worst case scenario – 30-70 per cent of the 7.8 billion people on earth are infected, that means between 70 and 165 million deaths. It would be the worst disaster in human history in terms of total lives lost. Nobody expects this, because everyone expects that people will comply with efficient public health measures put in place by responsible governments.
And to repeat my own mantra (stolen from elsewhere): the opposite of science is not art, but politics:
The situation isn’t helped by a president [Trump] who keeps suggesting that the virus isn’t that bad, it’s a bit like flu, we will have a vaccine soon: stopping flights from China was enough. Tony Fauci, the director of the National Institute of Allergy and Infectious Disease, deftly cut across Trump at a White House press briefing. No, it isn’t only as bad as flu, it’s far more dangerous. Yes, public health measures will have to be put in place and maintained for many months. No, a vaccine isn’t just around the corner, it will take at least 18 months. Fauci was then ordered to clear all his press briefings on Covid-19 with Mike Pence in advance: the vice president’s office is leading the US response to the virus. ‘You don’t want to go to war with a president,’ Fauci remarked.
And Beale ends by quoting an ID colleague.
This is not business as usual. This will be different from what anyone living has ever experienced. The closest comparator is 1918 influenza.
Caution: pace the author, ‘This is a fast-moving situation, and the numbers are constantly changing – certainly the ones I have given here will be out of date by the time you read this.’
Link. (London Review of Books: Vol. 42 No. 5, 5 March 2020: “Wash your Hands”: Rupert Beale)
I have spent a lot of time recently sifting through the detritus of a career. Finally — well, I hope, finally — I have managed to sort out my books. All neatly indexed in Delicious Library, and now for once the virtual location mirrors the physical location. For how long I do not know. Since I often buy books based on reviews, I used to put a copy of the review in with the book (a habit I have dropped but need to restart). I rediscovered this one by David Colquhoun (DC) reviewing ‘The Diet Delusion’ by Gary Taubes in the BMJ (with the unexpurgated text on his own web site).
I am a big fan of DC as he has lived through the rise and decline of much higher education in the UK. And he remains fearless and honest, qualities that are not always at the forefront of the modern university. Quoting the great Robert Merton he writes:
“The organization of science operates as a system of institutionalized vigilance, involving competitive cooperation. In such a system, scientists are at the ready to pick apart and assess each new claim to knowledge. This unending exchange of critical appraisal, of praise and punishment, is developed in science to a degree that makes the monitoring of children’s behavior by their parents seem little more than child’s play”.
“The institutionalized vigilance, ‘this unending exchange of critical judgment’, is nowhere to be found in the study of nutrition, chronic disease, and obesity, and it hasn’t been for decades.”
On Taubes and his (excellent book):
It took Taubes five years to write this book, and he has nothing to sell apart from his ideas. No wonder it is so much better than a scientist can produce. Such is the corruption of science by the cult of managerialism that no university would allow you to spend five years on a book.
(as would be expected the BMJ omitted the punch line — they would, wouldn’t they?)
There is also a neat quote from Taubes in one of the comments on DC’s page from Beth@IDblog, one that I will try hard not to forget:
Taubes makes a point at the end of the Dartmouth medical grand rounds video that I think is important: “I’m not trying to convince you that it’s true, I’m trying to convince you that it should be taken seriously.”
Today is my last day of (paid) work, and of course a day that will be infamous for many more people for other more important reasons. Europe and my professional life have been intertwined for near on 40 years. In the mid 1980s I went to start my dermatological career in Vienna. I had been a student at Newcastle and done junior jobs there, as well as some research on skin physiology with Sam Shuster as an undergraduate student. Sam rightly thought I should now move somewhere else — see how others did things before returning — and he suggested Paris, or Vienna under Klaus Wolff. Vienna was, and perhaps still is, the centre of the dermatological universe, and has been since the mid 19th century. Now, even if I haven’t got very far into this post — it is a day for nostalgia — allow me an aside: the German Literature Problem.
As I have hinted at above, in many ways there have only been two schools of dermatology: the French school, and the German school. The latter has been dominant. Throughout the second half of the 19th century dermatology was a ‘German speaking’ subject. To follow it you would be wise to know German, and better still to have visited the big centres in Germany, Switzerland or Austria. And like most of the modern research university, German medicine and science was the blueprint for the US and then belatedly — and with typos— for England (Scotland, reasonably, had taken a slightly different path).
All of the above I knew, but when I returned to Newcastle after my first sojourn away (a later one was to Strasbourg), I naturally picked up on all these allusions to the German literature, but they were accompanied by sniggering by those who had been around longer than me. Indeed there seemed to be a ‘German Literature Problem’. Unbeknown to me, Sam had written “das Problem” up in ‘World Medicine’, but World Medicine had been killed off by those from Mordor, so here is a synopsis.
The German literature seemed so vast that whenever somebody described a patient with what they were convinced must be a ‘new syndrome’, some bright spark would say that it had already been described, and that it was to be found in the German literature. Now the synoptic Handbuch der Hautkrankheiten on our shelves in the library in Newcastle ran to over 10 weighty volumes. And that was just the start. But of course only German speaking dermatologists (and we had one) could meaningfully engage in this conversation. Dermatology is enough of a nightmare even in your own mother tongue. Up to the middle of the 20th century, however, there were indeed separate literatures in German, French and English (in the 1960s the newly formed ESDR had to sort out what language was going to be used for its presentations).
Sam’s sense of play now took over (with apologies to Wilde: nothing succeeds like excess). It appeared that all of dermatology had already been previously described, but more worryingly for the researchers, the same might be true of skin science. In his article in World Medicine he set out to describe his meta-science investigation into this strange phenomenon. Sam has an unrivalled ability to take simple abstract objects — a few straight lines, a circle, a square — and meld them into an argument in the form of an Escher print. A print that you know is both real, unreal and illegal. Imagine a dastardly difficult 5 x 5 Rubik’s cube (such as the one my colleagues recently bought me for my retirement). You move and move and move the individual facets, then check each whole face in turn. All aligned, problem solved. But then you look in the mirror: whilst the faces are all perfect in your own hands, that is not what is apparent in the mirror. This is my metaphor for Sam’s explanation. Make it any more explicit, and the German literature problem rears its head. It’s real — of a sort. Anyway, this was all in the future (which didn’t exist at that time), so let’s get back to Vienna.
Having left general medical jobs behind in Newcastle, armed with my BBC language tapes and guides, I spent a month travelling through Germany from north to south. I stayed with a handful of German medical students who I had taught in Newcastle when I was a medical registrar (a small number of such students used to spend a year with us in Newcastle). Our roles were now reversed: they were now my teachers. At the end of the month I caught the night train in Ulm, arriving in Vienna early one morning.
Vienna was majestic — stiff collared, yes — but you felt in the heart of Europe. A bit of Paris, some of Berlin and the feel of what lay further east: “Wien ist die erste Balkanstadt”. For me, it was unmistakably and wonderfully foreign.
It was of course great for music, too. No, I couldn’t afford the New Year’s Day Concerts, but there were cheap seats at the Staatsoper, more modest prices at the Volksoper, and more to my taste, some European jazz and rock music. I saw Ultravox sing — yes, what else — “Vienna” in Vienna. I saw some people from the ECM label (eg Eberhard Weber), a style of European jazz music that has stayed with me since my mid teens. And then there was the man (for me) behind ‘The Thrill is Gone’.
I saw BB King on a double bill with Miles Davis at the Stadthalle. Two very different styles of musician. I was more into Miles Davis then, but he was not then at his best (as medics in Vienna found out). I was, however, very familiar with the ‘Kings’ (BB, Freddie, Albert etc) after being introduced to them via their English interpreters. Clapton’s blues tone on ‘All Your Love’ with John Mayall’s Bluesbreakers still makes the hairs on my neck stand up (fraternal thanks to ‘Big Al’ for the introduction).
The YouTube video at the top of the page is wonderful (Montreux 1993), but there is a later one below, taken from Crossroads in 2010 which moves me even more. He is older, playing with a bunch of craftsmen, but all still pupils before the master.
But — I am getting there — germane to my melancholia on this day is a video featuring BB King and John Mayer. Now there is a trope that there are two groups of people who like John Mayer: girlfriends; and guitarists who understand just how bloody good he is. As EC pointed out, the problem with John Mayer is that he realises just how good he is. True.
But the banter at the beginning of the video speaks some eternal truths about craft, expertise, and the onward march of all culture — including science. Mayer plays a few BB King licks, teasing King that he is ‘stealing’ them. He continues, it was as though he was ‘stealing silverware from somebody’s house right in front of them’. King replies: ”You keep that up and I’m going to get up and go”. Both know it doesn’t work that way. Whatever the provenance of the phrase ‘great artists steal, not copy’, except in the most trivial way you cannot steal or copy culture: people discover it in themselves by stealing what masters show them might be there in their pupils. Teachers just help people find what they suspect or hope is there. The baton gets handed on. The thrill goes on. And on.
Well, I doubt if any readers of these scribblings will be shocked. After all TIJABP. But this piece by the editor of PNAS wonders if the day of meaningful editing is over. I hope not. Looking backwards over my several hundred papers, the American Journal of Human Genetics was the most rigorous and did the most to improve our manuscript.
“Communication” remains in the vocabulary of scientific publishing—for example, as a category of manuscript (“Rapid Communications”) and as an element of a journal name (Nature Communications)—not as a vestigial remnant but as a vital part of the enterprise. The goal of communicating effectively is also why grammar, with its arcane, baffling, or even irritating “rules,” continues to matter. With the rise of digital publishing, attendant demands for economy and immediacy have diminished the role of copyeditor. The demands are particularly acute in journalism. As The New York Times editorial board member Lawrence Downs (4) lamented, “…in that world of the perpetual present tense—post it now, fix it later, update constantly—old-time, persnickety editing may be a luxury…. It will be an artisanal product, like monastery honey and wooden yachts.” Scientific publishing is catching up to journalism in this regard.
Being a renowned scientist doesn’t ensure success. On the same day that molecular biologist Carol Greider won a Nobel prize in 2009, she learnt that her recently submitted grant proposal had been rejected. “Even on the day when you win the Nobel prize,” she said in a 2017 graduation speech at Cold Spring Harbor Laboratory in New York, “sceptics may question whether you really know what you’re doing.”
My earliest conscious memory of disease and doctors was in the management of my atopic dermatitis. Here is Sam Shuster writing poetically about atopic dermatitis in ‘World Medicine’ in 1983.
A dozen years of agony; years of sleeplessness for child and parents, years of weeping, itching, scaling skin, the look and feel of which is detested.
The poverty of our treatments is made all the worse by the unfair raising of expectations: I don’t mean by obvious folk remedies; I mean medical folk remedies like the recent pseudoscientific dietary treatments which eliminate irrelevant allergens. There neither is nor ever was good evidence for a dietary mechanism. And as for cows’ milk, I would willingly drown its proponents in it. We have nothing fundamental for the misery of atopic eczema and that’s why I would like to see a real treatment—not one of those media breakthroughs, and not another of those hope raising nonsenses like breast-feeding: I mean a real and monstrously effective treatment. Not one of your P<.05 drugs the effect of which can only be seen if you keep your eyes firmly double-blind, I mean a straightforward here today and gone tomorrow job, an Aladdin’s salve—put it on and you have a new skin for old.
Nothing would please me more in the practice of clinical dermatology than never again to see a child tearing its skin to shreds and not knowing how long it will be before it all stops, if indeed it does.
Things are indeed better now, but not as much as we need: we still don’t understand the itch, nor can we easily block the neural pathways involved. Nor has anything replaced ‘World Medicine’ since its untimely murder. A glass of milk has never looked the same since, either.
There is an interesting review in the Economist of ‘The Great Pretender: The Undercover Mission that Changed our Understanding of Madness’, written by Susannah Cahalan. The book is the story of the American psychologist David Rosenhan who “recruited seven volunteers to join him in feigning mental illness, to expose what he called the ‘undoubtedly counter-therapeutic’ culture of his country’s psychiatry”.
Rosenhan’s studies are well known and were influential, and some might argue that they had a beneficial effect on subsequent patient care. The question is whether they were true. The review states:
“in the end Rosenhan emerges as an unpalatable symptom of a wider academic malaise”.
As for the ‘malaise’, the reviewer goes on:
Many of psychology’s most famous experiments have recently been discredited or devalued, the author notes. Immense significance has been attached to Stanley Milgram’s shock tests and Philip Zimbardo’s Stanford prison experiment, yet later re-runs have failed to reproduce their findings. As Ms Cahalan laments, the feverish reports on the undermining of such theories are a gift to people who would like to discredit science itself.
I have a few disjointed thoughts on this. There are plenty of other considered critiques of the excesses of modern medical psychiatry. Anthony Clare’s ‘Psychiatry in Dissent’ was for me the best introduction to psychiatry. And Stuart Sutherland’s ‘Breakdown’ was a blistering and highly readable attack on medical (in)competence as much as the subject itself (Sutherland was a leading experimental psychologist, and his account is autobiographical). And might the cross-country diagnostic criteria studies not have happened without Rosenhan’s work?
As for undermining science (see the quote above), I think unreliable medical science is widespread, and possibly there is more of it than in many past periods. Simple repetition of experiments is important but not sufficient, and betrays a lack of understanding of why some science is so powerful.
Science owes its success to its social organisation: conjectures and refutations, to use Popper’s terms, within a community. Just repeating an experiment under identical conditions is not sufficient. Rather, you need to use the results of one experiment to inform the next, and with the accumulation of new results, you need to build a larger and larger edifice which, whilst having greater explanatory power, is more and more intolerant of errors at any level. Building large structures out of Lego only works because of the precision engineering of each of the component bricks. But any errors only become apparent when you add brick on brick. When a single investigator or group of investigators has skin in the game during this process — and where experimentation is possible — science is at its strongest (the critiques can of course come from anywhere).
An alternative process is when the results of a series of experiments are so precise and robust that everyday life confirms them: the lights go on when I click the switch. This harks back to the reporting of science as ‘demonstrations’.
By these two standards much medical science may be unreliable. First, because the fragmentation of enquiry discourages the creation of broad explanatory theories or tests of the underlying hypotheses. The ‘testing’ is more whether a publishable unit can be achieved rather than nature understood. Second, in many RCTs or technology assessments there is little theoretical framework on which to challenge nature. Nor can everyday practice act as the necessary feedback loop in the way the tight temporal relationship between flipping the switch and seeing the light turn on can.
I spent near on ten years thinking about automated skin cancer detection. There are various approaches you might use — cyborg human/machine hybrids were my personal favourite — but we settled on more standard machine learning approaches. Conceptually, what you need is straightforward: data to learn from, and ways to leverage that historical data against future examples. The following quote is apposite.
One is that, for all the advances in machine learning, machines are still not very good at learning. Most humans need a few dozen hours to master driving. Waymo’s cars have had over 10m miles of practice, and still fall short. And once humans have learned to drive, even on the easy streets of Phoenix, they can, with a little effort, apply that knowledge anywhere, rapidly learning to adapt their skills to rush-hour Bangkok or a gravel-track in rural Greece.
You see exactly the same thing with skin cancer. With a relatively small number of examples, you can train (human) novices to be much better than most doctors. By contrast, with the machines you need literally hundreds of thousands of examples. Even when you start with large databases, as you parse the diagnostic groups, you quickly find that for many ‘types’ you have only a few examples to learn from. The rate-limiting factor becomes acquiring mega-databases cheaply. The best way to do this is to change data acquisition from a ‘research task’ into a matter of grabbing data that was collected routinely for other purposes (there is a lot of money in digital waste — ask Google).
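To make the long tail concrete, here is a minimal sketch, with entirely made-up counts for hypothetical diagnostic groups, of what happens when you parse a seemingly large image database by diagnosis: a couple of common lesions dominate, and many ‘types’ are left with only a handful of training examples.

```python
from collections import Counter

# Hypothetical label list standing in for a dermatology image database.
# The counts below are invented for illustration only.
labels = (
    ["melanocytic naevus"] * 6000
    + ["seborrhoeic keratosis"] * 2500
    + ["basal cell carcinoma"] * 900
    + ["melanoma"] * 400
    + ["dermatofibroma"] * 40
    + ["merkel cell carcinoma"] * 8
)

counts = Counter(labels)
total = sum(counts.values())

# Diagnostic groups with fewer than 50 examples to learn from:
# plenty for a human novice, hopeless for most machine learning.
rare = {diagnosis: n for diagnosis, n in counts.items() if n < 50}

for diagnosis, n in counts.most_common():
    print(f"{diagnosis:25s} {n:6d} ({100 * n / total:.2f}%)")
print(f"groups with <50 examples: {len(rare)} of {len(counts)}")
```

A database of nearly ten thousand images still leaves a third of the diagnostic groups with too few examples to train on, which is the point: the headline size of the database says little about coverage of the rare ‘types’.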
Noam Chomsky had a few statements germane to this and much else that gets in the way of such goals (1).
Plato’s problem: How can we know so much when the evidence is so slight?
Orwell’s problem: How do we remain so ignorant when the evidence is so overwhelming?
(1): Neil Smith, Noam Chomsky: Ideas and Ideals, Cambridge University Press, 1999.
Obituaries are a source of much joy and enlightenment. None more so than those in the Economist. Last week’s was devoted to the ’60s photographer Terry O’Neill (you can see some of his iconic images here).
Stars had been his subject since 1962, when he was sent to photograph a new band at the Abbey Road Studios. The older blokes at the Sketch scorned that kind of work, but the young were clearly on the rise, and he was by far the youngest photographer in Fleet Street at the time. At the studios, to get a better light, he took the group outside to snap them holding their guitars a bit defensively: John, Paul, George and Ringo. Next day’s Sketch was sold out, and he suddenly found himself with the run of London and all the coming bands, free to be as creative as he liked. A working-class kid from Romford whose prospects had been either the priesthood or a job in the Dagenham car plant, like his dad, had the world at his feet. He wouldn’t have had a prayer, he thought, in any other era.
And obviously it couldn’t last. In a couple of years he would find a proper job, as both the Beatles and the Stones told him they were going to. For it was hardly serious work to point your Leica at someone and go snap, snap.
The reason I found this particularly interesting is the way social mobility appeared to work and the way it was tied to genuine innovation and social change. I have always loved the trope that when jobs are plentiful, and your commitments minimal, you can literally tell the boss to FO on a Friday and start another job on the Monday. Best of all, you can experiment, and experimentation lifts all. This to me is one of the best 1960s rock n’ roll stories.
If you lift your head above the parapet in universities you come across various conventional wisdoms. One relates to ‘mental wellbeing’ or ‘mental health issues’, and another is the value of education in increasing social mobility. My problem is that in both cases there seem (to me at least) many important questions that remain unanswered. For the former, are we talking about mental illness (as in disease) or something else? How robust are the data — aside from self-reporting? The widely reported comments from the former President of the Royal College of Psychiatrists receive no answer (at least not in my institution). An example: I have sat in a meeting in which one justification for ‘lecture capture’ (recording of live lectures) was to assist students with ‘mental health issues’. But do the recordings help in this context? Do we trust self-reflection in this area? Under what conditions do we think they help or harm?
Enhancing life chances and social mobility is yet another area that I find difficult. I picked up on a comment from Martin Wolf in the FT
We also believe that changing individual characteristics, principally via education, will increase social mobility. But this is largely untrue. We need to be far more honest.
He was referring to the work of John Goldthorpe in Oxford. Digging just a little beneath the surface made me realise that much of what I had believed may not be true. Goldthorpe writes:
However, a significant change has occurred in that while earlier, in what has become known as the golden age of mobility, social ascent predominated over social descent, the experience of upward mobility is now becoming less common and that of downward mobility more common. In this sense, young people today face less favourable mobility prospects than did their parents or their grandparents.
This research indicates that the only recent change of note is that the rising rates of upward, absolute mobility of the middle decades of the last century have levelled out. Relative rates have remained more or less constant back to the interwar years. According to this alternative view, what can be achieved through education, whether in regard to absolute or relative mobility, appears limited.
[Jnl Soc. Pol. (2013), 42, 3, 431–450 Cambridge University Press 2013 doi:10.1017/S004727941300024X]
There is a witty exchange in Prospect between the journalist (JD) and Goldthorpe (JG).
JD: Would you say that this is something that politicians, in particular, tend not to grasp?
JG: Yes. Tony Blair, for instance, was totally confused about this distinction [between absolute and relative rates of mobility]. He couldnʼt see that the only way you can have more upward mobility in a relative perspective is if you have more downward mobility at the same time. I remember being in a discussion in the Cabinet Office when Geoff Mulgan was one of Blairʼs leading advisors. It took a long time to get across to Mulgan the distinction between absolute and relative rates, but in the end he got it. His response was: “The Prime Minister canʼt go to the country on the promise of downward mobility!”
On both these topics I am conflicted. And on both these topics there are the tools that characterize scholarly inquiry to help guide action: this is what universities should be about. I am however left with a strong suspicion that few are interested in digging deep, rather we choose sound bites over understanding. Working in a university often feels like the university must be somewhere else. That is the optimistic version.
There is an article this week in Nature about how some funders are explicitly funding grant proposals randomly (lotteries). The cynic might say they have been doing this for a long time.
Frank Davidoff had a telling phrase about clinical expertise. He likened it to “Dark Matter”. Dark Matter makes up most of the universe, but we know very little about it. In the clinical arena I have spent a lot of time reading and thinking about ‘expertise’, without developing any grand unifying themes of my own worth sharing. But we live in a world where ‘expertise’ in many domains is under assault, and I have no wise thoughts to pull together what is happening. I do however like (as ever) some nice phrases from Paul Graham. I can’t see any roadmap here, just perspectives and shadows.
When experts are wrong, it’s often because they’re experts on an earlier version of the world.
Instead of trying to point yourself in the right direction, admit you have no idea what the right direction is, and try instead to be super sensitive to the winds of change.
Putt’s Law: “Technology is dominated by two types of people, those who understand what they do not manage and those who manage what they do not understand.”
Putt’s Corollary: “Every technical hierarchy, in time, develops a competence inversion”, with incompetence being “flushed out of the lower levels” of a technocratic hierarchy, ensuring that technically competent people remain directly in charge of the actual technology while those without technical competence move into management.
The quote below is from a paper in PNAS on how students misjudge their learning and what strategies maximise learning. The findings are not surprising (IMHO) but will, I guess, continue to be overlooked (NSS anybody?). As I mention below, it is the general point that concerns me.
Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom.
In this report, we identify an inherent student bias against active learning that can limit its effectiveness and may hinder the wide adoption of these methods. Compared with students in traditional lectures, students in active classes perceived that they learned less, while in reality they learned more. Students rated the quality of instruction in passive lectures more highly, and they expressed a preference to have “all of their physics classes taught this way,” even though their scores on independent tests of learning were lower than those in actively taught classrooms. These findings are consistent with the observations that novices in a subject are poor judges of their own competence (27⇓–29), and the cognitive fluency of lectures can be misleading (30, 31). Our findings also suggest that novice students may not accurately assess the changes in their own learning that follow from their experience in a class.
The authors go on:
These results also suggest that student evaluations of teaching should be used with caution as they rely on students’ perceptions of learning and could inadvertently favor inferior passive teaching methods over research-based active pedagogical approaches….
As I say above, it is the general rather than the particular that concerns me. Experience and feeling are often poor guides to action. We are, after all, creatures that represent biology’s attempt to see whether contemplation can triumph over reflex. There remains a fundamental asymmetry between expert and novice, and if there isn’t, there is little worth learning (or indeed worth paying for).
The following is from an advert for a clinical academic in a surgical specialty, one with significant on call responsibilities. (It is not from Edinburgh).
‘you will be able to define, develop, and establish a high quality patient-centred research programme’
‘in addition to the above, you will be expected to raise substantial research income and deliver excellent research outputs’
Leaving aside the debasement of language, I simply cannot believe such jobs are viable long term. Many years ago, I was looked after by a surgical academic. A few years later he/she moved to another centre, and I was puzzled as to why he/she had made this career move. I queried an NHS surgeon in the same hospital about this career path. “Bad outcomes”, was the response. She/He needed a clean start somewhere else…
Traditional non-clinical academic careers include research, teaching and administration. Increasingly it is recognised that it is rarely possible to do all three well. For clinical academics the situation is worse, as 50% of your time is supposed to be devoted to providing patient care. Over time the NHS workload has become more onerous: consultants enjoy less support from junior doctors, and NHS hospitals have become much less efficient.
All sorts of legitimate questions can be asked about the relation between expertise and how much of your time is devoted to that particular role. For craft specialities — and I would include dermatology, pathology, radiology in this category — there may be ways to stay competent. Subspecialisation is one approach (my choice) but even this may be inadequate. In many areas of medicine I simply do not believe it is possible to maintain acceptable clinical skills and be active in meaningful research.
Sam Shuster always drilled into me that there were only two reasons academics should see patients: to teach on them, and to foster their research. Academics are not there to provide ‘service’. Some juniors recognise this issue but are reticent about speaking openly about it. But chase the footfall, or lack of it, into clinical academic careers.
Terrific interview with Sydney Brenner about the second greatest scientific revolution of the 20th century.
I think it’s really hard to communicate that because I lived through the entire period from its very beginning, and it took on different forms as matters progressed. So it was, of course, wonderful. That’s what I tell students. The way to succeed is to get born at the right time and in the right place. If you can do that then you are bound to succeed. You have to be receptive and have some talent as well…
To have seen the development of a subject, which was looked upon with disdain by the establishment from the very start, actually become the basis of our whole approach to biology today. That is something that was worth living for.
This goes for more than science and stretches out into far more mundane aspects of life. Is there any alternative?
One of the mantras of psychometrics 101 is that you cannot have validity without reliability. People expel this phrase like others equilibrate after eating curry and naan breads with too much gassy beer. In truth, the Platonic obsession with reliability diminishes validity. The world of science, and much professional practice, remains messy and vague until it is ‘done’. The search space for those diamonds of sense and order remains infinite.
Many years in the making, DSM-5 appeared in 2013, to a chorus of criticism; Harrington summarises this crisply (Gary Greenberg’s 2013 Book of Woe gives a painful blow-by-blow account). Harrington suggests that the proliferating symptom categories ceased to carry conviction; in the USA, the leadership of the US National Institutes of Health pivoted away from the DSM approach—“100% reliability 0% validity”, as Harrington writes—stating they would only fund projects with clearly defined biological hypotheses. The big players in the pharmaceutical industry folded their tents and withdrew from the field, turning to more tractable targets, notably cancer. For some mental health problems, psychological therapies, such as cognitive behaviour therapy (CBT), are becoming more popular, sometimes in combination with pharmacotherapy; as Harrington points out, even as far back as the 1970s, trials had shown that CBT outperformed imipramine as a treatment for depression.
Biological psychiatry’s decline and fall | Anne Harrington, Mind Fixers: Psychiatry’s Troubled Search for the Biology of Mental Illness, W W Norton (2019), p. 384, US$ 27·95, ISBN: 9780393071221 – ScienceDirect
I used to use the phrase — with apologies to Freud — ‘eppendorf envy’ to describe the bias in much medical innovation whereby useful advance pretended it owed its magic to ‘basic’ science. Doctors wore white coats in order to sprinkle the laboratory magic on as a veneer. But I like this cognate term also: innovation theatre.
To be fair to the banks, they weren’t the first institutions to recognise the PR value of what Rich Turrin has dubbed innovation theatre. Many institutions before them had cottoned on to the fact that it was a way to score easy points with the public and investors. Think of high impact campaigns featuring “the science bit” for L’Oréal’s Elvive shampoo or Tefal appliance ads: “We have the technology because we have the brains”.
The financial sector has seen enough innovation theatre | Financial Times. The original reference is here.
You can dice the results in various ways, but software is indeed eating the world — and the clinic. The (slow) transition to this new world will be interesting and eventful. A good spectator sport for some of us. (Interesting to note that this study in Lancet Oncology received no specific funding. Hmmm).
The quote below was from a piece in the Lancet by Richard Horton.
Reading [Bertrand]Russell today is a resonant experience. Existential fears surround us. Yet today seems a long way from the dream of Enlightenment. Modern science is a brutally competitive affair. It is driven by incentives to acquire money (research funding), priority (journal publication), and glory (prizes and honours). Science’s metrics of success embed these motivations deep in transnational scientific cultures. At The Lancet, while we resist the idea that Impact Factors measure our achievements, we are not naive enough to believe that authors do not judge us by those same numbers. It is hard not to capitulate to a narrow range of indicators that has come to define success and failure. Science, once a powerful force to overturn orthodoxy, has created its own orthodoxies that diminish the possibility of creative thought and experiment. At this moment of planetary jeopardy, perhaps it is time to rethink and restate the purpose of science.
I am just musing on this. We like to think that ‘freedom’ was necessary for a modern wealthy state. We are not so certain, now. We used to think that certain freedoms of expression underpinned the scientific revolution. We are having doubts about this, too. Maybe it is possible to have atom bombs and live in a cesspool of immorality. Oops…
I have removed the name of the institution only because so many queue to sell their vapourware in this manner
Precision Medicine is a revolution in healthcare. Our world-leading biomedical researchers are at the forefront of this revolution, developing new early diagnostics and treatments for chronic diseases including cancer, cardiovascular disease, diabetes, arthritis and stroke. Partnering with XXXXX, the University of XXXX has driven … vision in Precision Medicine, including the development of a shitload of infrastructure to support imaging, molecular pathology and precision medicine clinical trials…… XXXXXX is now one of the foremost locations in a three mile radius to pursue advances in Precision Medicine.
And He declared to them, “It is written: ‘My house will be called a house of prayer. But you are making it ‘a den of robbers.'” Matthew 21:13
Mr Sammallahti is not a recluse, nor lacking in ambition. He travels the world taking photographs; a book, “Here Far Away”, was published in 2012; another, of bird pictures, comes out later this year. But he shuns the art scene, believing that commercial pressures undermine quality. He does not lecture and rarely gives interviews. In 1991 he received an unprecedented 20-year grant from the Finnish government. Its sole condition was that he should concentrate on photography, so he gave up teaching. “I want to work in peace,” he explains, “to be free to fail.”
Smith was supported by earnings from his professorship at Glasgow, where a university teacher’s earnings depended on fees collected directly from students in the class. This contrasted with Oxford, where Smith had spent six unhappy years, and where, he observed, the dons had mostly given up even the pretence of teaching.
But Smith relinquished his professorship in 1763, and the writing of ‘Wealth…’ and the remainder of his career was financed by the Duke of Buccleuch, who as a young man employed Smith as a tutor.
Digging deep into some of my old notes, I came across this obituary of John Ziman written by Jerry Ravetz. I know both through their written work and was lucky enough to meet and chat briefly with John Ziman not long before he died. Ziman’s book “Real Science” is for me the classic account of what has happened to science as it moved from a ‘way of life’ to a job.
Jerry Ravetz writes:
I first became aware of him through his 1960 radio talk Scientists – Gentlemen Or Players?, where he observed how a career in science was starting to change, from being a vocation to being a job.
There was a paradox running through his later career, to which he must have been sensitive. He was a “Renaissance man” in a way highly desirable for a scientist, but he did not exert the influence that he might have hoped to. This was due less to the passion he deployed in argument than the times in which he found himself. The age of such eminent scientist-savants as JBS Haldane, JD Bernal and Joseph Needham was passing, while a new generation of socially responsible scientists had yet to establish itself. Those who reminded scientists of their social responsibilities were viewed with suspicion; and those who had stopped doing research were treated as defectors.