Research

The search space is always bigger…

by reestheskin on 02/07/2020

Defining the appropriate probability space is often a non-trivial bit of statistics. It is often the point at which you have to leave statistics and formal reasoning behind. The following quote puts this in a more bracing manner.

There are no lobby groups for companies that do not exist.

The same goes for research and so much of what makes the future captivating.

On being an eternal student.

by reestheskin on 22/06/2020

Freeman Dyson died February 28th this year. There are many obituaries of this great mind and eternal rebel. His book, Disturbing the Universe, is for me one of a handful that gets the fundamental nature of discovery in science and how science interacts with other modes of being human. His intellectual bravery and honesty shine through most of his writings. John Naughton had a nice quote from him a short while back.

Some mathematicians are birds, others are frogs. Birds fly high in the air and survey broad vistas of mathematics out to the far horizon. They delight in concepts that unify our thinking and bring together diverse problems from different parts of the landscape. Frogs live in the mud below and see only the flowers that grow nearby. They delight in the details of particular objects, and they solve problems one at a time. I happen to be a frog, but many of my best friends are birds. The main theme of my talk tonight is this. Mathematics needs both birds and frogs.

In truth he was both frog and albatross. Here are some words from his obituary in PNAS.

During the Second World War, Dyson worked as a civilian scientist for the Royal Air Force’s Bomber Command, an experience that made him a life-long pacifist. In 1941, as an undergraduate at Trinity College, Cambridge, United Kingdom, he found an intellectual role model in the famed mathematician G. H. Hardy, who shared two ideas that came to define Dyson’s trajectory: “A mathematician, like a painter or a poet, is a maker of patterns,” and “Young men should prove theorems; old men should write books.”

Heeding the advice of his undergraduate mentor, Dyson returned to his first love of writing. He became well-known to a wide audience by his books Disturbing the Universe (1979) (1) and Infinite in All Directions (1988) (2), and his many beautiful essays for The New Yorker and The New York Review of Books. In 2018, he published his autobiography, Maker of Patterns (3), largely composed of letters that he sent to his parents from an early age on.

And as for us eternal students, at least I have one thing in common with him.

…Dyson never obtained an official doctorate of philosophy. As an eternal graduate student, a “rebel” in his own words, Dyson was unafraid to question everything and everybody. It is not surprising that his young colleagues inspired him the most.

Freeman J. Dyson 1923–2020: Legendary physicist, writer, and fearless intellectual explorer | PNAS

The beginning is where it's at.

by reestheskin on 12/06/2020

The best way to foster mediocrity is to found a Center for Excellence.

This is a quote from a comment by DrOFnothing on a good article by Rich DeMillo a few years back. It reminds me of my observation that shiny new research buildings often mean that the quality (but maybe not the volume) of research will deteriorate. This is just intellectual regression to the mean. You get the funding for the new building based on the trajectory of those who were in the old building — but with a delay. Scale, consistency and originality have a troubled relationship. Just compare the early flowerings of jazz-rock fusion (below) with the technically masterful but ultimately sterile stuff that came later.

Prose

by reestheskin on 09/06/2020

The Nobel laureate David Hubel commented somewhere that reading most modern scientific papers was like chewing sawdust. Certainly it is rare nowadays to see the naked honesty of Watson and Crick's classic opening paragraphs, or the melody not being drowned out by the metrical percussion.

We wish to suggest a structure for the salt of deoxyribose nucleic acid (D.N.A.). This structure has novel features which are of considerable biological interest [emphasis added].

A structure for nucleic acid has already been proposed by Pauling and Corey [1]. They kindly made their manuscript available to us in advance of publication. Their model consists of three intertwined chains, with the phosphates near the fibre axis, and the bases on the outside. In our opinion, this structure is unsatisfactory for two reasons: (1) We believe that the material which gives the X-ray diagrams is the salt, not the free acid. Without the acidic hydrogen atoms it is not clear what forces would hold the structure together, especially as the negatively charged phosphates near the axis will repel each other. (2) Some of the van der Waals distances appear to be too small.

And then there is that immortal understated penultimate paragraph.

It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.

Well, here is another one that impresses me, even if I can claim no expertise in this domain. It is from the prestigious journal Physical Review, is 27 words in length, with one number, one equation and one reference.

[Image of the paper]

Via Fermat's Library @fermatslibrary

The slow now

by reestheskin on 18/05/2020

Education is intellectual infrastructure.  So is science.  They have very high yield, but delayed payback.  Hasty societies that can’t span those delays will lose out over time to societies that can.  On the other hand, cultures too hidebound to allow education to advance at infrastructural pace also lose out.

Stewart Brand.

Would a medal do?

by reestheskin on 22/04/2020

Interesting article from a final-year PhD student in Bristol. She writes:

Around one week before lockdown, Public Health England sent a message to UK universities; it needed their help to find PhD students, postdocs and other researchers to carry out diagnostic testing in London.

Despite the urgency of the call, the email didn’t mention pay or whether researchers should have permission from their grant funders to up and leave lab projects. It also omitted any details on accommodation or travel support for those of us living outside the capital…Then, on 2 April, we received another email, apparently from Public Health England (PHE), which was circulated to everyone in our faculty calling on us to join a “scientific reserve to support regional Covid-19 testing operations”.

The email cautioned that the work would be hard, and would require 'five or seven day on/off shift patterns with long shifts'. Again, no mention of whether funders approved. Are the companies that provide testing, or the reagents for testing, getting paid, I wonder? She speculates as to whether the government will be generous to her and others like her in the coming economic crisis.

My assumption is probably not: it will ask us to get ourselves in debt to the tune of tens of thousands of pounds to get the skills the country needs, but not pay us to work once we have them.

Well said.

Dan Greenberg RIP

by reestheskin on 17/04/2020

Daniel S. Greenberg (1931–2020) has died. Nice obituary about him and why he mattered in this week’s Science.

Daniel S. Greenberg (1931–2020) | Science

At the time, the idea of a journalist-written section in a publication devoted to publishing research papers was highly unusual, and so was the approach that Dan and his team took. They covered basic research policy in much the same way a business reporter would cover development of economic policy: as a set of competing interests…[emphasis added].

However, it was not greeted with universal enthusiasm. In a preface to the second edition, Dan noted that it sparked “reactions that flowed from the belief that the scientific community should be exempt from the types of journalistic inquiries that are commonplace to other segments of our society.” He called that attitude “nonsense.”

Following your interest ‘du jour’

by reestheskin on 10/04/2020

Many years ago I acted as the host of a visiting US dermatologist. He was due to give lectures in Edinburgh and elsewhere in the UK. I knew of him and, as I remember things, had spoken with him previously on a few occasions at US meetings. I also knew his research and, although it was in an area of skin research a long way from my own professional interests, I admired it. He had taken a group of skin diseases (blistering aka bullous diseases) and explained at the molecular level what they all had in common. He had pulled back the veil, and shown the underlying unity of things that up until then had appeared different. I consider his work a beautiful example of clinical science.

I spent the day with him showing him around Edinburgh. We talked science and, as is often the case, the 'meta' of science: how is it done, how is it funded, what is good, and of course how the proposed work had been turned down for funding ('no track record') etc.

At that time he was at NIH in the US. He stated that the great advantage of NIH — as he saw it then — was that if something bugged you at 9am you could spend the rest of the day (or week) thinking and reading about it. He wasn't seeing patients, he wasn't doing any formal teaching, and admin was minimal. But he was insistent that blisters were 'his problem'. If nobody would fund his interest in blisters he would do something else: move to a university or a full-time clinical role. But he wasn't going to work on any other problem.

I know he did move several years later (to become Chair at an Ivy League school) although I do not know the exact reasons why. I wouldn't be surprised if kids going to college made him review his finances.

But the idea of spending your day in 'your thoughts', reading and asking questions, appeals to some lifelong academics. Retirement has its pleasures (it is just the pay cheque that is missing).

Five year plans

by reestheskin on 02/04/2020

Bruce Charlton pointed me to this entry in Wikipedia on Seymour Cray:

One story has it that when Cray was asked by management to provide detailed one-year and five-year plans for his next machine, he simply wrote, "Five-year goal: Build the biggest computer in the world. One year goal: One-fifth of the above." And another time, when expected to write a multi-page detailed status report for the company executives, Cray's two-sentence report read: "Activity is progressing satisfactorily as outlined under the June plan. There have been no significant changes or deviations from the June plan."

Which brings to mind Sydney Brenner's comment that requests for research grant funding will eventually resemble flow diagrams recording who reports to whom.

All (insert name) careers end in failure

by reestheskin on 27/03/2020

We are living in dark times, and since I have been sifting through the ashes of a career, it is no surprise that failures signal through like radioactive tracers. Below is one.

Through most of my career I have been interested in the relation between science and medicine. In truth, if what matters is what you think about in the shower, I have been more interested in the relation between science and medicine than in either activity in isolation. If I were to use a phrase to describe my focus, although it is a term that I would not have used then, it would be the epistemological foundations of medical practice. Pompous, I agree. I could use another phrase: what makes medicine and doctors useful? Thinking about statistical inference is a part of this topic, but there is much more to explore.

These issues became closer to my consciousness soon after I moved to Edinburgh. My ideas about what was going on were not shared by many locally, and I was nervous about going public in person rather than in print at a Symposium hosted by the Royal College of Physicians of Edinburgh. My nervousness was well founded: whilst I liked my abstract, my talk went down badly, not least because it was truly dreadful (and the evident failure still rankles). Jan Vandenbroucke, one of the other speakers and somebody whose work I greatly admire (his paper in the Lancet, Homoeopathy trials: Going nowhere [Lancet 1997;350:824], was to me the most important paper published in the Lancet in the 1990s), said some kind words to me afterwards, muttering that I had tried to say far too much to an audience that was ill prepared for my speculations. All true, but he was just being kind. It was worse than that.

Anyway, some tidying up deep in my hard drive surfaced the abstract. I still like it, but it is a shame that at the appropriate time I was unable to explain why.

JAMES LIND SYMPOSIUM: From scurvy to systematic reviews and clinical guidelines: how can clinical research lead to better patient care? (31-10-2003, RCPE Edinburgh)

Guidelines, Automata, Science and Algorithms

There are three great branches of science: theory, experiment, and computation. (Nick Trefethen)

Advance in the mid-third of the twentieth century, the golden age of medical research, was predicated on earlier discoveries in the nineteenth century in both physiology and medicinal chemistry (1). Genetics dominated biology in the latter third of the twentieth century and many believe changes in medical practice will owe much to genetics over the next third-century (1). I disagree, and I will give an alternative view more credence: in 30 years' time we will look back more to von Neumann and Morgenstern than we will to Watson and Crick. What the Nobel laureate Herbert Simon referred to as The Sciences of the Artificial (2), subjects which have largely been peripheral to medicine, will become central.

Over the last 20 years we have seen the first (largely inadequate, I would add) attempts to explicitly demarcate methods of obtaining and promulgating knowledge about clinical practice (3,4). This has usually taken the form of proselytising a particular set of terms – systematic reviews, evidence-based practice, guidelines and the like – terms that have little rigour and little else to commend them. What is interesting, however, is that they reflect a long overdue renaissance of interest in the practice of medicine and medical epistemology.

The change of emphasis from the natural to the artificial is being driven by a number of forces, mostly extraneous to biomedicine: the increasing instrumental role of science in medicine and society; the increase in corporatisation of knowledge, whether by private corporations or monopsonistic institutions like the NHS (5); the rising costs of healthcare; and a remaining inability to frame questions with broad support about how to choose between alternative disease states at the level of society (6,7).

I will try to illustrate some of these issues by the use of three examples. First, the widespread use of a mode of statistical inference largely ill-suited to medicine, namely Neyman-Pearson hypothesis testing (decision-making), and the way in which this paradigm has been used to undermine expert opinion (8). Second, I will argue that we need to think much harder about clinical practice and fashion a more appropriate theoretical underpinning for clinical behaviour. Third, I will suggest how UK medical schools, in so far as they remain interested in clinical practice, should look to alternative models, perhaps business and law schools, for ideas of how they should operate (2).

  1. Rees J. Complex disease and the new clinical sciences. Science 2002;296:698-700.
  2. Simon HA. The sciences of the artificial. Cambridge, MA: MIT Press; 1969.
  3. Rees J. Evidence-based medicine: the epistemology that isn't. J Am Acad Dermatol 2000;43:727-9.
  4. Rees J. Two cultures? J Am Acad Dermatol 2002;46:313-16.
  5. Hacking I. The emergence of probability: a philosophical study of early ideas about probability, induction and statistical inference. Cambridge: Cambridge University Press; 1975.
  6. Ziman J. Real science. Cambridge: Cambridge University Press; 2000.
  7. Ziman J. Non-instrumental roles of science. Sci Eng Ethics 2003;9:17-27.
  8. Gigerenzer G, Swijtink Z, Porter T et al. The empire of chance: how probability changed science and everyday life. Cambridge: Cambridge University Press; 1989.

Afterword. The symposium used structured abstracts, a habit that might have a place somewhere in this galaxy, but out of choice I would prefer to live in another one. Anyway, in the published version, it reads:

  • Methods: Not submitted
  • Results: Not submitted
  • Conclusions: Not submitted

A fair cop.

On the hazards of a good methodology

by reestheskin on 24/03/2020

Alfred North Whitehead: “Some of the major disasters of mankind have been produced by the narrowness of men with a good methodology” (The Function of Reason).

Comments that seem germane to some of our current-day Covid-19 debates.

’Scuse Me while I Kiss the Sky

by reestheskin on 23/03/2020

’Scuse Me While I Kiss the Sky | by A.J. Lees | The New York Review of Books

People are always demanding that medical students must learn this or that (obesity, psychiatry, dermatology, ID, eating disorders). The result is curriculum overload, a default in favour of rote learning by many students, and the inhibition of curiosity. It was not meant to be like this, but the GMC, the NHS, and others have pushed a vision of university medical education that shortchanges both the students and medical practice over the long term. Short-termism rules. Instead of producing graduates who are ready to learn clinical medicine in an area of their choice, we expect them to somehow come out oven-ready at graduation. I do not believe it is possible to do this to the level of safety that many other professions demand, nor is this the primary job of a university. Sadly, universities have given up on arguing, intimidated by the government and their regulatory commissars, and nervous of losing their monopoly on producing doctors.

But I will make a plea that one area really does deserve more attention within a university: the history of how medical advance occurs. No, I do not mean MCQs asking for the date of birth of Robert Koch or Lord Lister, but a feel for the historical interplay of convention and novelty. Without this our students and our graduates are almost confined to living in the present, unaware of the past, and unable to imagine how different the future will be. Below is one example.

In 1938 Albert Hofmann, a chemist at the Sandoz Laboratories in Basel, created a series of new compounds from lysergic acid. One of them, later marketed as Hydergine, showed great potential for the treatment of cerebral arteriosclerosis. Another salt, the diethylamide (LSD), he put to one side, but he had "a peculiar presentiment," as he put it in his memoir LSD: My Problem Child (1980), "that this substance could possess properties other than those established in the first investigations."

In 1943 he prepared a fresh batch of LSD. In the final process of its crystallization, he started to experience strange sensations. He described his first inadvertent “trip” in a letter to his supervisor:

At home I lay down and sank into a not unpleasant, intoxicated-like condition, characterized by extremely stimulated imagination. In a dream-like state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors.

After eliminating chloroform fumes as a possible cause, he concluded that a tiny quantity of LSD absorbed through the skin of his fingertips must have been responsible. Three days later he began a program of unsanctioned research and deliberately ingested 250 micrograms of LSD at 4:20 PM. Forty minutes later, he wrote in his lab journal, “Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.” He set off home on his bicycle, accompanied by his laboratory assistant. This formal trial of what Hofmann considered a minute dose of LSD had more distressing effects than his first chance exposure:

Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa…. I was taken to another world, another place, another time.

A doctor was summoned but found nothing amiss apart from a marked dilation of his pupils. A fear of impending death gradually faded as the drug’s effect lessened, and after some hours Hofmann was seeing surreal colors and enjoying the play of shapes before his eyes.

Lees writes:

Many editors of learned medical journals now automatically turn down publications describing the sort of scientific investigation that Albert Hofmann carried out on himself. Institutional review boards are often scathing in their criticism of self-experimentation, despite its hallowed tradition in medicine, because they consider it subjective and biased. But the human desire to alter consciousness and enrich self-awareness shows no sign of receding, and someone must always go first. As long as care and diligence accompany the sort of personal research conducted by Pollan and Lin, it has the potential to be as revealing and informative as any work on psychedelic drugs conducted within the rigid confines of universities.

No old (nor broken) records here

by reestheskin on 21/03/2020

Richard Horton in the Lancet writes:

Imagine if the entire edifice of knowledge in medicine was built upon a falsehood. Systematic reviews are said to be the highest standard of evidence-based health care. Regularly updated to ensure that treatment decisions are built on the most up-to-date and reliable science, systematic reviews and meta-analyses are widely used to inform clinical guidelines and decision making. Powerful organisations have emerged to construct a knowledge base in medicine underpinned by the results of systematic reviews. One such organisation is Cochrane, with 11 000 members in over 130 countries. This extraordinary movement of people is justifiably passionate about the idea that it is contributing to better health outcomes for everyone, everywhere. The industry that drives the production of systematic reviews today is financed by some of the most influential agencies in medical research. Cochrane, for example, points to three funders providing over £1 million each—the UK’s National Institute for Health Research (NIHR), the US National Institutes of Health (NIH), and Australia’s National Health and Medical Research Council (NHMRC).

Well, it really is a bit late for all this soul searching. See my earlier post here 'Mega-silliness' (commenting on what others had already pointed out); or my Evidence Based Medicine: the Epistemology That Isn't, written over 20 years ago; and my contribution to the wake (even if I didn't put my hand in my pocket), Why we should let "evidence-based medicine" rest in peace. The genesis of EBM was as a cult whose foundational myth was that P values could act as a truth machine. Those followers who had originally hoped for a place in the promised afterlife soon settled for paying the bills, and EBM morphed into a career opportunity for those who found accountancy too daring. So, pace John Mayall on Jazz Blues Fusion, don't come here to listen to an old record. I promise.

Offline: The gravy train of systematic reviews – The Lancet

Statistical significance and clinical evidence

Two letters in Lancet Oncology. This bloody story never ends. We have not invented truth machines: judgement has never been exiled from discovery.

Zombies.

Mining gold in them teeth

by reestheskin on 18/03/2020

Stanley Cohen has died. A special place for those of us hooked on the ectoderm. Some nice comments about him in the Lancet from Geoff Watts.

A May, 1962, issue of the Journal of Biological Chemistry included a deceptively arcane study on the isolation of a protein that could accelerate incisor eruption and eyelid opening in newborn mice. The author, Stanley Cohen, later to become Professor Emeritus of Biochemistry at Vanderbilt University School of Medicine (VUSM) in Nashville, TN, USA, had named his protein “tooth-lid factor”. Cohen’s subsequent studies would not only lead him to rename the protein epidermal growth factor (EGF), but also mark him out as one of the founders of a new area of biology and eventually win him a Nobel Prize.

[says Lawrence Marnett], "When he came here he began studying some growth factors in animal cell extracts. One was of mouse submaxillary gland…It had peptides in it, and when he injected them into newborn mice their teeth broke through earlier than normal, and their eyelids opened sooner." Cohen's subsequent studies revealed that his extract worked by stimulating the growth of epidermal cells. Having consequently renamed the material EGF, he devoted the rest of his career to studying it. "He went on to identify the EGF receptor and define target cells that would respond to EGF", recalls Graham Carpenter, Emeritus Professor of Biochemistry at VUSM, who joined Cohen's lab in 1973 and worked with him on EGF as a postdoctoral fellow. The EGF receptor proved to be a useful target for drugs, and Cohen's discoveries opened the door to research on diseases ranging from dementia to cancer. "He understood EGF's biological importance", says Carpenter. "But we did not have any idea that this would extend to cancer biology in a major way."

And as for that most successful of all biology labs, the style of exploration is familiar.

[Graham Carpenter] “In contrast to today, his research group was very small, seldom more than four people—himself, two technicians, and a postdoc…He was central to whatever was going on in the lab.” [Lawrence] Marnett also recalls that determination: “He was one of those guys that was just driven by his desire to understand how things work…It was a classic example of making an observation and then drilling down to try to understand it, not knowing what you’re going to find.” And at that time there was plenty to be found. Cohen, as Marnett puts it, was basically “mining gold”.

We too needed cash (then)

by reestheskin on 11/03/2020

Terrific article on Covid-19 (SARS-CoV-2) in the LRB by Rupert Beale. He says it was written in haste, but it doesn't read that way. It contains some memorable lines.

As the US health secretary Michael Leavitt put it in 2006, ‘anything we say in advance of a pandemic happening is alarmist; anything we say afterwards is inadequate.’

And how do you think hard about research funding for the long term? (I am old enough to remember when stroke and dementia were virtually non-subjects as far as 'good research funding' was concerned.)

Virologists need more than clever tricks: we also need cash. Twenty years ago, funding wasn’t available to study coronaviruses. In 1999, avian infectious bronchitis virus was the one known truly nasty coronavirus pathogen. Only poultry farmers really cared about it, as it kills chickens but doesn’t infect people. In humans there are a number of fairly innocuous coronaviruses, such as OC43 and HKU1, which cause the ‘common cold’. Doctors don’t usually bother testing for them – you have a runny nose, so what?

And note the conditional tense:

The global case fatality rate is above 3 per cent at the moment, and if – reasonable worst case scenario – 30-70 per cent of the 7.8 billion people on earth are infected, that means between 70 and 165 million deaths. It would be the worst disaster in human history in terms of total lives lost. Nobody expects this, because everyone expects that people will comply with efficient public health measures put in place by responsible governments.
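
The arithmetic behind that range is easy to check (a quick back-of-envelope calculation of my own, taking the quoted figures at face value):

$$7.8 \times 10^{9} \times 0.30 \times 0.03 \approx 70 \text{ million}, \qquad 7.8 \times 10^{9} \times 0.70 \times 0.03 \approx 164 \text{ million}.$$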

And to repeat my own mantra (stolen from elsewhere): the opposite of science is not art, but politics:

The situation isn’t helped by a president [Trump] who keeps suggesting that the virus isn’t that bad, it’s a bit like flu, we will have a vaccine soon: stopping flights from China was enough. Tony Fauci, the director of the National Institute of Allergy and Infectious Disease, deftly cut across Trump at a White House press briefing. No, it isn’t only as bad as flu, it’s far more dangerous. Yes, public health measures will have to be put in place and maintained for many months. No, a vaccine isn’t just around the corner, it will take at least 18 months. Fauci was then ordered to clear all his press briefings on Covid-19 with Mike Pence in advance: the vice president’s office is leading the US response to the virus. ‘You don’t want to go to war with a president,’ Fauci remarked.

And Beale ends by quoting an ID colleague.

This is not business as usual. This will be different from what anyone living has ever experienced. The closest comparator is 1918 influenza.

Caution: pace the author, ‘This is a fast-moving situation, and the numbers are constantly changing – certainly the ones I have given here will be out of date by the time you read this.’

Link. (London Review of Books: Vol. 42 No. 5, 5 March 2020: "Wash your Hands": Rupert Beale)

Tidying up

by reestheskin on 04/03/2020

I have spent a lot of time recently sifting through the detritus of a career. Finally — well, I hope, finally — I have managed to sort out my books. All neatly indexed in Delicious Library, and now for once the virtual location mirrors the physical location. For how long I do not know. Since I often buy books based on reviews, I used to put a copy of the review in with the book (a habit I have dropped but need to restart). I rediscovered this one by David Colquhoun (DC) reviewing 'The Diet Delusion' by Gary Taubes in the BMJ (with the unexpurgated text on his own web site).

I am a big fan of DC as he has lived through the rise and decline of much higher education in the UK. And he remains fearless and honest, qualities that are not always at the forefront of the modern university. Quoting the great Robert Merton he writes:

“The organization of science operates as a system of institutionalized vigilance, involving competitive cooperation. In such a system, scientists are at the ready to pick apart and assess each new claim to knowledge. This unending exchange of critical appraisal, of praise and punishment, is developed in science to a degree that makes the monitoring of children’s behavior by their parents seem little more than child’s play”.

He adds:

“The institutionalized vigilance, “this unending exchange of critical judgment”, is nowhere to be found in the study of nutrition, chronic disease, and obesity, and it hasn’t been for decades.”

On Taubes and his (excellent) book:

It took Taubes five years to write this book, and he has nothing to sell apart from his ideas. No wonder it is so much better than a scientist can produce. Such is the corruption of science by the cult of managerialism that no university would allow you to spend five years on a book.

(As would be expected, the BMJ omitted the punch line — they would, wouldn't they?)

There is also a neat quote from Taubes in one of the comments on DC’s page from Beth@IDblog, one that I will try hard not to forget:

Taubes makes a point at the end of the Dartmouth medical grand rounds video that I think is important: “I’m not trying to convince you that it’s true, I’m trying to convince you that it should be taken seriously.”

The thrill is gone

by reestheskin on 31/01/2020

Today is my last day of (paid) work, and of course a day that will be infamous for many more people for other more important reasons. Europe and my professional life have been intertwined for near on 40 years. In the mid 1980s I went to start my dermatological career in Vienna. I had been a student at Newcastle and done junior jobs there, as well as some research on skin physiology with Sam Shuster as an undergraduate student. Sam rightly thought I should now move somewhere else — see how others did things before returning — and he suggested Paris, or Vienna under Klaus Wolff. Vienna was, and perhaps still is, the centre of the dermatological universe, and has been since the mid 19th century. Now, I haven't got very far into this post, but it is a day for nostalgia, so allow me an aside: the German Literature Problem.

The German Literature

As I have hinted at above, in many ways there have only been two schools of dermatology: the French school, and the German school. The latter has been dominant. Throughout the second half of the 19th century dermatology was a 'German speaking' subject. To follow it you would be wise to know German, and better still to have visited the big centres in Germany, Switzerland or Austria. And as with much of the modern research university, German medicine and science was the blueprint for the US and then belatedly — and with typos — for England (Scotland, reasonably, had taken a slightly different path).

All of the above I knew, but when I returned to Newcastle after my first sojourn away (a later one was to Strasbourg), I naturally picked up on all these allusions to the German literature, but they were accompanied by sniggering from those who had been around longer than me. Indeed there seemed to be a 'German Literature Problem'. Unbeknown to me, Sam had written 'das Problem' up in 'World Medicine', but World Medicine had been killed off by those from Mordor, so here is a synopsis.

The German literature seemed so vast that whenever somebody described a patient with what they were convinced must be a 'new syndrome', some bright spark would say that it had already been described, and that it was to be found in the German literature. Now the synoptic Handbuch der Hautkrankheiten on our shelves in the library in Newcastle ran to over 10 weighty volumes. And that was just the start. But of course only German speaking dermatologists (and we had one) could meaningfully engage in this conversation. Dermatology is enough of a nightmare even in your own mother tongue. Up to the middle of the 20th century, however, there were indeed separate literatures in German, French and English (in the 1960s the newly formed ESDR had to sort out what language was going to be used for its presentations).

Sam's sense of play now took over (with apologies to Wilde: nothing succeeds like excess). It appeared that all of dermatology had already been previously described, but more worryingly for the researchers, the same might be true of skin science. In his article in World Medicine he set out to describe his meta-science investigation into this strange phenomenon. Sam has an unrivalled ability to take simple abstract objects — a few straight lines, a circle, a square — and meld them into an argument in the form of an Escher print. A print that you know is both real, unreal and illegal. Imagine a dastardly difficult 5 x 5 Rubik's cube (such as the one my colleagues recently bought me for my retirement). You move and move and move the individual facets, then check each whole face in turn. All aligned, problem solved. But then you look in the mirror: whilst the faces are all perfect in your own hands, that is not what is apparent in the mirror. This is my metaphor for Sam's explanation. Make it any more explicit, and the German literature problem rears its head. It's real — of a sort. Anyway, this was all in the future (which didn't exist at that time), so let's get back to Vienna.

Night train to Wien

Having left general medical jobs behind in Newcastle, armed with my BBC language tapes and guides, I spent a month travelling through Germany from north to south. I stayed with a handful of German medical students who I had taught in Newcastle when I was a medical registrar (a small number of such students used to spend a year with us in Newcastle). Our roles were now reversed: they were now my teachers. At the end of the month I caught the night train in Ulm, arriving in Vienna early one morning.

Vienna was majestic — stiff collared, yes — but you felt in the heart of Europe. A bit of Paris, some of Berlin and the feel of what lay further east: "Wien ist die erste Balkanstadt" (Vienna is the first Balkan city). For me, it was unmistakably and wonderfully foreign.

It was of course great for music, too. No, I couldn't afford the New Year's Day Concerts, but there were cheap seats at the Staatsoper, more modest prices at the Volksoper, and, more to my taste, some European jazz and rock music. I saw Ultravox sing — yes, what else — "Vienna" in Vienna. I saw some people from the ECM label (e.g. Eberhard Weber), a style of European jazz music that has stayed with me since my mid teens. And then there was the man (for me) behind 'The Thrill is Gone'.

The Thrill

I saw BB King on a double bill with Miles Davis at the Stadthalle. Two very different styles of musician. I was more into Miles Davis then, but he was not then at his best (as medics in Vienna found out). I was, however, very familiar with the 'Kings' (BB, Freddie, Albert etc) after being introduced to them via their English interpreters. Clapton's blues tone on 'All Your Love' with John Mayall's Bluesbreakers still makes the hairs on my neck stand up (fraternal thanks to 'Big Al' for the introduction).

The YouTube video at the top of the page is wonderful (Montreux 1993), but there is a later one below, taken from Crossroads in 2010, which moves me even more. He is older, playing with a bunch of craftsmen, but all still pupils before the master.

But — I am getting there — germane to my melancholia on this day is a video featuring BB King and John Mayer. Now there is a trope that there are two groups of people who like John Mayer: girlfriends, and guitarists who understand just how bloody good he is. As EC pointed out, the problem with John Mayer is that he realises just how good he is. True.

But the banter at the beginning of the video speaks some eternal truths about craft, expertise, and the onward march of all culture — including science. Mayer plays a few BB King licks, teasing King that he is 'stealing' them. He continues: it was as though he was 'stealing silverware from somebody's house right in front of them'. King replies: "You keep that up and I'm going to get up and go". Both know it doesn't work that way. Whatever the provenance of the phrase 'great artists steal, not copy', except in the most trivial way you cannot steal or copy culture: people discover it in themselves by stealing what masters show them might be there in their pupils. Teachers just help people find what they suspect or hope is there. The baton gets handed on. The thrill goes on. And on.

On one word after another

by reestheskin on 28/01/2020

Well, I doubt if any readers of these scribblings will be shocked. After all, TIJABP. But this piece by the editor of PNAS wonders if the day of meaningful editing is over. I hope not. Looking back over my several hundred papers, the American Journal of Human Genetics was the most rigorous and did the most to improve our manuscripts.

On a subject no one wants to read about (about which no one wants to read?) | PNAS

"Communication" remains in the vocabulary of scientific publishing—for example, as a category of manuscript ("Rapid Communications") and as an element of a journal name (Nature Communications)—not as a vestigial remnant but as a vital part of the enterprise. The goal of communicating effectively is also why grammar, with its arcane, baffling, or even irritating "rules," continues to matter. With the rise of digital publishing, attendant demands for economy and immediacy have diminished the role of copyeditor. The demands are particularly acute in journalism. As The New York Times editorial board member Lawrence Downes (4) lamented, "…in that world of the perpetual present tense—post it now, fix it later, update constantly—old-time, persnickety editing may be a luxury…. It will be an artisanal product, like monastery honey and wooden yachts." Scientific publishing is catching up to journalism in this regard.

One thing I won’t miss in retirement

by reestheskin on 27/01/2020

Being a renowned scientist doesn’t ensure success. On the same day that molecular biologist Carol Greider won a Nobel prize in 2009, she learnt that her recently submitted grant proposal had been rejected. “Even on the day when you win the Nobel prize,” she said in a 2017 graduation speech at Cold Spring Harbor Laboratory in New York, “sceptics may question whether you really know what you’re doing.”

Secrets to writing a winning grant

The itch that rashes

by reestheskin on 15/01/2020

My earliest conscious memory of disease and doctors was in the management of my atopic dermatitis. Here is Sam Shuster writing poetically about atopic dermatitis in ‘World Medicine’ in 1983.

A dozen years of agony; years of sleeplessness for child and parents, years of weeping, itching, scaling skin, the look and feel of which is detested.

The poverty of our treatments is made all the worse by the unfair raising of expectations: I don’t mean by obvious folk remedies; I mean medical folk remedies like the recent pseudoscientific dietary treatments which eliminate irrelevant allergens. There neither is nor ever was good evidence for a dietary mechanism. And as for cows’ milk, I would willingly drown its proponents in it. We have nothing fundamental for the misery of atopic eczema and that’s why I would like to see a real treatment—not one of those media breakthroughs, and not another of those hope raising nonsenses like breast-feeding: I mean a real and monstrously effective treatment. Not one of your P<.05 drugs the effect of which can only be seen if you keep your eyes firmly double-blind, I mean a straightforward here today and gone tomorrow job, an Aladdin’s salve—put it on and you have a new skin for old.

Nothing would please me more in the practice of clinical dermatology than never again to see a child tearing its skin to shreds and not knowing how long it will be before it all stops, if indeed it does.

Things are indeed better now, but not as much as we need: we still don't understand the itch, nor can we easily block the neural pathways involved. Nor has anything replaced 'World Medicine' after its untimely murder. A glass of milk has never looked the same since, either.

You don’t have to be mad to read this

by reestheskin on 13/01/2020

There is an interesting review in the Economist of 'The Great Pretender: The Undercover Mission that Changed Our Understanding of Madness', written by Susannah Cahalan. The book is the story of the American psychologist David Rosenhan who "recruited seven volunteers to join him in feigning mental illness, to expose what he called the 'undoubtedly counter-therapeutic' culture of his country's psychiatry".

Rosenhan's studies are well known and were influential, and some might argue that they may have had a beneficial effect on subsequent patient care. The question is whether they were true. The review states:

"in the end Rosenhan emerges as an unpalatable symptom of a wider academic malaise".

As for the ‘malaise’, the reviewer goes on:

Many of psychology’s most famous experiments have recently been discredited or devalued, the author notes. Immense significance has been attached to Stanley Milgram’s shock tests and Philip Zimbardo’s Stanford prison experiment, yet later re-runs have failed to reproduce their findings. As Ms Cahalan laments, the feverish reports on the undermining of such theories are a gift to people who would like to discredit science itself.

I have a few disjointed thoughts on this. There are plenty of other considered critiques of the excesses of modern medical psychiatry. Anthony Clare's 'Psychiatry in Dissent' was for me the best introduction to psychiatry. And Stuart Sutherland's 'Breakdown' was a blistering and highly readable attack on medical (in)competence as much as the subject itself (Sutherland was a leading experimental psychologist, and his account is autobiographical). And might the cross-country diagnostic criteria studies not have happened without Rosenhan's work?

As for undermining science (see the quote above), I think unreliable medical science is widespread, and possibly there is more of it than in many past periods. Simple repetition of experiments is important but not sufficient, and betrays a lack of understanding of why some science is so powerful.

Science owes its success to its social organisation: conjectures and refutations, to use Popper's terms, within a community. Just repeating an experiment under identical conditions is not sufficient. Rather, you need to use the results of one experiment to inform the next and, with the accumulation of new results, build a larger and larger edifice which, whilst having greater explanatory power, is more and more intolerant of errors at any level. Building large structures out of Lego only works because of the precision engineering of each of the component bricks. But any errors only become apparent when you add brick on brick. When a single investigator or group of investigators has skin in the game during this process — and where experimentation is possible — science is at its strongest (the critiques can of course come from anywhere).

An alternative process is when the results of a series of experiments are so precise and robust that everyday life confirms them: the lights go on when I click the switch. This harks back to the reporting of science as ‘demonstrations’.

By these two standards much medical science may be unreliable. First, because the fragmentation of enquiry discourages the creation of broad explanatory theories or tests of the underlying hypotheses. The 'testing' is more about whether a publishable unit can be achieved than whether nature is understood. Second, in many RCTs or technology assessments there is little theoretical framework on which to challenge nature. Nor can everyday practice act as the necessary feedback loop in the way the tight temporal relationship between flipping the switch and seeing the light turn on can.

Knowing more than we can see

by reestheskin on 05/12/2019

I spent near on ten years thinking about automated skin cancer detection. There are various approaches you might use — cyborg human/machine hybrids were my personal favourite — but we settled on more standard machine learning approaches. Conceptually what you need is straightforward: data to learn from, and ways to leverage the historical data against future examples. The following quote is apposite.

One is that, for all the advances in machine learning, machines are still not very good at learning. Most humans need a few dozen hours to master driving. Waymo’s cars have had over 10m miles of practice, and still fall short. And once humans have learned to drive, even on the easy streets of Phoenix, they can, with a little effort, apply that knowledge anywhere, rapidly learning to adapt their skills to rush-hour Bangkok or a gravel-track in rural Greece.

Driverless cars are stuck in a jam – Autonomous vehicles

You see exactly the same thing with skin cancer. With a relatively small number of examples, you can train (human) novices to be much better than most doctors. By contrast, with the machines you need literally hundreds of thousands of examples. Even when you start with large databases, as you parse the diagnostic groups, you quickly find out that for many 'types' you have only a few examples to learn from. The rate limiting factor becomes acquiring mega-databases cheaply. The best way to do this is to change data acquisition from a 'research task' to a matter of grabbing data that was collected routinely for other purposes (there is a lot of money in digital waste — ask Google).
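
To make the scale problem concrete, here is a minimal sketch in Python (the diagnosis names and counts are entirely hypothetical, purely for illustration) of the tally you end up doing on any large skin-lesion database: a few common diagnoses dominate, and most 'types' are left with too few examples for a machine to learn from.

```python
from collections import Counter

# Hypothetical labels for a skin-lesion database: a few common
# diagnoses dominate, while most "types" appear only rarely.
labels = (
    ["benign naevus"] * 60000
    + ["seborrhoeic keratosis"] * 20000
    + ["basal cell carcinoma"] * 8000
    + ["melanoma"] * 2500
    + ["merkel cell carcinoma"] * 40
    + ["atypical fibroxanthoma"] * 12
)

counts = Counter(labels)
total = sum(counts.values())

# Print each diagnosis with its share of the database: the long tail
# is obvious, and the rare classes give you almost nothing to train on.
for diagnosis, n in counts.most_common():
    print(f"{diagnosis:24s} {n:6d}  ({100 * n / total:.2f}%)")
```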

Noam Chomsky had a few statements germane to this and much else that gets in the way of such goals (1).

Plato's problem: How can we know so much when the evidence is so slight?

Orwell's problem: How do we remain so ignorant when the evidence is so overwhelming?

(1) Neil Smith, Noam Chomsky: Ideas and Ideals, Cambridge University Press, 1999.

Downward mobility for all!

by reestheskin on 02/12/2019

Obituaries are a source of much joy and enlightenment. None more so than those in the Economist. Last week's was devoted to the '60s photographer Terry O'Neill (you can see some of his iconic images here).

Stars had been his subject since 1962, when he was sent to photograph a new band at the Abbey Road Studios. The older blokes at the Sketch scorned that kind of work, but the young were clearly on the rise, and he was by far the youngest photographer in Fleet Street at the time. At the studios, to get a better light, he took the group outside to snap them holding their guitars a bit defensively: John, Paul, George and Ringo. Next day’s Sketch was sold out, and he suddenly found himself with the run of London and all the coming bands, free to be as creative as he liked. A working-class kid from Romford whose prospects had been either the priesthood or a job in the Dagenham car plant, like his dad, had the world at his feet. He wouldn’t have had a prayer, he thought, in any other era.

And obviously it couldn’t last. In a couple of years he would find a proper job, as both the Beatles and the Stones told him they were going to. For it was hardly serious work to point your Leica at someone and go snap, snap.

Obituary: Terry O’Neill died on November 16th – Catching the moment

The reason I found this particularly interesting is the way social mobility appeared to work and the way it was tied to genuine innovation and social change. I have always loved the trope that when jobs are plentiful, and your commitments minimal, you can literally tell the boss to FO on a Friday and start another job on the Monday. Best of all, you can experiment, and experiment lifts all. This to me is one of the best 1960s rock n' roll stories.

If you lift your head above the parapet in universities you come across various conventional wisdoms. One relates to 'mental wellbeing' or 'mental issues', and another is the value of education in increasing social mobility. My problem is that in both cases there seem (to me at least) many important questions that remain unanswered. For the former, are we talking about mental illness (as in disease) or something else? How robust is the data — aside from self-reporting? The widely reported comments from the former President of the Royal College of Psychiatrists receive no answer (at least not in my institution). An example: I have sat in a meeting in which one justification for 'lecture capture' (recording of live lectures) was to assist students with 'mental health issues'. But does it help in this context? Do we trust self-reflection in this area? Under what conditions do we think it helps or harms?

Enhancing life chances and social mobility is yet another area that I find difficult. I picked up on a comment from Martin Wolf in the FT:

We also believe that changing individual characteristics, principally via education, will increase social mobility. But this is largely untrue. We need to be far more honest.

He was referring to the work of John Goldthorpe in Oxford. Digging just a little beneath the surface made me realise that much of what I had believed may not be true. Goldthorpe writes:

However, a significant change has occurred in that while earlier, in what has become known as the golden age of mobility, social ascent predominated over social descent, the experience of upward mobility is now becoming less common and that of downward mobility more common. In this sense, young people today face less favourable mobility prospects than did their parents or their grandparents.

This research indicates that the only recent change of note is that the rising rates of upward, absolute mobility of the middle decades of the last century have levelled out. Relative rates have remained more or less constant back to the interwar years. According to this alternative view, what can be achieved through education, whether in regard to absolute or relative mobility, appears limited.

[Journal of Social Policy 2013;42(3):431–450. doi:10.1017/S004727941300024X]

There is a witty exchange in Prospect between the journalist (JD) and Goldthorpe (JG).

JD: Would you say that this is something that politicians, in particular, tend not to grasp?

JG: Yes. Tony Blair, for instance, was totally confused about this distinction [between absolute and relative rates of mobility]. He couldn't see that the only way you can have more upward mobility in a relative perspective is if you have more downward mobility at the same time. I remember being in a discussion in the Cabinet Office when Geoff Mulgan was one of Blair's leading advisors. It took a long time to get across to Mulgan the distinction between absolute and relative rates, but in the end he got it. His response was: "The Prime Minister can't go to the country on the promise of downward mobility!"

On both these topics I am conflicted. And on both these topics there are the tools that characterize scholarly inquiry to help guide action: this is what universities should be about. I am however left with a strong suspicion that few are interested in digging deep, rather we choose sound bites over understanding. Working in a university often feels like the university must be somewhere else. That is the optimistic version.

Not only God plays dice

There is an article this week in Nature about how some funders are explicitly funding grant proposals randomly (lotteries). The cynic might say they have been doing this for a long time.

Dark matters.

by reestheskin on 21/11/2019

Frank Davidoff had a telling phrase about clinical expertise. He likened it to "Dark Matter". Dark Matter makes up most of the universe, but we know very little about it. In the clinical arena I have spent a lot of time reading and thinking about 'expertise', without developing any grand unifying themes of my own worth sharing. But we live in a world where 'expertise' in many domains is under assault, and I have no wise thoughts to pull together what is happening. I do however like (as ever) some nice phrases from Paul Graham. I can't see any roadmap here, just perspectives and shadows.

When experts are wrong, it’s often because they’re experts on an earlier version of the world.

Instead of trying to point yourself in the right direction, admit you have no idea what the right direction is, and try instead to be super sensitive to the winds of change.

How to Be an Expert in a Changing World

Medicine is just one technology

by reestheskin on 08/11/2019

From Wikipedia.

Putt's Law: "Technology is dominated by two types of people, those who understand what they do not manage and those who manage what they do not understand."

Putt's Corollary: "Every technical hierarchy, in time, develops a competence inversion," with incompetence being "flushed out of the lower levels" of a technocratic hierarchy, ensuring that technically competent people remain directly in charge of the actual technology while those without technical competence move into management.

Putt’s Law and the Successful Technocrat – Wikipedia

Strangers to ourselves (and student learning).

by reestheskin on 04/11/2019

The quote below is from a paper in PNAS on how students misjudge their learning and what strategies maximise learning. The findings are not surprising (IMHO) but will, I guess, continue to be overlooked (NSS anybody?). As I mention below, it is the general point that concerns me.

Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom.

In this report, we identify an inherent student bias against active learning that can limit its effectiveness and may hinder the wide adoption of these methods. Compared with students in traditional lectures, students in active classes perceived that they learned less, while in reality they learned more. Students rated the quality of instruction in passive lectures more highly, and they expressed a preference to have "all of their physics classes taught this way," even though their scores on independent tests of learning were lower than those in actively taught classrooms. These findings are consistent with the observations that novices in a subject are poor judges of their own competence (27–29), and the cognitive fluency of lectures can be misleading (30, 31). Our findings also suggest that novice students may not accurately assess the changes in their own learning that follow from their experience in a class.

Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom | PNAS

The authors go on:

These results also suggest that student evaluations of teaching should be used with caution as they rely on students’ perceptions of learning and could inadvertently favor inferior passive teaching methods over research-based active pedagogical approaches….

As I say above, it is the general rather than the particular that concerns me. Experience and feeling are often poor guides to action. We are, after all, creatures that represent biology’s attempt to see whether contemplation can triumph over reflex. There remains a fundamental asymmetry between expert and novice, and if there isn’t, there is little worth learning (or indeed worth paying for).

  • The title is from Timothy Wilson’s book, Strangers to Ourselves, a good place to start on this topic. However, this whole topic is of much more importance than just how we teach students. Lessons for medicine too.

The art of the insoluble

by reestheskin on 30/10/2019

The following is from an advert for a clinical academic post in a surgical specialty, one with significant on-call responsibilities. (It is not from Edinburgh.)

‘you will be able to define, develop, and establish a high quality patient-centred research programme’

‘in addition to the above, you will be expected to raise substantial research income and deliver excellent research outputs’

Leaving aside the debasement of language, I simply cannot believe such jobs are viable long term. Many years ago, I was looked after by a surgical academic. A few years later he/she moved to another centre, and I was puzzled as to why he/she had made this career move. I queried an NHS surgeon in the same hospital about this career path. "Bad outcomes" was the response. He/she needed a clean start somewhere else…

Traditional non-clinical academic careers include research, teaching and administration. Increasingly it is recognised that it is rarely possible to do all three well. For clinical academics the situation is worse, as 50% of your time is supposed to be devoted to providing patient care. Over time the NHS workload has become more onerous in that consultants enjoy less support from junior doctors, and NHS hospitals have become much less efficient.

All sorts of legitimate questions can be asked about the relation between expertise and how much of your time is devoted to that particular role. For craft specialities — and I would include dermatology, pathology, radiology in this category — there may be ways to stay competent. Subspecialisation is one approach (my choice) but even this may be inadequate. In many areas of medicine I simply do not believe it is possible to maintain acceptable clinical skills and be active in meaningful research.

Sam Shuster always drilled into me that there were only two reasons academics should see patients: to teach on them, and to foster their research. Academics are not there to provide 'service'. Some juniors recognise this issue but are reluctant to speak openly about it. But chase the footfall, or lack of it, into clinical academic careers.

A time for everything

by reestheskin on 22/10/2019

Terrific interview with Sydney Brenner about the second greatest scientific revolution of the 20th century.

I think it’s really hard to communicate that because I lived through the entire period from its very beginning, and it took on different forms as matters progressed. So it was, of course, wonderful. That’s what I tell students. The way to succeed is to get born at the right time and in the right place. If you can do that then you are bound to succeed. You have to be receptive and have some talent as well…

To have seen the development of a subject, which was looked upon with disdain by the establishment from the very start, actually become the basis of our whole approach to biology today. That is something that was worth living for.

This goes for more than science and stretches out into far more mundane aspects of life. Is there any alternative?

Kings Review