The quotes below are from an article in the FT (a while back). They echo one of my rules, a rule that is more the exception that proves the rule. Just as “no good lab has space” (because the bench space will always be taken up, since many will want to work there), so when the grand new building arrives, the quality of the work will already be past its peak (because how else would you have justified your future except by looking back). It is all about edge people, and just as social change usually starts at the edge, so do good ideas.
The principle of benign neglect may well operate on a larger scale. Consider Building 20, one of the most celebrated structures at Massachusetts Institute of Technology. The product of wartime urgency, it was designed one afternoon in the spring of 1943, then hurriedly assembled out of plywood, breeze-blocks and asbestos. Fire regulations were waived in exchange for a promise that it would be pulled down within six months of the war’s end; in fact the building endured, dusty and uncomfortable, until 1998.
During that time, it played host not only to the radar researchers of Rad Lab (nine of whom won Nobel Prizes) but also to one of the first atomic clocks, one of the first particle accelerators, and one of the first anechoic chambers — possibly the one in which composer John Cage conceived 4′33″. Noam Chomsky revolutionised linguistics there. Harold Edgerton took his high-speed photographs of bullets hitting apples. The Bose Corporation emerged from Building 20; so did computing powerhouse DEC; so did the hacker movement, via the Tech Model Railroad Club.
Building 20 was a success because it was cheap, ugly and confusing. Researchers and departments with status would be placed in sparkling new buildings or grand old ones — places where people would protest if you nailed something to a door. In Building 20, all the grimy start-ups were thrown in to jostle each other, and they didn’t think twice about nailing something to a door — or, for that matter, about taking out a couple of floors, as Jerrold Zacharias did when installing the atomic clock.
Somewhat reminiscent of Stewart Brand’s ‘How Buildings Learn’.
In climate science, you can check out of the lab anytime you like, but you can never leave.
Dave Reay, University of Edinburgh, quoted in Nature this week.
This is from an article by Stephen Senn in Nature. He keeps making this point — for the very good reason that people want to pretend there is no problem. But there is.
Personalized medicine aims to match individuals with the therapy that is best suited to them and their condition. Advocates proclaim the potential of this approach to improve treatment outcomes by pointing to statistics about how most drugs — for conditions ranging from arthritis to heartburn — do not work for most people. That might or might not be true, but the statistics are being misinterpreted. There is no reason to think that a drug that shows itself to be marginally effective in a general population is simply in want of an appropriate subpopulation in which it will perform spectacularly.
When you treat patients with chronic diseases such as psoriasis, it quickly becomes clear that there is considerable within-person variation in response to treatments. We do not understand what this variation is due to. What we do know, however, is that reading differences between people at single time points as true between-person variation in response may be misleading, because we have no measure of the within-person variance. This is only one of the problems. But hey, precision, personalised… whatever: it shifts units (as Frank Zappa once said of Michael Jackson).
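Senn’s point is easy to demonstrate by simulation. Below is a minimal sketch (the effect size and noise level are invented for illustration): every simulated patient has exactly the same true response to treatment, yet a single measurement per patient still produces a spread of apparent ‘responders’ and ‘non-responders’ that is nothing but within-person noise.

```python
import numpy as np

rng = np.random.default_rng(42)

n_patients = 1000
true_effect = 5.0   # identical for every patient: no person-by-treatment interaction
within_sd = 10.0    # within-person (occasion-to-occasion) variability

# One post-treatment measurement per patient, as in a parallel-group trial
observed = true_effect + rng.normal(0.0, within_sd, n_patients)

# The naive reading: anyone below zero "did not respond to the drug"
print(f"Apparent non-responders: {np.mean(observed < 0):.0%}")  # ~31%

# Yet all the between-patient spread here is within-person variance;
# only repeated measurements per patient (e.g. n-of-1 designs) can
# separate the two variance components.
```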
Genome-wide study of hair colour in UK Biobank explains most of the SNP heritability.
Michael D. Morgan, Erola Pairo-Castineira, Konrad Rawlik, Oriol Canela-Xandri, Jonathan Rees, David Sims, Albert Tenesa & Ian J. Jackson
[Link to Nature Comm paper] https://doi.org/10.1038/s41467-018-07691-z
My guess is this is likely my last ‘research paper’ (although I now choose to redefine what counts as research). But not my last ‘thinking paper’. I cannot help but contrast the sheer volume of activity with that from our original papers on red hair. Things seemed so much simpler when we were young. But it is a nice coda to a career fugue.
This is from an editorial in the NEJM, discussing the results of a trial of a synthetic peanut antigen to facilitate tolerance. Previously the ‘raw’ stuff had been shown to be useful. The synthetic version will of course cost a lot, and might be considered IPR created through regulatory arbitrage.
AR101 and other, similar products such as CA002, which is being developed by the Cambridge group, would therefore appear to have a role in initial dose escalation. The potential market for these products is believed to be billions of dollars. It is perhaps salutary to consider that in the study conducted by the Cambridge group, children underwent desensitization with a bag of peanut flour costing peanuts.
Costing peanuts: I wish I had said that.
Leading universities should pledge to actually read the work of applicants for research positions rather than use controversial metrics during the selection process, a Nobel prizewinner has argued.
No, not a spoof, but words from Harold Varmus. Sydney Brenner, a good while back, observed that people tended not to read papers anymore, they just xeroxed them.
Modesty seems to be under negative selection — among modern scientists, at least. So I warmed to this comment on a report of some recent work on the genetics of Africa and hunter-gatherers.
But it’s plausible, adds Deepti Gurdasani, a genetic epidemiologist at the Wellcome Sanger Institute in Hinxton, UK: “There is literally nothing in Africa that is not possible since we have no idea what humans were doing on the continent 5,000 years ago.”
This is from an article in Nature.
Under pressure to turn out productive lab members quickly, many PhD programmes in the biomedical sciences have shortened their courses, squeezing out opportunities for putting research into its wider context. Consequently, most PhD curricula are unlikely to nurture the big thinkers and creative problem-solvers that society needs.
That means students are taught every detail of a microbe’s life cycle but little about the life scientific. They need to be taught to recognize how errors can occur. Trainees should evaluate case studies derived from flawed real research, or use interdisciplinary detective games to find logical fallacies in the literature. Above all, students must be shown the scientific process as it is — with its limitations and potential pitfalls as well as its fun side, such as serendipitous discoveries and hilarious blunders.
And from a letter in response:
My father designed stellar-inertial guidance systems for reconnaissance aircraft and, after he retired, would often present his work to physics and engineering students. When they asked him what they should study to prepare for such a career, he would reply: “Read the classics,” by which he meant Aristotle, Ralph Waldo Emerson, Jean-Jacques Rousseau and Blaise Pascal.
The best scientific and technical progress does not come out of a box. It is more likely to emerge from trying to fit wild, woolly and tangential ideas into useful societal and economic contexts.
As the historian Norman Davies once said:
“Since no one is judged competent to offer an opinion beyond their own particular mineshaft, beasts of prey have been left to prowl across the prairie unchecked.”
Or as the Economist once put it:
“…professors fixated on crawling along the frontiers of knowledge with a magnifying glass.”
This is the tragedy of our age: 90% right and 100% wrong. And that is even before we get to medicine.
Yet despite these innovations and those to come, quantitative risk prediction in medicine has been available for several decades, based on more classical statistical learning from more structured data sources. Despite reports that risk models outperform physicians in prognostic accuracy, application in actual clinical practice remains limited.
It seems unlikely that incremental improvements in discriminative performance of the kind typically demonstrated in machine learning research will ultimately drive a major shift in clinical care. In this Viewpoint, we describe 4 major barriers to useful risk prediction that may not be easily overcome by new methods in machine learning and, in some instances, may be more difficult to overcome in the era of big data.
The hype cycle marches on.
How is it that publishers can continue to make profits of 30–40%? How can Elsevier get away with charging, as described in the film, $10,702 for an annual subscription to Biomaterials? It’s partly that if you are a major research university you need access to all journals, not just some of them, says Richard Price of Academia.edu, a platform for academics to share research papers. It’s a question of moral hazard, explains Stuart Shieber, a Harvard professor of computer science: the consumers of the research, the academics, are not the people who have to pay. It’s the libraries who pay, and the academics remain insensitive to price…
In addition, publishers sell bundles of journals. It’s like cable television, you get a few things you do want along with a lot you don’t, explains one librarian. But unlike cable television you don’t know what others are paying—because publishers do secret deals with libraries.
Yes. But it speaks volumes about universities, too.
When working in Africa in the 1980s with my good friend Victor Pretorius, I heard a legend about an important tribe in Central Africa, the Masai. The legend claimed that a genius member of the tribe in the nineteenth century or earlier had the idea that cow’s urine was the safest fluid for washing cooking utensils. Compared with the previous practice of using far from clean river water, it avoided the dangers of dysentery and probably saved many lives. This simple and effective public heath practice was cast out by medical missionaries who had quite different ideas, more religious than medical, about what was clean and what was dirty. Neither the original genius, nor the missionaries, knew anything about the epidemiology of water-borne disease. Whether or not there is any substance to this legend, it has stayed in my mind as a metaphor appropriate for many of our problems today. Inventions such as Newcomen’s steam engine, Faraday’s electrical machines, and the idea that fresh urine is a sterile fluid, all came long before their scientific understanding.
James Lovelock, A Rough Ride to the Future. This is like so much of real discovery in clinical medicine, although the academy gets to write the history of how it is supposed to work.
For a baseline life expectancy of 80 years:
Well, these are all taken from John Ioannidis’ article in JAMA. He asks: “Could these results possibly be true?”
The great financial crash led to some (but not enough) soul-searching about the state of academic economics and, in turn, the academy. Whole swathes of the modern research university are geared to the production of unreliable knowledge. There is money in it. Without wishing to understate in any way Ioannidis’ major contributions, we have known that there are fundamental methodological flaws in much of observational epidemiology for a long time (for instance, see the late Alvan Feinstein’s article in Science). A must read.
(The Challenge of Reforming Nutritional Epidemiologic Research. John P. A. Ioannidis. JAMA. Published online August 23, 2018. doi:10.1001/jama.2018.11025)
From an obituary of Paul Boyer.
“Paul Boyer was approaching the finish line of his career when he risked everything with a jaw-dropping proposal. He addressed one of the most important, then-unanswered questions in biochemistry.”
“We were attending a UCLA seminar in 1972 when I noticed that he wasn’t paying attention to the speaker. Afterwards, Paul approached us in a very excited state. This was surprising because he was known for his calm demeanour. He confessed that he had spent the hour thinking about old unexplained data. He asked: “What would you say if I told you that it doesn’t take energy to make ATP at the catalytic site of ATP synthase,” (as was universally held at the time) “but rather that it takes energy to get ATP off the catalytic site?” This was a eureka moment.”
As is often the case with transformational ideas, early reactions were negative. When the Journal of Biological Chemistry rejected our manuscript containing data supporting this concept, Boyer told me without animosity that he could see why they would do that — “It was a very striking claim.”
Well, I have never had an idea to compare with this. But I have always found sitting through talks that do not light my fire conducive to thinking creatively about something else. It’s similar to the way that some writers practise their craft better in a coffee shop than in a silent office. Intellectual white noise.
Remember: the best ideas are not in the literature. If they were…
Besides, “university league tables are like sausages: the more you know about how they are made, the less you want to [do with] them”.
“Research was structurally unprofitable even if you scored really well in the research excellence framework,” he claims. “It’s being financed by surpluses on taught master’s. I think that’s fine because part of the reason people came on the taught programmes was because the place was very highly ranked in research, and they thought they were going to be sitting at the feet of the best economists around. Academics had to understand the dynamic and deliver the teaching because that was what was paying for the research. Yet because of the history of underfunding [undergraduate] students [before the introduction of £9,000 tuition fees in 2012], a kind of mood gained ground in British universities that [all] students were an unprofitable activity.
Science 21 June 2013: 1394-1399.
For most alumni, university fundraising may seem to be uncoordinated and lacking in focus—an assortment of phone calls, solicitous letters, and invitations to a class reunion. But for Steven Rum, it’s a science. And the goal is to carry out more research.
Rum is senior vice president for development and chief fundraiser for Johns Hopkins Medicine in Baltimore, Maryland. Last year, his team had a banner year, raising $318 million. Their approach places the physician scientists at Hopkins on the donor front lines. The goal is to turn the positive feelings of “grateful patients” into support for new research, faculty chairs, academic scholarships, bricks and mortar, or simply defraying the cost of running a multibillion-dollar medical center.
Rum has 65 full-time fundraisers on a staff of 165. Each one is responsible for meeting weekly with physicians—their “caseloads” range from a dozen to more than 30 docs—to discuss which of their patients might be potential donors. The conversation is designed to help them identify what Rum calls a donor’s “qualifying interest” and connect it to their “capacity,” that is, the ability to make a donation.
More often than not, Rum’s team finds that sweet spot…
“Ideally, I’d like to have one gift officer manage no more than six doctors,” he says.
Article in Nature. I largely agree, although my views are as much based on the hype-upon-hype that characterises so much of medical research, especially cancer. I do not have a reference, but whatever one’s views about the late David Horrobin, his Lancet article about cancer trials — written when he was dying from lymphoma — is worth a read. What a mess!
Key quotes from this article:
In 2017, my colleagues and I completed a study of all 48 cancer drugs approved by the European Medicines Agency between 2009 and 2013 (C. Davis et al. Br. Med. J. 359, j4530; 2017). Of the 68 clinical indications for these drugs (reasons to use a particular drug on a patient), only 24 (35%) demonstrated evidence of a survival benefit at the time of approval. Even fewer provided evidence of an improved quality of life for symptoms such as pain, tiredness and loss of appetite (7 trials; 10%). Most indications (36 of 68) still lacked such evidence three or more years after approval. Other groups in other regions have observed similar trends. For example, a 2015 study demonstrated that only a small proportion of cancer drugs approved by the FDA improved survival or quality of life (C. Kim and V. Prasad JAMA Intern. Med. 175, 1992–1994; 2015).
But the key point he makes is:
I believe that the low bar also undermines innovation and wastes money.
When assessments — whether in medicine or education — are flawed, the loss in value is not in short-term financial costs, but in what might have happened 10 years down the road.
There are periodic evaluations, but a poor result means losing only a fraction of your funding, says Schuman, who previously held one of the plum positions in U.S. science: as an investigator funded by the Howard Hughes Medical Institute on a 5-year contract. “I did not realize how the renewal clock of 5 years dissuaded me from going for risky ideas until I became a Max Planck director,” she says.
Cue David Bowie.
The following is an excerpt from a review in press with Acta. You can see the full article with DOI 10.2340/00015555-2916 here
From the solar constant to thong bikinis and all stops in between.
A review of: “Sun Protection: A risk management approach.” Brian Diffey. IOP Publishing, Bristol, UK. ISBN 978-0-7503-1377-3 (ebook) ISBN 978-0-7503-1378-0 (print) ISBN 978-0-7503-1379-7 (mobi)
Leo Szilard was one of the half a dozen or so physical scientists who, having attended the same Budapest gymnasium, revolutionised twentieth-century physics. In 1934, whilst working in London, he realised that if one neutron hit an atom which then released two further neutrons, a chain reaction might ensue. Fearing the consequences, he tried to keep the discovery secret by assigning the patent to the British Admiralty. In 1939, he authored the letter that Einstein signed, warning the then US President of the coming impact of nuclear weapons.
After the war, in revulsion at the uses to which his physics had been applied, he swapped physics for biology. There was a drawback, however. Szilard liked to think in a hot bath, and he liked to think a lot. Once his interests had turned to biology he remarked that he could no longer enjoy a long uninterrupted bath — he was forever having to leave his bath, to check some factual detail (before returning to think some more). Biology seemed to lack the deep simplifying foundations of the Queen of Sciences.
But all of that media can’t really replace the socializing, networking, and simply fun that happened as part of (or sometimes despite) the conference formula.
I don’t know how to fix conferences, but the first place I’d start on that whiteboard is by getting rid of all of the talks, then trying to find different ways to bring people together — and far more of them than before.
I no longer go to many conferences, and that is a good thing. But fixing them is a problem, not least because many academic conferences are businesses that collect money that supports other activities. This is not always bad, but is often not good. ‘Getting rid of the talks’ is of course attractive. Leo Szilard once suggested that you should stand up, briefly report your conclusions, then sit down. Only if the audience were sceptical of your results would you have to speak for longer. As for size, there is no single right size. However, the best conferences I have ever attended were all small, with fewer than 40 people. But I wouldn’t have got to these small ones unless I had gone to the big ones.
The surge in open-access predatory journals is making it harder for contributors and readers to distinguish these from legitimate publications — a confusion that is fostered by the predatory-journal industry. One solution could be to deploy a variant of a well-established quality-control test. The scientific community could submit replicate test articles several times a year to a wide array of open-access journals, suspect and non-suspect.
From Steven N Goodman who, as ever, is worth reading. Of course, in one sense, it is a question of serial monogamy, or polygamy.
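For what it is worth, the scoring end of such an audit is trivial. A minimal sketch (the journal names, counts and the crude Wald interval are my own choices for illustration): treat each journal’s decisions on deliberately flawed test articles as Bernoulli trials and estimate its acceptance rate, predatory outlets being the ones that wave almost anything through.

```python
from math import sqrt

def acceptance_rate(accepted: int, submitted: int, z: float = 1.96):
    """Point estimate and (crude) Wald 95% CI for a journal's
    acceptance rate on deliberately flawed test articles."""
    p = accepted / submitted
    half = z * sqrt(p * (1 - p) / submitted)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical audit results: (accepted, submitted) per journal
audit = {
    "Journal A": (9, 10),  # accepts almost anything: likely predatory
    "Journal B": (1, 10),  # rejects flawed work: behaving as claimed
}

for journal, (acc, sub) in audit.items():
    p, lo, hi = acceptance_rate(acc, sub)
    print(f"{journal}: {p:.0%} accepted (95% CI {lo:.0%}-{hi:.0%})")
```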
After earning his medical degree in 1951 he trained in hospitals in Montreal. “To my surprise I also found I enjoyed clinical medicine,” he wrote in his Nobel prize biography. Then he quipped, “It took three years of hospital training after graduation, a year of internship and two of residency in neurology, before that interest finally wore off.”
This is from the obituary of Ben Barres [link]
An utterly committed researcher, Professor Barres would regularly work until 2am or 3am. He “slept on the floor of my small office”, recalled Professor Raff. “Every morning when I arrived and opened the door, it would whack him in the head – he eventually learned to sleep facing the opposite direction.”
Somewhere, I cannot remember where, after one of his seminars, Ben Barres’s intellectual depth was judged superior to that of his ‘sister’. His sister was his former ‘self’, Barbara Barres. Such a neat experimental design to tease apart causality.
I too worked somewhere where people slept overnight in the lab, although I think the deciding factor there was an inability to find or pay for a suitable flat, rather than enthusiasm.
David Hubel, on statistics: “We could hardly get excited about an effect so feeble as to require statistics for its demonstration.”
I came across this (below) in my end of year clear out. And even if this was 2016, rather than 2017, it is as good a thought to open 2018 with as any other. It is from a review of “Life’s Greatest Secret: The Race to Crack the Genetic Code”, by Matthew Cobb. The review is by H. Allen Orr in the NYRB.
Finally, and perhaps most important, Life’s Greatest Secret highlights the power of the beautiful experiment in science. Though Cobb pays less attention to this subject than he might have, the period of scientific history that he surveys was the golden age of the beautiful experiment in biology. Biologists of the time—including Nirenberg with his UUU, Crick and Brenner with their triplet code work, and others including Matthew Meselson, Franklin Stahl, and Joshua Lederberg—were masters of the sort of experiment that, through some breathtakingly simple manipulation, allowed a decisive or nearly decisive solution to what previously seemed a hopelessly complex problem. Such experiments represent a species of intellectual art that is little appreciated outside a narrow circle of scientists…
But the larger lesson of Life’s Greatest Secret is one that may be worth remembering. When scientists require definitive answers, not merely suggestive patterns, they require experiments that are decisive and, if all goes well, beautiful.
There is something about teaching that makes you a better researcher. I know this is very countercultural wisdom, but I believed it all along. Luria, Magasanik, and Levinthal all believed it. Levinthal and Luria both had a very strong influence on me in this regard.
An (old) interview with David Botstein, in PLoS Genetics. Link
At least we are spared the ‘research led teaching’ mission statements.
“For example, I studied Physics, so I learned about how physicists think… and it is not how most people think. They have these tricks which turn difficult problems into far easier problems. The main lesson I took away from Physics is that you can often take an impossibly hard problem and simply represent it differently. By doing so, you turn something that would take forever to solve into something that is accessible to smart teenagers.”
But the opposite is now much more common. I think there are whole swathes of modern institutional and corporate life that are designed to make the simple complicated. At best, simple may sometimes be wrong, but complicated is usually useless — or much worse. I seem to remember Paul Janssen, when asked why we do not seem to be able to discover revolutionary new drugs like we once did, responding: ‘in those days the idea of obviousness still existed’.
Yann LeCun of Facebook wrote:
In the history of science and technology, the engineering artefacts have almost always preceded the theoretical understanding: the lens and the telescope preceded optics theory, the steam engine preceded thermodynamics, the airplane preceded flight aerodynamics, radio and data communication preceded information theory, the computer preceded computer science.
This is so true for (much) medicine, too. The journal comes after the discovery.
Take salary: as Mrs. Neal told us during her crash course, you’ll carry your whole life the compound price of an un-negotiated first salary.
From Frederic Filloux in the Monday Note. A great article which, whilst focussed on the topic of journalism schools, has bags of relevance to future and therefore present-day medical schools. The professional schools have a lot in common.
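Filloux’s compounding point is easy to make concrete. A minimal sketch (the salaries, raise and career length are invented): when every raise is a percentage of the current salary, the gap from an un-negotiated first salary is carried, and amplified, for an entire career.

```python
def career_earnings(start: float, annual_raise: float = 0.03, years: int = 40) -> float:
    """Total pay over a career in which each year's salary is the
    previous year's plus a fixed percentage raise."""
    total, salary = 0.0, start
    for _ in range(years):
        total += salary
        salary *= 1 + annual_raise
    return total

negotiated = career_earnings(55_000)   # asked for 5,000 more on day one
meek = career_earnings(50_000)
print(f"Lifetime cost of not negotiating: {negotiated - meek:,.0f}")  # ~377,000
```

A one-off 5,000 difference ends up costing some 75 times itself, because the gap is re-applied by every subsequent percentage raise.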
This was a quote from an article by an ex-lawyer who got into tech and writing about tech. Now some of my best friends are lawyers, but this chimed with something I came across by Benedict Evans on ‘why you must pay sales people commissions’. The article is here (the video no longer plays for me).
The opening quote poses a question:
I felt a little odd writing that title [ why you must pay sales people commissions]. It’s a little like asking “Why should you give engineers big monitors?” If you have to ask the question, then you probably won’t understand the answer. The short answer is: don’t, if you don’t want good engineers to work for you; and if they still do, they’ll be less productive. The same is true for sales people and commissions.
The argument is as follows:
Imagine that you are a great sales person who knows you can sell $10M worth of product in a year. Company A pays commissions and, if you do what you know you can do, you will earn $1M/year. Company B refuses to pay commissions for “cultural reasons” and offers $200K/year. Which job would you take? Now imagine that you are a horrible sales person who would be lucky to sell anything and will get fired in a performance-based commission culture, but may survive in a low-pressure, non-commission culture. Which job would you take?
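The self-selection can be written out directly. A minimal sketch using the figures above (the 10% commission rate is my inference from the $10M-to-$1M ratio in the quote):

```python
def pay(sales: float, base: float, commission_rate: float) -> float:
    """Annual pay under a given base salary and commission rate."""
    return base + commission_rate * sales

for seller, sales in [("great", 10_000_000), ("poor", 500_000)]:
    a = pay(sales, base=0, commission_rate=0.10)       # Company A: commission
    b = pay(sales, base=200_000, commission_rate=0.0)  # Company B: flat salary
    choice = "A" if a > b else "B"
    print(f"{seller} seller: A pays {a:,.0f}, B pays {b:,.0f} -> picks {choice}")

# The pay scheme sorts applicants before any interview happens: the great
# seller takes the commission job, the poor one the 'cultural' flat salary.
```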
But the key message for me is:
Speaking of culture, why should the sales culture be different from the engineering culture? To understand that, ask yourself the following: Do your engineers like programming? Might they even do a little programming on the side sometimes for fun? Great. I guarantee your sales people never sell enterprise software for fun. [emphasis mine].
Now why does all this matter? Well personally, it still matters a bit, but it matters less and less. I am towards the end of my career, and for the most part I have loved what I have done. Sure, the NHS is increasingly a nightmare place to work, but it has been in decline most of my life: I would not recommend it unreservedly to anybody. But I have loved my work in a university. Research was so much fun for so long, and the ability to think about how we teach and how we should teach still gives me enormous pleasure: it is, to use the cliche, still what I think about in the shower. The very idea of work-life balance was — when I was young and middle-aged at least — anathema. I viewed my job as a creative one, and building things and making things brought great pleasure. This did not mean that you had to work all the hours God made, although I often did. But it did mean that work brought so much pleasure that the boundary between my inner life and what I got paid to do was more apparent to others than to me. And in large part that is still true.
Now in one sense, this whole question matters less and less to me personally. In the clinical area, many if not most clinicians I know now feel that they resemble those on commission more than the engineers. Only they don’t get commission. Most of my med school year who became GPs will have bailed out. And I do not envy the working lives of those who follow me in many other medical specialties in hospital. Similarly, universities were once full of academics who you almost didn’t need to pay, such was their love for the job. But modern universities have become more closed and centrally managed, and less tolerant of independence of mind.
In one sense, this might go with the turf — I was 60 last week. Some introspection, perhaps. But I think there really is more going on. I think we will see more and more people bailing out as early as possible (no personal plans here), and we will need to think and plan for the fact that many of our students will bail out of the front line of medical practice earlier than we are used to. I think you see the early stirrings of this all over: people want to work less than full-time; people limit their NHS work vis-à-vis private work; some seek administrative roles in order to minimise their face-to-face practice; and even young medics soon after graduation are looking for portfolio careers. And we need to think about how to educate our graduates for this: our obligations are to our students first and foremost.
I do not think any of these responses are necessarily bad. But working primarily in higher education has one advantage: there are lots of different institutions, and whilst in the UK there is a large degree of groupthink, there is still some diversity of approach. And if you are smart and you fall outwith the clinical guilds / extortion rackets, there is no reason to stay in the UK. Medics, and recent graduates in particular, need to think more strategically. The central dilemma is that, depending on your specialty, your only choice might appear to be to work for a monopolist, one which seeks to control not so much the patients cradle-to-grave, but those staff who fall under its spell, cradle-to-grave. But there are those making other choices — just not enough, so far.
An aside. Of course, even those who have achieved the most in research do not always want to work for nothing, post retirement. I heard the following account first hand from one of Fred Sanger’s former post-docs. The onetime post-doc was now a senior Professor, charged with opening and celebrating a new research institution. Sanger — a double Laureate — would be a great catch as a speaker. All seemed well until the man who personally created much of modern biology realised the date chosen was a couple of days after he was due to retire from the LMB. He could not oblige: the [garden] roses need me more!
No, I could not run, lead or guide a pharma company. Tough business. But I just hope their executives do not really believe most of what they say. So, in an FT article, the new chief executive of Novartis, Vas Narasimhan, has vowed to slash drug development costs: there will be a productivity revolution; 10–25 per cent could be cut from the cost of trials if digital technology were used to carry them out more efficiently.
And of course:
(he) plans to partner with, or acquire, artificial intelligence and data analytics companies, to supplement Novartis’s strong but “scattered” data science capability.
I really think of our future as a medicines and data science company, centred on innovation and access (read that again, parenthesis mine)
And to add insult to injury:
Dr Narasimhan cites one inspiration as a visit to Disney World with his young children where he saw how efficiently people were moved around the park, constantly monitored by “an army of Massachusetts Institute of Technology-trained data scientists”.
And not that I am a lover of Excel…
No longer rely on Excel spreadsheets and PowerPoint slides, but instead “bring up a screen that has a predictive algorithm that in real time is recalculating what is the likelihood our trials enrol, what is the quality of our clinical trials”
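For the record, the unglamorous core of such a ‘real-time recalculating’ algorithm fits in a dozen lines. A minimal sketch, assuming a Poisson accrual model with a conjugate Gamma prior (all numbers invented; real trial forecasting adds per-site rates and much else):

```python
import numpy as np

def prob_target_met(enrolled: int, days_elapsed: int, target: int,
                    days_total: int, prior_rate: float = 2.0,
                    prior_weight: float = 10.0, n_sims: int = 10_000) -> float:
    """Bayesian update of the daily enrolment rate (Gamma-Poisson),
    then the Monte Carlo probability of reaching target on time."""
    rng = np.random.default_rng(0)
    # Conjugate update: Gamma(shape, rate) prior + observed Poisson counts
    shape = prior_rate * prior_weight + enrolled
    rate = prior_weight + days_elapsed
    remaining = days_total - days_elapsed
    lam = rng.gamma(shape, 1.0 / rate, n_sims)  # plausible daily rates
    future = rng.poisson(lam * remaining)       # simulated future accrual
    return float(np.mean(enrolled + future >= target))

# Re-run after every new enrolment record: that is the 'real time' part.
print(prob_target_met(enrolled=130, days_elapsed=60, target=400, days_total=200))  # ~0.8
```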
Just recall that in 2000 it would have been genes / genetics / genomics rather than data / analytics / AI / ML etc.
So, looks to me like lots of cost cutting and optimisation. No place for a James Black, then.