The picture that changed everything. A nice piece in Nature tells the story. (Image: NASA)
In climate science, you can check out of the lab anytime you like, but you can never leave.
Dave Reay, University of Edinburgh, quoted in Nature this week.
This is from an interview with Geoffrey Hinton who — to paraphrase Peter Medawar’s comments about Jim Watson — has something to be clever about. The article is worth reading in full, but here are a few snippets.
Now if you send in a paper that has a radically new idea, there’s no chance in hell it will get accepted, because it’s going to get some junior reviewer who doesn’t understand it. Or it’s going to get a senior reviewer who’s trying to review too many papers and doesn’t understand it first time round and assumes it must be nonsense. Anything that makes the brain hurt is not going to get accepted. And I think that’s really bad…
What we should be going for, particularly in the basic science conferences, is radically new ideas. Because we know a radically new idea in the long run is going to be much more influential than a tiny improvement. That’s I think the main downside of the fact that we’ve got this inversion now, where you’ve got a few senior guys and a gazillion young guys.
I would make a few comments:
All has been said before, I know, but no apology will be forthcoming.
Genome-wide study of hair colour in UK Biobank explains most of the SNP heritability.
Michael D. Morgan, Erola Pairo-Castineira, Konrad Rawlik, Oriol Canela-Xandri, Jonathan Rees, David Sims, Albert Tenesa & Ian J. Jackson
Nature Communications: https://doi.org/10.1038/s41467-018-07691-z
My guess is that this is my last ‘research paper’ (although I now choose to redefine what counts as research). But not my last ‘thinking paper’. I cannot help but contrast the sheer volume of activity with that of our original papers on red hair. Things seemed so much simpler when we were young. But it is a nice coda to a career fugue.
Leading universities should pledge to actually read the work of applicants for research positions rather than use controversial metrics during the selection process, a Nobel prizewinner has argued.
No, not a spoof, but words from Harold Varmus. Sydney Brenner, a good while back, observed that people tended not to read papers anymore; they just xeroxed them.
Modesty seems to be under negative selection — among modern scientists, at least. So I warmed to this comment on a report of some recent work on the genetics of Africa and hunter-gatherers.
Deepti Gurdasani, a genetic epidemiologist at the Wellcome Sanger Institute in Hinxton, UK. But it’s plausible, she adds. “There is literally nothing in Africa that is not possible since we have no idea what humans were doing on the continent 5,000 years ago.”
This is from an article in Nature.
Under pressure to turn out productive lab members quickly, many PhD programmes in the biomedical sciences have shortened their courses, squeezing out opportunities for putting research into its wider context. Consequently, most PhD curricula are unlikely to nurture the big thinkers and creative problem-solvers that society needs.
That means students are taught every detail of a microbe’s life cycle but little about the life scientific. They need to be taught to recognize how errors can occur. Trainees should evaluate case studies derived from flawed real research, or use interdisciplinary detective games to find logical fallacies in the literature. Above all, students must be shown the scientific process as it is — with its limitations and potential pitfalls as well as its fun side, such as serendipitous discoveries and hilarious blunders.
And from a letter in response
My father designed stellar-inertial guidance systems for reconnaissance aircraft and, after he retired, would often present his work to physics and engineering students. When they asked him what they should study to prepare for such a career, he would reply: “Read the classics,” by which he meant Aristotle, Ralph Waldo Emerson, Jean-Jacques Rousseau and Blaise Pascal.
The best scientific and technical progress does not come out of a box. It is more likely to emerge from trying to fit wild, woolly and tangential ideas into useful societal and economic contexts.
As the historian Norman Davies once said:
“Since no one is judged competent to offer an opinion beyond their own particular mineshaft, beasts of prey have been left to prowl across the prairie unchecked.”
Or as the Economist once put it:
“…professors fixated on crawling along the frontiers of knowledge with a magnifying glass.”
This is the tragedy of our age: 90% right and 100% wrong. And that is even before we get to medicine.
When working in Africa in the 1980s with my good friend Victor Pretorius, I heard a legend about an important tribe in Central Africa, the Masai. The legend claimed that a genius member of the tribe in the nineteenth century or earlier had the idea that cow’s urine was the safest fluid for washing cooking utensils. Compared with the previous practice of using far from clean river water, it avoided the dangers of dysentery and probably saved many lives. This simple and effective public health practice was cast out by medical missionaries who had quite different ideas, more religious than medical, about what was clean and what was dirty. Neither the original genius, nor the missionaries, knew anything about the epidemiology of water-borne disease. Whether or not there is any substance to this legend, it has stayed in my mind as a metaphor appropriate for many of our problems today. Inventions such as Newcomen’s steam engine, Faraday’s electrical machines, and the idea that fresh urine is a sterile fluid, all came long before their scientific understanding.
James Lovelock, A Rough Ride to the Future. This is like so much of real discovery in clinical medicine, although the academy gets to write the history of how it is supposed to work.
This is from David Hubel, although the citation is not to hand.
Most importantly, today’s organization of science tends to deprive a young scientist of one of the most important learning experiences, that of thinking up a project of one’s own and carrying it through; deciding for oneself, independently, whether to persist or to give up and switch over to something else.
I read this book so long ago I cannot remember when. But Perutz had a way with words (as well as molecules).
What is Life? helped to make influential biologists out of several physicists: Crick, Seymour Benzer and Maurice Wilkins, among others. But there’s no indication from contemporary reviews that many biologists grasped the real significance of Schrödinger’s code-script as a kind of active program for the organism. Some in the emerging science of molecular biology were critical. Linus Pauling and Max Perutz were both damning about the book in 1987, on the centenary of Schrödinger’s birth. Pauling considered negative entropy a “negative contribution” to biology, and castigated Schrödinger for a “vague and superficial” treatment of life’s thermodynamics. Perutz grumbled that “what was true in his book was not original, and most of what was original was known not to be true even when the book was written”.
From an obituary of Paul Boyer.
“Paul Boyer was approaching the finish line of his career when he risked everything with a jaw-dropping proposal. He addressed one of the most important, as-yet-unanswered questions in biochemistry”
“We were attending a UCLA seminar in 1972 when I noticed that he wasn’t paying attention to the speaker. Afterwards, Paul approached us in a very excited state. This was surprising because he was known for his calm demeanour. He confessed that he had spent the hour thinking about old unexplained data. He asked: “What would you say if I told you that it doesn’t take energy to make ATP at the catalytic site of ATP synthase,” (as was universally held at the time) “but rather that it takes energy to get ATP off the catalytic site?” This was a eureka moment.
As is often the case with transformational ideas, early reactions were negative. When the Journal of Biological Chemistry rejected our manuscript containing data supporting this concept, Boyer told me without animosity that he could see why they would do that — “It was a very striking claim.”
Well, I have never had an idea to compare with this. But I have always found that sitting through talks that do not light my fire is conducive to thinking creatively about something else. It’s similar to the way that some writers practice their craft better in a coffee shop than in a silent office. Intellectual white noise.
Remember: the best ideas are not in the literature. If they were…
‘True science thrives best in glass houses where everyone can look in. When the windows are blacked out, as in war, the weeds take over; when secrecy muffles criticism, charlatans and cranks flourish’.
Max Perutz (1914–2002), Austrian-born biochemist. Shared the 1962 Nobel Prize for X-ray crystallography of haemoglobin.
These are a few words from the author of “Lost in Math: How Beauty Leads Physics Astray”, but they speak to me at least of an intellectual honesty that is (as the author argues) increasingly rare in the academy.
I am not tenured and I do not have a tenure-track position, so not like someone threatened me. I presently have a temporary contract which will run out next year. What I should be doing right now is applying for faculty positions. Now imagine you work at some institution which has a group in my research area. Everyone is happily producing papers in record numbers, but I go around and say this is a waste of money. Would you give me a job? You probably wouldn’t. I probably wouldn’t give me a job either.
What typically happens when I write about my job situation is that everyone offers me advice. This is very kind, but I assure you I am not writing this because I am asking for help. I will be fine, do not worry about me. Yes, I don’t know what I’ll do next year, but something will come to my mind.
What needs help isn’t me, but academia: The current organization amplifies rather than limits the pressure to work on popular and productive topics. If you want to be part of the solution, the best starting point is to read my book.
A quote from an earlier post I particularly like:
While the book focuses on physics, my aim is much more general. The current situation in the foundations of physics is a vivid example for how science fails to self-correct. The reasons for this failure, as I lay out in the book, are unaddressed social and cognitive biases. But this isn’t a problem specific to the foundations of physics. It’s a problem that befalls all disciplines, just that in my area the prevalence of not-so-scientific thinking is particularly obvious due to the lack of data.
I would make two observations. First, I think science is self-correcting — in the long run, at least. Just not when measured in lifetimes. Second, this takes me back to John Horgan’s book, and in particular how some domains of science are more easily corruptible than others (to be less combative, I might say, ‘less robust’). If you want to understand the modern medical research complex, you have to understand this.
And no, I wouldn’t have thought the effect was measurable. Wrong again.
From the results presented here it is clear that there has been a slow but steady decline in the frequency of certain variants in the Icelandic gene pool that are associated with educational attainment. It is also clear that education attained does not explain all of the effect. Hence, it seems that the effect is caused by a certain capacity to acquire education that is not always realized.
Fine thoughts, with words and a life to match.
The departure of scientific reality from what common sense suggests is going on (the sun going round the Earth, for example) no longer threatens political institutions, but it threatens the human psyche just as much as it did in Galileo’s day. Dr Hawking’s South Pole of time was 13.7 billion years in the past—three times as old as the Earth. His mathematics showed that the universe, though finite in time, might be infinite in space.
No philosophy that puts humanity anywhere near the centre of things can cope with facts like these. All that remains is to huddle together in the face of the overwhelmingness of reality. Yet the sight of one huddled man in a wheelchair constantly probing, boldly and even cheekily demonstrating the infinite reach of the human mind, gave people some hope to grasp, as he always wished it would.
The Economist’s obit of Stephen Hawking
Article in Nature. I largely agree, although my views are as much based on the hype-upon-hype that characterises so much of medical research, especially cancer. I do not have a reference, but whatever one’s views about the late David Horrobin, his Lancet article about cancer trials — written when he was dying from lymphoma — is worth a read. What a mess!
Key quotes from this article:
In 2017, my colleagues and I completed a study of all 48 cancer drugs approved by the European Medicines Agency between 2009 and 2013 (C. Davis et al. Br. Med. J. 359, j4530; 2017). Of the 68 clinical indications for these drugs (reasons to use a particular drug on a patient), only 24 (35%) demonstrated evidence of a survival benefit at the time of approval. Even fewer provided evidence of an improved quality of life for symptoms such as pain, tiredness and loss of appetite (7 trials; 10%). Most indications (36 of 68) still lacked such evidence three or more years after approval. Other groups in other regions have observed similar trends. For example, a 2015 study demonstrated that only a small proportion of cancer drugs approved by the FDA improved survival or quality of life (C. Kim and V. Prasad JAMA Intern. Med. 175, 1992–1994; 2015).
But the key point he makes is:
I believe that the low bar also undermines innovation and wastes money.
When assessments — whether in medicine or education — are flawed, the loss in value is not in short-term financial costs, but in what might have happened ten years down the road.
Günter taught us to distinguish experiments that should be done from those that could be done; he taught us to cherish the paradox over the obvious next thing. Importantly, Günter excelled at standing up firmly for one’s convictions in the face of controversy.
Born in Buckinghamshire in 1942, Sulston described his young self as a mechanically minded artisan who preferred science to sport.
From an obituary of John Sulston (by Judith Kimble), whom I met only once, when some of our red hair work was featured on the Christmas Lectures. But the phrase harks back to a true characterisation of some types of science. Tool makers; and theorists.
It is a truism that you never understand anything unless you can understand it more than one way. I like this one:
When he and his colleagues spun ClearMotion out of the Massachusetts Institute of Technology in 2008, their intention was to use bumps in the road to generate electricity. They had developed a device designed to be attached to the side of a standard shock absorber. As the suspension moved up and down, hydraulic fluid from the absorber would be forced through their device, turning a rotor that generated electricity. But, just as a generator and an electric motor are essentially the same, except that they run in opposite directions, so ClearMotion’s engineers realised that running their bump-powered generator backwards would turn it into an ideal form of suspension. And that seemed a much better line of business. They therefore designed a version in which the rotor is electrically powered and pumps hydraulic fluid rapidly into and out of the shock absorber. The effect is to level out a rough road by pushing the wheels down into dips and pulling them up over bumps.
The following is an excerpt from a review in press with Acta. You can see the full article with DOI 10.2340/00015555-2916 here
From the solar constant to thong bikinis and all stops in between.
A review of: “Sun Protection: A risk management approach.” Brian Diffey. IOP Publishing, Bristol, UK. ISBN 978-0-7503-1377-3 (ebook); 978-0-7503-1378-0 (print); 978-0-7503-1379-7 (mobi).
Leo Szilard was one of half a dozen or so physical scientists who, having attended the same Budapest gymnasium, revolutionised twentieth-century physics. In 1934, whilst working in London, he realised that if one neutron hit an atom which then released two further neutrons, a chain reaction might ensue. Fearing the consequences, he tried to keep the discovery secret by assigning the patent to the British Admiralty. In 1939, he authored the letter that Einstein signed, warning the then US President of the coming impact of nuclear weapons.
After the war, in revulsion at the uses to which his physics had been applied, he swapped physics for biology. There was a drawback, however. Szilard liked to think in a hot bath, and he liked to think a lot. Once his interests had turned to biology he remarked that he could no longer enjoy a long uninterrupted bath — he was forever having to leave his bath, to check some factual detail (before returning to think some more). Biology seemed to lack the deep simplifying foundations of the Queen of Sciences.
Already UK Biobank has transformed our understanding of health and disease, improving diagnosis and care for those with cancer and rare diseases. But if every participant has their genome sequenced, the prospects for understanding and treating disease, including obesity and mental health disorders, will be extraordinary. We do not know what we will find, but we can be confident it will transform our understanding of what it is to be healthy and what it is to be sick.
I love statistics, but I am just not very good at it, and find much of it extremely counterintuitive (which is why it is ‘fun’). The Monty Hall problem floored me, but then Paul Erdős got it wrong too (I am told), so I am in good — and numerate — company. During my intercalated degree, in addition to research methods tutorials (class size, n=2), we had one three-hour stats practical each week (class size, n=10). We each used a Texas calculator, and working out a SD demanded concentration. Never mind that during the rest of the week we were learning how to use FORTRAN and SPSS on a mainframe: ‘slowing’ down the process was useful.
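For anyone who, like me, finds the Monty Hall problem hard to believe, simulation settles it faster than argument. A minimal sketch (the function name and defaults are mine, purely for illustration): always switching wins about two times in three, staying only one time in three.

```python
import random

def monty_hall(trials=100_000, switch=True, seed=1):
    """Simulate the Monty Hall game; return the fraction of wins."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)    # door hiding the car
        pick = rng.randrange(3)   # contestant's first choice
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")  # ≈ 1/3
print(f"switch: {monty_hall(switch=True):.3f}")   # ≈ 2/3
```

The logic is visible in the code: if your first pick was wrong (probability 2/3), the host’s forced reveal means switching lands you on the car every time.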
Medicine has big problems with statistics, although it is often not so much to do with ‘mathematical’ statistics as with evidence in a broader sense. IMHO the biggest abusers are the epidemiologists and the EBM merchants with their clickbait NNT and the like. But I do think this whole field deserves much greater attention in undergraduate education, and cannot help but feel that you need much more small group teaching over a considerable period of time. Otherwise, it just degenerates into a ‘What is this test for?’ exam-fodder style of learning.
The problems we have within both medicine and medical research have been talked about for a long while. Perhaps things are improving, but it is only more recently that this topic has been acknowledged as a problem amongst practising scientists (rather than medics). This topic certainly resurfaces with increased frequency, and there have been letters on it in Nature recently. I like this one:
Too many practitioners who discuss the misuse of statistics in science propose technical remedies to a problem that is essentially social, cultural and ethical (see J. Leek et al. Nature 551, 557–559; 2017). In our view, technical fixes are doomed. As Steven Goodman writes in the article, there is nothing technically wrong with P values. But even when they are correct and appropriate, they can be misunderstood, misrepresented and misused — often in the haste to serve publication and career. P values should instead serve as a check on the quality of evidence.
I think you could argue with the final sentence of this (selected) quote, but they are right about the big picture: the problem here is not a narrow technical one, and narrow technical solutions will not fix it. Instead, we are looking at a predictable outcome of the corruption of what being a scientist means.
The Osborne effect is described in Wikipedia as follows:
The Osborne effect is a term referring to the unintended consequences of a company announcing a future product, unaware of the risks involved or when the timing is misjudged, which ends up having a negative impact on the sales of the current product. This is often the case when a product is announced too long before its actual availability. This has the immediate effect of customers canceling or deferring orders for the current product, knowing that it will soon be obsolete, and any unexpected delays often means the new product comes to be perceived as vaporware, damaging the company’s credibility and profitability.
AI and associated technologies will have major effects in some areas of medicine. Think skin cancer diagnosis, for certain; or this weekend’s story in the FT on eye disease; and radiology and pathology. This raises the question of whether these skills are so central to expertise within a clinical domain that students should think hard about these areas as a career. Of course, diagnosis of skin lesions is not all a clinical expert in this domain does. Ditto, ophthalmologists do more than look at retinas. Automated ECG readers have not put cardiologists out of work, after all. And many technical advances increase — not reduce — workloads.
But at some stage, people might want to start wondering if some areas of medicine are (not) going to be secure as long-term careers. The Osborne metaphor should be a warning about how messy all this could be. Hype has costs.
The surge in open-access predatory journals is making it harder for contributors and readers to distinguish these from legitimate publications — a confusion that is fostered by the predatory-journal industry. One solution could be to deploy a variant of a well-established quality-control test. The scientific community could submit replicate test articles several times a year to a wide array of open-access journals, suspect and non-suspect.
From Steven N Goodman who, as ever, is worth reading. Of course, in one sense, it is a question of serial monogamy, or polygamy.
Not the usual Nobel Laureate spiel, but take a look. An article in Quartz is here, and Wikipedia is useful on him.
I like the ‘biogibberish’ epithet. And cannot help but suspect he would agree with David Hubel’s line that reading most papers now is like chewing sawdust. But you can see the fire still burns: you have to be dissatisfied with the state of the universe. How polite or angry you are is a question of personal style.
As each year passes, the once celebrated barriers between man and the other animals become less secure. Once we were the only tool makers, once we were the ones who discovered drugs or used technology. This report is about how finches commandeer cigarette butts for a new purpose.
That idea has been around, though never proved, since 2012. This was when Dr Suárez-Rodríguez showed that nests which had butts woven into them were less likely to contain bloodsucking parasites than were nests that did not. What she was unable to show was whether the nest-builders were collecting discarded cigarettes deliberately for their parasite-repelling properties, or whether that parasite protection was an accidental consequence of butts being a reasonably abundant building material.
And finches, again! Where would biology be without Darwin’s finches?
David Hubel, on statistics: “We could hardly get excited about an effect so feeble as to require statistics for its demonstration.”
I came across this (below) in my end-of-year clear-out. And even if this was 2016, rather than 2017, it is as good a thought to open 2018 with as any other. It is from a review of “Life’s Greatest Secret: The Race to Crack the Genetic Code”, by Matthew Cobb. The review is by H. Allen Orr in the NYRB.
Finally, and perhaps most important, Life’s Greatest Secret highlights the power of the beautiful experiment in science. Though Cobb pays less attention to this subject than he might have, the period of scientific history that he surveys was the golden age of the beautiful experiment in biology. Biologists of the time—including Nirenberg with his UUU, Crick and Brenner with their triplet code work, and others including Matthew Meselson, Franklin Stahl, and Joshua Lederberg—were masters of the sort of experiment that, through some breathtakingly simple manipulation, allowed a decisive or nearly decisive solution to what previously seemed a hopelessly complex problem. Such experiments represent a species of intellectual art that is little appreciated outside a narrow circle of scientists…
But the larger lesson of Life’s Greatest Secret is one that may be worth remembering. When scientists require definitive answers, not merely suggestive patterns, they require experiments that are decisive and, if all goes well, beautiful.
“For example, I studied Physics, so I learned about how physicists think… and it is not how most people think. They have these tricks which turn difficult problems into far easier problems. The main lesson I took away from Physics is that you can often take an impossibly hard problem and simply represent it differently. By doing so, you turn something that would take forever to solve into something that is accessible to smart teenagers.”
But the opposite is now much more common. I think there are whole swathes of modern institutional and corporate life that are designed to make the simple complicated. At best, simple may sometimes be wrong, but complicated is usually useless — or much worse. I seem to remember Paul Janssen, when asked why we do not seem to be able to discover revolutionary new drugs like we once did, responding: ‘in those days the idea of obviousness still existed’.
This is a term I first learned from Clark Glymour and colleagues in Android Epistemology. Dermofit was a failed attempt to invent such a prosthesis.
Thinkers and thinking societies build tools that enhance their own thinking. When the speed of the positive feedback increases rapidly, we see a scientific and cultural revolution. When grit is put into the cogs or the base metals diluted, the opposite happens.
Last week I was giving a talk about tech, medicine and medical education, and for the life of me could not remember the following example, showing how key representation is to our intellectual toolbox. Worse, I knew it had an Edinburgh connection. Wikipedia has more.
This is from an article in Nature. And the problem is resolving differences in experimental results between labs.
But subtle disparities were endless. In one particularly painful teleconference, we spent an hour debating the proper procedure for picking up worms and placing them on new agar plates. Some batches of worms lived a full day longer with gentler technicians. Because a worm’s lifespan is only about 20 days, this is a big deal. Hundreds of e-mails and many teleconferences later, we converged on a technique but still had a stupendous three-day difference in lifespan between labs. The problem, it turned out, was notation — one lab determined age on the basis of when an egg hatched, others on when it was laid.
Now my title is from Blake:
He who would do good to another must do it in Minute Particulars: general Good is the plea of the scoundrel, hypocrite, and flatterer, for Art and Science cannot exist but in minutely organized Particulars.
And yet, I think I am using the quote in a way he would have strongly disagreed with. Some of the time ‘Minute particulars’ are not the place to be if you want to change the world. Especially in biology.