In 1978, the distinguished professor of psychology Hans Eysenck delivered a scathing critique of what was then a new method, that of meta-analysis, which he described as “an exercise in mega-silliness.”
So write Matthew Page and David Moher here, in a commentary on a paper by the ever ‘troublesome’ John Ioannidis, titled “The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses”.
To which some of us would say: this was all predictable once the EBM bandwagon jumped on the idea that collating some information, and ignoring other information, was ‘novel’. Science advances by creating and testing coherent theories of how the world works. Adding together ‘treatment effects’ is messier, and more prone to error. Just because you can enter data into a spreadsheet doesn’t mean you should.
Another great video of Alan Kay, explaining how intellectual revolutions occur (‘appoint people who are not amenable to management’).
Interesting story in Nature highlighting instances where, instead of doing postdocs, young biologists have raised funding to set up their own companies. Of course, most start-ups fail, but then most really interesting research projects should fail. Y Combinator is getting into this area, which surprised me. As the age of getting your first grant rises, and given the increasingly dysfunctional nature of much academic (medical) science, the attractions are obvious. I was sceptical (and still am) that the ‘software’ model would work in this area.
There was a story in the FT a few weeks back (paywall). It concerned the painting ‘Portrait of a Man’, by the Dutch artist Frans Hals. Apparently, the Louvre had wanted to buy the painting some time back, but were unable to raise the funds. However, a few weeks ago, the painting was declared a “modern forgery” by Sotheby’s — trace elements of synthetic 20th-century materials had been discovered in it. The story has a wider resonance, however. The FT writes:
But if anything the fake Hals merely highlights an existing problem in how we determine attribution. In their quest to confirm attributions, dealers and auction houses seek the imprimatur of independent, usually academic, experts. Often that person’s “expertise” is deduced by whether they have published anything on a particular artist. But the skills required to publish a book are different to those needed to recognise whether a painting is genuine. Many academics are also fine connoisseurs. One of the few to doubt the attribution to Parmigianino of the St Jerome allegedly connected to Ruffini was the English scholar, David Ekserdjian. But too often the market values being a published writer over having a good “eye”.
Here is a non-trivial problem: how can we designate expertise, and to what extent can we formalise it? In some domains — research, for example — it is easier than in others. But as anybody who reads Nature or the broadsheets knows, research publication is increasingly dysfunctional: partly because of the scale of modern science; partly because ‘personal knowledge’ and community have been exiled; and partly because it has become subjugated to academic accountancy, because the people running universities cannot admit that they do not possess the necessary judgment to predict the future. To use George Steiner’s tidy phrase, there is also the ‘stench of money’.
But the real danger is when the ‘research model’ is used in areas where it not only does not work, but does active harm. I wrote some time back in a paper in PLoS Medicine:
Herbert Simon, the polymath and Nobel laureate in economics, observed many years ago that medical schools resembled schools of molecular biology rather than of medicine. He drew parallels with what had happened to business schools. The art and science of design, be it of companies or health care, or even the type of design that we call engineering, lost out to the kudos of pure science. Producing an economics paper densely laden with mathematical symbols, with its patently mistaken assumptions about rational man, was a more secure way to gain tenure than studying the mess of how real people make decisions.
Many of the important problems that face us cannot be solved using the paradigm that has come to dominate institutional science (or, I fear, the structures of many universities). For many areas (think: teaching, or clinical expertise), we need to think in ‘design’ mode. We are concerned more with engineering and practice than is normal in the world of science. I do not know to what extent this expertise can be formalised — it certainly isn’t going to be as easy as whether you published in ‘glossy’ or ‘non-glossy’ cover journals — but reputations existed long before the digital age, and the digital age offers new opportunities. Publishing science is one skill, diagnosing is another, but there is a lot of dark matter linking the two activities. What seems certain to me is that we have got it wrong, and we are accelerating in the wrong direction.
I try to avoid writing on this topic, finding it too depressing — although not as depressing as I once did, as I am closer to the end than the beginning. And there are signs of hope, just not where they once were.
There is an editorial in Nature titled ‘Early-career researchers need fewer burdens and more support’. It makes depressing reading. The contrast is with a talk on YouTube I listened to a few days back by the legendary computer engineer (and Turing Award winner, and much else) Alan Kay, in which he points out that things were really much better in the 1960s, and that people at the time knew they were much better. Even within my short career, things were better in 1990 than in 2000, in 2000 than in 2010, and so on. When people ask me whether it is sensible to pursue a career in science, I am nervous about offering advice. Science is great. Academia, in many places, is great. But you can only do most science or academia in a particular environment, and there are few places that I would want to work in if I were starting out. And I might not get into any of them, anyway (Michael Eisen’s comment: ‘never a better time to do science, never a worse time to be a scientist’). I will share a few anecdotes.
Maybe 10-15 years ago I was talking to somebody whom — with no exaggeration — I would describe as one of the UK’s leading biologists. This person described how one of their offspring was at university and had, for the first few years, not taken his/her studies too seriously. Then things changed, and they wondered about doing a PhD and following a ‘classical’ scientific career. The senior biologist expressed concern, worried that there was now no sensible career in science, and that, much as he/she had enjoyed their own career, he/she could no longer recommend it. There was some guilt, but your children are your children.
The second was a brief conversation with the late physicist John Ziman. I had read some of Ziman’s work — his ‘Real Science’ is, for me, essential reading for anybody who wants to understand what has happened to the Mertonian norms, and why science is increasingly dysfunctional — but he shared a bit of his life history with me. When he was appointed as a lecturer in physics at Cambridge, the topic of his lectures was ‘new’ and there were no established books. So he set out to remedy the situation and spent the first two years writing such a book (still available, I think), and after that turned his attention back to physics research, and later much more (‘you have to retire to have the time to do serious work’). He commented that this would simply be impossible now.
With respect to medicine, there have been attempts for most of my life to develop schemes to encourage and support young trainees. I benefited from them, but I question whether they target the real problem. There are a number of issues.
First, the model of training of clinical academics in medicine is unusual. Universities tend to want external funders to support the research training of clinical academics (fellowships), but that is a model with severe limitations. Nurturing talent is a core business of the universities, and they need to devote resources to it. It is their responsibility. Of course, they need to train and support academics, not just researchers. This is what career progression within academia is about: lecturer, reader, professor and so on. What medical schools want to do is to offload the risk onto the person, and only buy once the goods have been tasted. In a competitive world, where other career options are open, this might not work well. Worst of all, it funnels a large number of institutions — institutions that should show diversity of approaches — towards the lowest common denominator of what is likely to be funded by the few central funders. Without independence of mind and action, you cut your chances of changing the world. (Yes, I hear you say, there is not enough money, but most universities need to cut back on ‘volume’.)
The second issue is whether the focus should be on schemes encouraging young people into science. I know I may sound rather curmudgeonly, but I worry that much of the activity around promoting certain careers is reminiscent of ‘Wonga-like’ business models. I think we should do better. If youngsters look at what life is like at 40, 50 and 60 or beyond, and like it, they will move in that direction. You would not need to encourage them — we are dealing with bright people. A real problem for science funding is that, for many individuals, it resembles a subsistence society, with little confidence about long-term secure funding, and little resilience against changes in political will. Just look at Brexit. I remember once hearing somebody who had considered a science career telling me that it seemed to him that most academics spent their lives writing grants, and feeling uncomfortable about replacing what they wanted to do with what might be funded. Conversations about funding occupied more time than serious thinking. I listened nervously.
Finally, I take no pleasure in making the point, but I do not see any reason to imagine that things will get better over a ten- or twenty-year period. One of my favourite quotes from the economist John Kenneth Galbraith is to the effect that the denigration of value judgement is one of the ways the scientific establishment maintains its irrelevance. I think there is a lot in that phrase. If we were to ask the question, what is more critical: understanding genetics, or understanding how institutions work, I know where my focus would be. I suspect there is more fun there too; it is just that much of the intellectual work might not be within academia’s walls.
Note: After writing this I worried that people would think that I was opposing schemes to encourage young people, or that I failed to understand that we have to treat those with new ideas differently. That was not my intention. Elsewhere I have quoted Christos Papadimitriou, and he gets my world view, too.
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
The goal of the new CFF [Cystic Fibrosis Foundation, a US patient charity] Therapeutics Lab, says Preston W. Campbell III, the foundation’s CEO and president, is to generate and share tools, assays, and lead compounds, boosting its partners’ chances of finding treatments. Frustration with academic technology transfer agreements was a key motivation, he notes. University-based researchers funded by the foundation have to seek approval from their institution’s legal department before sharing assays, cells, or any intellectual property, a hurdle that can take a year to negotiate. “This was killing us,” Campbell says, “but if we created our own laboratory, we could not only focus on the things we wanted to focus on, we could also share them freely.” Science
Well, you really could not make this up. From the EFF:
On August 30, 2016, the Patent Office issued U.S. Patent No. 9,430,468, titled “Online peer review and method”. The owner of this patent is none other than Elsevier, the giant academic publisher. When it first applied for the patent, Elsevier sought very broad claims that could have covered a wide range of online peer review. Fortunately, by the time the patent actually issued, its claims had been narrowed significantly. So, as a practical matter, the patent will be difficult to enforce. But we still think the patent is stupid, invalid, and an indictment of the system…
Before discussing the patent, it is worth considering why Elsevier might want a government granted monopoly on methods of peer review. Elsevier owns more than 2000 academic journals. It charges huge fees and sometimes imposes bundling requirements whereby universities that want certain high profile journals must buy a package including other publications. Universities, libraries, and researchers are increasingly questioning whether this model makes sense.
Avoid Elsevier. This is a world that should no longer exist.
The (medical) future is here, just unevenly distributed
The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist
“This is a terrific essay. The keystone of science’s power and the continued survival of a civilisation based on — and at the mercy of — science is contained in the following:
‘As Jacob Bronowski (1956) said – in science truth is all-of-a-piece: either we are truthful always and about everything; or else the dishonesty ramifies, the rot spreads, and rapidly we are being honest about nothing.’
External audit, as we have seen over the last quarter century in many human domains, does not work. All too often it is merely a tool for rendering deceit invisible. Integrity is not a bolt-on for our survival, but a bit of our biological machinery that is struggling against the loss of the ‘personal’.
If we look back to the writings of Merton, Lewis Thomas, Peter Medawar, John Ziman, and the like, it is clear we lack a coherent and deep view of what has happened to modern science and — because science is integral to the modern world — our civilisation. This essay sets the tone for what must follow.”
“Johnson deftly states, ‘the curse of this age of microspecialization and the proliferation of “omics” is to separate the ridiculome from the relevantome’.”
From a review of George Johnson’s ‘The Cancer Chronicles’ in Science.
“On this note, he quotes mathematician and family friend Jacques Hadamard, apparently complaining about a student who asked for a thesis topic, “Can you imagine that? If he has no topic of his own, he should not even think of a Ph.D.!””
Which brings to mind David Hubel’s practice of trying to persuade students not to do a PhD — he only wanted the ones who ‘really’ wanted it, rather than those who were just judged able.
From a book review of ‘The Fractalist’ in Science.
“Scandals, however, raised questions about whether to trust U.S. researchers. In 1964, news broke that 22 patients at the Jewish Chronic Disease Hospital in Brooklyn had been injected with cancer cells without their knowledge”
NEJM. Worth a read.
No, not ‘up North’, but a neat way to check whether people have been sloppy or dishonest. The following is from the Economist:
The GRIM test, short for granularity-related inconsistency of means, is a simple way of checking whether the results of small studies of the sort beloved of psychologists (those with fewer than 100 participants) could be correct, even in principle.
Full PeerJ reprint here.
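The check itself is almost trivially codeable. Here is a minimal sketch of the idea (my own illustration, not the authors’ code): if n people each give an integer-valued response, the true mean must be k/n for some whole number k, so a mean reported to two decimal places that cannot be reached that way is impossible, even in principle.

```python
# A minimal sketch of the GRIM (granularity-related inconsistency of means) check.
# Assumes integer-valued raw data (e.g. Likert scores or counts).

def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """True if a mean reported to `decimals` places could arise from
    n integer-valued responses."""
    implied_total = reported_mean * n
    # Check the integer totals nearest the implied sum (the +/-1 guards
    # against floating-point error) and see whether any of them rounds
    # back to the reported mean.
    for k in range(int(implied_total) - 1, int(implied_total) + 2):
        if k >= 0 and round(k / n, decimals) == round(reported_mean, decimals):
            return True
    return False

print(grim_consistent(2.59, 28))  # False: no whole-number total over 28 rounds to 2.59
print(grim_consistent(2.57, 28))  # True: 72/28 = 2.5714... rounds to 2.57
```

That is the whole trick: no access to raw data needed, just the reported mean and sample size.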
[Professor Glyn Davis] highlighted the requirement across the sector to cross-subsidise loss-making research with student fees. ‘So many of us have been arguing the real issue isn’t student fees, it’s proper funding of research.’
This is the view from Australia, but what they do, we do a little later.
“Overtime pay for postdoctoral scientists is welcome — but could mean fewer positions.” Nature
This is from the US, and it only refers to those below a certain salary. When I worked in France, in Pierre Chambon’s rather large laboratory, wives or girlfriends were referred to as ‘Chambon widows’.
A couple of sentences from this article on the value of interdisciplinary research got me thinking — or at least pulling some memes off my dusty intellectual shelf of clutter. The article is about Ian Goldin, and some ideas I am sure he talks about in his new book, which I haven’t read.
He added that “one of the reasons” for the 2008 financial crisis was that “people lost their ethics, their judgement, and their wisdom” because of disciplinary silos.
I agree. I remember the Economist putting it more harshly: ‘professors fixated on crawling along the frontiers of knowledge with a magnifying glass’ [Economist, December 10th 2011]. Economics, a bit like psychiatry in medicine, is the canary in the mine. Nor would teaching mandatory ethics courses (‘I am certified in ethics A+!’) do very much. Enron’s management were stars at HBS. This is one of the tragedies of many modern universities: so busy edging their way up largely meaningless ranking scales that they are unable to tackle the problems society faces.
Goldin was quoted as saying, ‘[there is] a “real pressure” on universities to be “thinking ahead” and teaching information that will remain relevant when current students “reach their mid-careers”’.
There are two aspects to this. One is that the whole idea of education is a way of hedging against a changing environment. If the world were constant, we could dispense with much (but not all) education — training would suffice. This is just another way of saying that advance comes when sons do not do what their fathers did (‘20th-century physics was made by the sons of cobblers’ — substitute your gender, please). But from a teaching perspective there is another facet to think about. We cannot adequately judge how well we educate our students over the short term (alone). Yes, they can pass finals. Yes, they can take a history, and so on. But the test of education is how well they behave and think 20 years down the line. This is a large search space that we can only navigate using theories about what makes the world change, and what makes people push at the boundaries: do not cite Cronbach’s alpha at me. But in examinations and certification, like so much else in science and society, we are blinded by the apparent certitude of short-term goals. And the allure of summary measures, rather than the messiness of the real world.
This comment (and phrase) from Bruce Schneier struck a chord with me.
NYU professor Helen Nissenbaum gave an excellent lecture at Brown University last month, where she rebutted those who think that we should not regulate data collection, only data use: something she calls “big data exceptionalism.” Basically, this is the idea that collecting the “haystack” isn’t the problem; it is what is done with it that is… Under this framework, the problem with wholesale data collection is not that it is used to curtail your freedom; the problem is that the collector has the power to curtail your freedom. Whether they use it or not, the fact that they have that power over us is itself a harm.
Of course, as Alan Kay said, we need ‘big ideas’ rather than assuming ‘big data’ will do our thinking for us. This is not to deny that large data sets are useful, nor that they allow you to answer questions you could not have answered before. But A/B testing only gets you so far. And beware technicians who want to mould nature to their method, rather than the other way round; or who change what meaningful consent means.
Daniel Sarewitz in Nature
The quality problem has been widely recognized in cancer science, in which many cell lines used for research turn out to be contaminated. For example, a breast-cancer cell line used in more than 1,000 published studies actually turned out to have been a melanoma cell line. The average biomedical research paper gets cited between 10 and 20 times in 5 years, and as many as one-third of all cell lines used in research are thought to be contaminated, so the arithmetic is easy enough to do: by one estimate, 10,000 published papers a year cite work based on contaminated cancer cell lines. Metastasis has spread to the cancer literature… That problem is likely to be worse in policy-relevant fields such as nutrition, education, epidemiology and economics, in which the science is often uncertain and the societal stakes can be high.
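The arithmetic in that quote is worth making explicit. A back-of-envelope sketch follows: the contamination rate and citation rate come from the quote, while the number of new cell-line papers per year is my own assumed input, chosen purely for illustration.

```python
# Back-of-envelope reconstruction of the arithmetic in the quote above.
# Only the contamination rate and citation rate are quoted figures;
# the papers-per-year number is an assumption for illustration.

cell_line_papers_per_year = 10_000  # assumed: new papers reporting cell-line work
contaminated_fraction = 1 / 3       # quoted: share of cell lines thought contaminated
citations_per_paper_5yr = 15        # quoted range 10-20 over 5 years; midpoint used

contaminated_papers = cell_line_papers_per_year * contaminated_fraction
# In steady state each such paper attracts ~15/5 = 3 citations a year, so the
# annual flow of papers citing contaminated work is roughly:
citing_papers_per_year = contaminated_papers * citations_per_paper_5yr / 5
print(f"~{citing_papers_per_year:,.0f} papers/year cite contaminated work")
# -> ~10,000, the order of magnitude given in the quote
```

Whatever the exact inputs, the point stands: a modest contamination rate compounds quickly once citation multiplies it.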
See this great piece by Bruce Charlton
Professional science has arrived at this state in which the typical researcher feels free to indulge in unrestrained careerism, while blandly assuming that the ‘systems’ of science will somehow transmute the dross of his own contribution into the gold of truth. It does not: hence the preponderance of irreproducible publications.
More structural issues in higher-ed that appear hard to change (see previous post). From Science:
Unfortunately, it is not clear why Ph.D. students pursue postdoc positions and how their plans depend on individual-level factors, such as career goals or labor market perceptions.
And in Nature
The number of US faculty members who have tenure or are on the tenure track is falling, according to a report by the American Association of University Professors in Washington DC. Over the past 40 years, the proportion of the academic labour force that is in a full-time tenured position has shrunk by one-quarter, and the proportion in tenure-track posts has halved, reports ‘Higher Education at a Crossroads’.
Across large swathes of higher education there is an enormous amount of cross-subsidy, much of it based on misinformation about ‘what you are buying’. Tech and data will start to unpick much of this. The future for many institutions is uncertain.
Two articles about Sci-hub (here and here). No, I am not encouraging illegal downloading. But I hope we can look back in a few years with shame at the way journals, their publishers and those who have a vested interest in the mismeasure of science have hindered educational advance, and wasted public money. Some specialty journals do indeed pour money back into their subject, but it is a minority. All too often medical journals are a way of making money for publishers and specialist societies. There will be an iTunes moment (I hope).
This quote is from the ever-quotable John Ioannidis, in a commentary on the disagreements about how to interpret the evidence linking salt intake and health.
“Sometimes I wonder whether published observational epidemiology is simply reflecting a power-weighted vote count of the opinions of epidemiologists. What does a risk ratio of 1.3 mean? Perhaps it means that those who believe in the risk factor have 1.3-fold more powerful opinions than those who don’t believe in the risk factor. In this (hypothetical) nightmare situation, risk ratios are accurate measures of epidemiologists’ net bias.
Systematic reviews cannot settle this conundrum after the fact. Even systematic reviews of randomized trials can reach almost any conclusion the reviewers believe in.”
Gary Taubes, talking about our understanding of obesity:
“Here’s another possibility: The 600,000 articles — along with several tens of thousands of diet books — are the noise generated by a dysfunctional research establishment.”
I think it was James Le Fanu who suggested that closing most UK departments of epidemiology and public health might result in a net gain to human health. So much research work is zombie science: you can’t kill it, because it is already dead (I owe this formulation to Bruce Charlton). But the problem is not just with observational research. Ironically, the fact that Doll and Hill were right may, in the long term, have been a harmful influence on discovery.
Some Neanderthals would — based on MC1R sequence — be expected to have red hair. What has always caused me confusion is the way that dates for everything to do with human paleohistory, and the various representations of our evolution, are revised on the basis of n-of-1 publications. No doubt the story will get easier, but I think silence for a while on the ‘greatest story ever told’ would be in order. At least from me.
Note added: And then…
Venki Ramakrishnan was on the radio the other day. I cannot remember his exact words but they were something to the effect that he wanted ‘not to generate lots of data, but instead, lots of understanding’. Says it all.
People always manage to mess up the story of invention. This is about the birth of email and the death of Ray Tomlinson.
“It wasn’t an assignment at all, he was just fooling around; he was looking for something to do with ARPANET,” Raytheon spokeswoman Joyce Kuzman said in a statement about Tomlinson’s death.
When Tomlinson showed his early work on email to his coworker at Bolt Beranek and Newman (BBN), Jerry Burchfiel, he was initially warned that he shouldn’t show anyone what he was doing. “Don’t tell anyone!” Burchfiel reportedly said. “This isn’t what we’re supposed to be working on.”
Tomlinson’s death gives us a chance to look at how various innovations come to pass. They are rarely, if ever, the work of one person. And in the case of email, Tomlinson contributed greatly, along with people like Bob Clements of BBN, Dick Watson of SRI International, and Stephen Lukasik of ARPA (now known as Darpa). And they all managed to anger the Department of Defense for quite literally being too ahead of their time.
We owe the word ‘revolution’ — in the meaning of changing the world — to Galileo and the motion of the planets. You can almost define invention as that which disturbs: it is why Freeman Dyson titled one of his books about science ‘Disturbing the Universe’. But each generation wants to forget this, ours perhaps more than others. One of the great things about computing over the last half century is that sometimes the barriers to entry have been so low; biology and medicine are much harder. The other lesson: do not be too far ahead of your time.
This made me laugh. I have got used to MC1R mutations and red hair in Neanderthals, but this article (full research paper in Science here) brought a smile to my face, even if I am still a little hazy on the genetics.
JBS Haldane once commented that God ‘would appear to be inordinately fond of beetles’, based on the observation that the world was so full of different species of beetles.
I have long had similar thoughts about seborrhoeic keratoses. God must be inordinately fond of them. Seborrhoeic keratoses are benign skin tumours, some of which contain identified mutations; they generally attract little serious research interest (apart from yours truly, of course). However, their clinical significance is enormous. This is because they are incredibly common as people move into their fourth decade and beyond, and because they vary so much in their morphological appearance. They mimic everything, including melanoma. So most lesions referred to clinics as possible melanomas will turn out to be harmless seborrhoeic keratoses. Of course, a more cynical view is that since seborrhoeic keratoses are such great mimics, they in effect create lots of work for dermatologists. I suppose I should say thank you next time I bump into one of my distant cousins, but the basis of the link — if confirmed — also deserves some serious mechanistic thought.
Nice BBC radio programme. Impressed with the presenters, rather than yours truly.
Martin Wolf, the influential economist, wrote in the FT some time back that the public might soon view pharma companies in the same way many viewed banks (or at least bankers). Here is a news item from today’s FT.
Dr Mikael Dolsten, president of worldwide research and development at Pfizer, said he was aware of the unease but that the combined company would occupy a “sweet spot” in R&D.
Brent Saunders, Allergan chief executive, has questioned the efficiency of discovery research conducted by big pharma groups. In an interview with the FT last year, he argued that smaller companies and academic centres were better suited to this sort of science.
Since then, he has modified his position somewhat, arguing that drug discovery has a role at big pharma companies, provided it has a high chance of success and is targeted at illnesses where the company already has a strong selection of drugs.
This is all about the merger between Allergan and Pfizer. The arguments may be more nuanced than I want to believe, but this is what happens when ‘financialization’ becomes more important than invention (and long-term value). Let’s just call it the ‘no James Black’ syndrome.
And can we please skip the dreadful ‘sweet spot’ terminology.
Remember those compare-and-contrast questions (UC versus Crohn’s; DLE versus LP, etc.)? Well, look at these two quotes from articles in the same edition of Nature.
The first is from the tsunami of papers showing that ‘Something is rotten in the state of Denmark Science’ — essentially that the Mertonian norms for science have been well and truly trampled over.
Journals charge authors to correct others’ mistakes. For one article that we believed contained an invalidating error, our options were to post a comment in an online commenting system or pay a ‘discounted’ submission fee of US$1,716. With another journal from the same publisher, the fee was £1,470 (US$2,100) to publish a letter. Letters from the journal advised that “we are unable to take editorial considerations into account when assessing waiver requests, only the author’s documented ability to pay”.
Discrete Analysis’s [the journal] costs are only $10 per submitted paper, says Gowers; money required to make use of Scholastica, software that was developed at the University of Chicago in Illinois for managing peer review and for setting up journal websites. (The journal also relies on the continued existence of arXiv, whose running costs amount to less than $10 per paper.) A grant from the University of Cambridge will cover the cost of the first 500 or so submissions, after which Gowers hopes to find additional funding or ask researchers for a submission fee.
Well done the Universities of Cambridge and Cornell (arXiv). For science, the way forward is clear. But for much clinical medicine, including much of my own field, we need to break down the barriers between publication and posting online information that others may find useful. This cannot happen until the financial costs approximate to zero.
Nor did Mann have to worry that a good idea would struggle for funding. Whereas many government research heads fret about budgets that don’t at least keep pace with inflation, past DARPA directors are surprisingly blasé about the agency’s finances. “I never really felt constrained by money,” Tether says. “I was more constrained by ideas.” In fact, aerospace engineer Verne (Larry) Lynn, DARPA’s director from 1995 to 1998, says he successfully lobbied Congress to shrink his budget after the Clinton administration had boosted it to “dangerous levels” to finance a short-lived technology reinvestment program. “When an organization becomes bigger, it becomes more bureaucratic,” Lynn told an interviewer in 2006.
A little about what makes DARPA tick. This reminds me of some of the comments from Xerox PARC (from John Seely Brown? — I can’t remember) about how important it was to keep the research budget (not the D of R+D) below 1%. Any higher, and the bean counters would get interested, and start trying to manage the budget. [See my previous post from Alan Kay.] Not all increases in funding are a good idea (unless the success metric is money, rather than discovery).
After yesterday’s post I couldn’t resist one more slide from Alan Kay. This is about how we once knew how to fund and support real discovery and invention.