Interesting story in Nature highlighting instances where instead of doing post-docs, young biologists have raised funding to set up their own companies. Of course, most start-ups fail, but then most really interesting research projects should fail. Y Combinator is getting into this area, which surprised me. As the age of getting your first grant gets higher, and with the increasingly dysfunctional nature of much academic (medical) science, the attractions are obvious. I was sceptical (and still am) that the ‘software’ model would work in this area.
Cartoon characters not infrequently run off the edge of a cliff. Pause. They then realise there is nothing there to support their running. Time lags are awkward to deal with in any analysis, but since most things do not happen overnight, they are ubiquitous. In analysis, we replace them with a fudge factor. Or we ignore them.
I haven’t seen much comment on what I think is the most interesting aspect of the science news over the last few weeks. Here is a line from THE.
Five out of nine laureates in the core prizes for physics, chemistry, medicine and economic sciences were born in the UK. All crossed the pond as rather valuable immigrants to the US.
UK science, and many UK universities, have been in profit-harvesting mode for a long time now. Over the edge of that cliff. Things are going to fall apart. OK, what the hell:
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Knowledge is not a looseleaf notebook of facts. Above all, it is a responsibility for the integrity of what we are, primarily of what we are as ethical creatures. You cannot possibly maintain that informed integrity if you let other people run the world for you while you yourself continue to live out of a rag bag of morals that come from past beliefs. That is really crucial today. You can see it is pointless to advise people to learn differential equations, or to do a course in electronics or in computer programming. And yet, 50 years from now, if an understanding of man’s origins, his evolution, his history, his progress is not the commonplace of the schoolbooks, we shall not exist. The commonplace of the schoolbooks of tomorrow is the adventure of today, and that is what we are engaged in……
It sounds very pessimistic to talk about western civilisation with a sense of retreat. I have been so optimistic about the ascent of man; am I going to give up at this moment? Of course not. The ascent of man will go on. But do not assume that it will go on carried by Western civilisation as we know it. We are being weighed in the balance at this moment.
The Ascent of Man, Jacob Bronowski, 1973.
Abraham Verghese, an infectious disease physician, gave a talk here in Edinburgh last week. It was a very mixed audience, but I suspect the many students who were there enjoyed it. I have not read any of his books — nor looked at his TED talk — but his Wikipedia entry gives you a flavour of how interesting he is, and how varied a career can be — when you have courage.
One issue that came up tangentially was the history of diagnosis, and some opinions were ventured by the audience as to when diagnosis was historically established. I may have missed key points, but I found it hard to accept that the idea of diagnosis was something you could date except in very broad terms, even less that you could associate it with the 1870s, or with the idea of stethoscopes being a key marker of when modern ideas of diagnosis were established. For instance — and since the lecturer was an ID physician — my first thoughts turned to scabies. The scabies mite was identified in the 1690s, and it was recognised as the cause of the disease (I am not quoting primary sources so let me know if……) So here we have a clear linking of symptoms, signs, causality, a causal agent, and a broader theory about pathogenesis and epidemiology. So this got me thinking about how I view the topic of diagnosis.
Diagnosis is the mapping of one state with another, with the two states being linked by a network of attributes. Diagnosis is a suitcase term: it may contain lots of different tools, tools suited to various purposes, and tools for which we may find different purposes over time. Diagnosis represents an attempt to classify the world into particular states, often with the goal of making some predictions about some other state. Most of the time, we think in terms of prediction, about what might happen to that person with or without some intervention. If you see these physical signs (burrows) and the patient describes particular symptoms (itch), then the ‘state’ is scabies. If the diagnosis is correct, you can say something about what causes the state, what might happen, and what effect a particular intervention (permethrin / malathion etc) might have. If you are lucky, you can feel happy with causal arrows linking much of what you say and think. Prediction is important, but it is of course not the only quality we want in a theory. We tend to prefer some theories to others, even when they make similar predictions. Think of Copernicus: we may prefer one theory over another, irrespective of whether both allow the same quantitative clinical predictions.
Our suitcase of diagnostic concepts has changed over time, however. For instance, even in modern medicine, causality is often lacking. We may use proxy or associated factors to define particular states. We may use simple heuristics as our guide to action, even though we have little idea of where the causal arrows are going. Think much of psychiatry. This does not mean we are powerless, just that we are more ignorant than we would like. We are of course wedded to particular metaphysical systems.
Diagnosis might have been used in the absence of knowledge about particular interventions to attribute blame, as an explanation. If a patient behaved in this way or suffered some state, it was a divine punishment for some behaviour. Now, I may not agree with this world view, but this too is diagnosis. The theory may seem wrong, it may seem primitive, but then my ideas of physics are primitive too if they are applied to the world of the very small.
Galen thought in terms of the mean, and the treatment by opposites (hot treatments for cold; moist treatment for drying diseases etc). This all sounds slightly crazy to modern ears (although dermatologists among you will point out the latter has definite therapeutic merit within very particular skin states). Or how about the idea of therapeutic ‘signatures’. This is from Ian Hacking:
Syphilis is signed by the market place where it is caught; the planet Mercury has signed the market place; the metal mercury, which bears the same name, is therefore the cure for syphilis.
As Hacking points out, this allowed Paracelsus to kill lots of people simply because he knew that mercury worked. But whatever the metaphysical system linking two states, the idea of diagnosis was firmly established. Just as Newton got most things right in his physics, and most of us ignore what came after — except when we use the GPS.
Diagnosis was not limited to medicine. Our ancestors spent their lives making diagnoses about what to eat and what not to eat. Making diagnoses about what particular weather states would do to crops etc. Plumbers make diagnoses, as do any humans trying to make sense of an environment that is not static, and where we value intervention.
What may have been specific to medicine was our hang-ups about whether there was something special about humans, and whether the simple rules, experimentation and demonstrations of efficacy that allowed other types of human technological progress, or indeed much of everyday life, applied in the domain of disease. Successful interventions or demonstrations will have had an effect on metaphysical beliefs in the long term. And of course much of this story is tied up with the growth of that particular branch of formal knowledge we call science. 1870 is just a little late.
 Hacking I. The emergence of probability : A philosophical study of early ideas about probability, induction and statistical inference. Cambridge: Cambridge University Press; 1984.
The goal of the new CFF [Cystic Fibrosis Foundation, a US patient charity] Therapeutics Lab, says Preston W. Campbell III, the foundation’s CEO and president, is to generate and share tools, assays, and lead compounds, boosting its partners’ chances of finding treatments. Frustration with academic technology transfer agreements was a key motivation, he notes. University-based researchers funded by the foundation have to seek approval from their institution’s legal department before sharing assays, cells, or any intellectual property, a hurdle that can take a year to negotiate. “This was killing us,” Campbell says, “but if we created our own laboratory, we could not only focus on the things we wanted to focus on, we could also share them freely.” Science
Well you really could not make this up. From the EFF:
On August 30, 2016, the Patent Office issued U.S. Patent No. 9,430,468, titled “Online peer review and method.” The owner of this patent is none other than Elsevier, the giant academic publisher. When it first applied for the patent, Elsevier sought very broad claims that could have covered a wide range of online peer review. Fortunately, by the time the patent actually issued, its claims had been narrowed significantly. So, as a practical matter, the patent will be difficult to enforce. But we still think the patent is stupid, invalid, and an indictment of the system….
Before discussing the patent, it is worth considering why Elsevier might want a government granted monopoly on methods of peer review. Elsevier owns more than 2000 academic journals. It charges huge fees and sometimes imposes bundling requirements whereby universities that want certain high profile journals must buy a package including other publications. Universities, libraries, and researchers are increasingly questioning whether this model makes sense.
Avoid Elsevier. This is a world that should no longer exist.
Biology is short of theory compared with physics, and medicine more so. More dull trials, and less and less insight. Busyness and project management, directed by chief executives, wielding Excel spreadsheets. Alfred G Knudson has just died and Nature’s obituary tells the story of somebody who could play at natural history and then form a majestic and testable hypothesis. The penultimate sentence reads: “[his] lack of patience for science that merely repeated the work of others kept everyone in his sphere striving for the new”
Two articles both from different areas. The first is from an interview with Paul Greengrass (he of ‘Bloody Sunday’, and the Bourne films).
“Youngsters starting out probably aren’t going to be supported and developed like I was in my early career, they’re much more likely to be chewed up,” he said. “This places a greater weight on universities like Kingston, which is a breeding ground for talent, to educate kids about the importance of point of view – it’s the easiest thing to lose but the most important thing to hold on to.”
The second in Science, about a likely Nobel prize winner, Rainer Weiss.
Then, in his junior year, Weiss flunked out of school entirely. He fell for a woman he met on a ferry from Nantucket to Boston. “She taught me about folk dancing and playing the piano,” he says. Weiss followed her when she moved to Evanston, Illinois, abandoning his classes in midterm. But the affair fizzled. “I fell in love and went crazy,” he says, “and of course she couldn’t stand to be around a crazy man.” Weiss returned to MIT hoping to take his finals only to find he’d flunked out.
Weiss says he was unfazed. “People say, ‘I failed out of college! My life is over!’ Well, it’s not over. It depends on what you do with it.” He took a job as a technician in MIT’s legendary Building 20, a temporary structure erected during the war, working for Jerrold Zacharias, who studied beams of atoms and molecules with light and microwaves and developed the first commercial atomic clock. Under Zacharias’s tutelage, Weiss finished his bachelor’s degree in 1955 and earned his Ph.D. in 1962.
A later quote from the same article:
After a postdoc at Princeton University developing experimental tests of gravity under physicist Robert Dicke, Weiss returned to MIT in 1964. As a junior faculty member, he says, he published little and didn’t worry about advancing his career. MIT’s Shoemaker says Weiss probably got tenure only for his teaching—and wouldn’t get it today. Bernard Burke, an emeritus physicist at MIT, agrees that early on Weiss was a “happy gadgeteer” who “wasn’t likely to get tenure unless he did something that did something.”
The echo of how he has lived some of his life is provided by one of his protégés, David Shoemaker.
Shoemaker adds that Weiss’s foremost quality is empathy. A college dropout, Shoemaker credits Weiss with getting him into graduate school at MIT without an undergraduate degree. “He sought ways to bring out the best in me,” Shoemaker says. “He also took a rather irregular path, and I think because of that and just his nature, he is really interested in helping people.”
Now, none of this is too surprising. Science and any serious intellectual or cultural endeavour is a way of constructively catching dissent. And dissent clusters: it is not uniform across society, but found on the fringes or boundaries of good sense. But we are no longer focussed on diversity or providing a garden for play. Instead, we are obsessed with homogeneity and forcing all to the mean.
Blake got it right:
The Enquiry in England is not whether a Man has Talents & Genius, But whether he is Passive & Polite & a Virtuous Ass & obedient to Noblemen’s Opinions in Art & Science. If he is, he is a Good Man. If not he must be Starved.
A comment in Science
A well-stated hypothesis describes a state of nature. It is either true or not true, not subject to probability. The phrase “probability the hypothesis is true” is meaningless. One can only say, “likelihood that the observed data came from a population characterized by the hypothesis.”
I only post because I seem to spend my life trying to argue that the dismal null hypothesis is a tool for doing one type of statistics, and has a limited role in science. It has little to do with what we mean by a scientific hypothesis. There are not an infinite number of scientific hypotheses: there is not a probability distribution in the way we use this term in statistics.
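The distinction the Science comment draws is easy to make concrete. A likelihood attaches a number to the observed data given a hypothesis, never to the hypothesis itself. A toy sketch in Python (the coin-tossing numbers are mine, purely illustrative):

```python
from math import comb

def likelihood(k: int, n: int, p: float) -> float:
    """P(observing k successes in n trials | a population with success rate p).

    This is a statement about the data under a hypothesis,
    not a 'probability that the hypothesis is true'.
    """
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 7 heads in 10 tosses: each hypothesised coin assigns the data a likelihood.
l_fair = likelihood(7, 10, 0.5)    # ~0.117
l_biased = likelihood(7, 10, 0.7)  # ~0.267
```

Neither number is a probability that the coin ‘is fair’; they only rank how well each hypothesised population accounts for what was actually observed.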
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
The (medical) future is here, just unevenly distributed
The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist
“This is a terrific essay. The keystone of science’s power and the continued survival of a civilisation based on — and at the mercy of — science, is contained in the following:
‘As Jacob Bronowski (1956) said – in science truth is all-of-a-piece: either we are truthful always and about everything; or else the dishonesty ramifies, the rot spreads, and rapidly we are being honest about nothing.’
External audit, as we have seen over the last quarter century in many human domains, does not work. All too often it is merely a tool for rendering deceit invisible. Integrity is not a bolt on for our survival, but a bit of our biological machinery that is struggling against the loss of the ‘personal’.
If we look back to the writings of Merton, Lewis Thomas, Peter Medawar, John Ziman, and the like, it is clear we lack a coherent and deep view of what has happened to modern science and — because science is integral to the modern world — our civilisation. This essay sets the tone for what must follow.”
Derek Bok states that some of those who were found guilty of criminal acts in the recent waves of corporate malfeasance in the US scored very well on their ethics modules at Harvard. It is easy (and facile) to imagine that somehow doing a ‘course’ on a particular topic will produce a change in behaviour that is permanent and withstands countervailing forces (culture eats strategy, and culture eats morality etc, I hear you say). Those in universities should of course know better — producing changes in behaviour in response to an environmental stimulus is a paraphrase of one definition of learning. But the message doesn’t get through, largely because the academy has increasingly chosen to turn its professional tools away from examination of its own purpose. It is deemed rude to ask for evidence when everybody knows the sun goes round the earth.
Nor, if we are to believe Timothy Wilson, should we go in with the ‘null’ hypothesis that courses wishing to eradicate ‘isms’ may only be beneficial. The evidence points in a different direction: they make some people’s behaviour worse. I sometimes wonder if anybody is really too worried about whether these interventions work — they just want to tick boxes to comply with yet more rituals of verification (to use Michael Power’s phrase from the Audit Society).
Anyway these ramblings were by way of introduction for what is for me one of the clearest expositions of morality and the human condition. I have no idea why I cannot keep it out of my mind but maybe putting it down in writing might help. It comes from a short article by Jacob Bronowski, in a posthumous collection of his essays, ‘A sense of the future’. The article is “A moral for an age of plenty” and it includes an account of the death of the physicist Louis Slotin.
Louis Slotin was a physicist in his mid thirties, working at Los Alamos in 1946. Bronowski described him so: ‘Slotin was good with his hands; he liked using his head; he was bright and a little daring — in short, he was like any other man anywhere who is happy in his work’. Just so.
Slotin was moving bits of plutonium closer together, but for obvious reasons, not too close. And as experts are tempted to do, he was using a screwdriver. His hand slipped. The monitors went through the roof. He immediately pulled the pieces of plutonium apart, and asked everybody to mark their precise positions at the time of the accident. The former meant he would die (9 days later, as it turned out); the latter allowed him to prognosticate on what would happen to the others (they survived).
There are two things that make up morality. One is the sense that other people matter: the sense of common loyalty…. The other is a clear judgement of what is at stake: a cold knowledge, without a trace of deception, of precisely what will happen to oneself and to others if one plays the hero or the coward. This is the highest morality: to combine human love with an unflinching, a scientific judgement.
I actually think we are more lacking in the second than the first. Worse still, we are less tolerant of evidence than we once were: we prefer to wallow smugly in our self-congratulatory goodness. We have been here before. Medicine only became useful when physicians learned this lesson.
[ And yes, people remarked that Slotin hadn’t followed protocol…]
A nice story in Nature about two giants: Jerome Bruner and the Turing award winner, Alan Kay.
Jerry made seminal contributions to an astonishing number of fields — each a stop on the road to finding out what makes us human. Beginning in the 1960s, computer simulations became the model of the human mind in cognitive psychology, with researchers trying to simulate how humans solve problems, form concepts, comprehend language and learn. But reducing humans to computers was antithetical to Jerry’s humanistic perspective.
Given this, it was surprising that computer scientist Alan Kay, the designer of what became the Macintosh graphical user interface, turned up more than 30 years ago on Bruner’s Manhattan doorstep with a gift of a Macintosh computer. Jerry’s ideas of representing information through actions, icons and symbols, central to his theory of cognitive development, had inspired Kay to get users (even children) to act (through a computer mouse) on icons, enabling the use of an abstract set of symbols (computer program). This was the foundation for what became the Macintosh interface.
One other line in the obituary by Patricia Marks Greenfield stood out:
In 1972, Bruner sailed his boat across the Atlantic to take up the first Watts Professorship of Psychology at the University of Oxford, UK.
I guess the removal expenses were as stingy then as now.
“I have never kept count of the many inventions I made but it must run into the hundreds. Most of them were trivial, such as a wax pencil that would write clearly on cold wet glassware straight from a refrigerator. It was published as one of my first letters to Nature in 1945.”
“A Rough Ride to the Future” by James Lovelock. Blake said it: it is all about ‘minute particulars’. Of a piece.
No, not ‘up North’, but a neat way to check whether people have been sloppy or dishonest. The following from the Economist
The GRIM test, short for granularity-related inconsistency of means, is a simple way of checking whether the results of small studies of the sort beloved of psychologists (those with fewer than 100 participants) could be correct, even in principle.
Full PeerJ reprint here.
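The arithmetic behind this is simple enough to sketch in a few lines of Python (a hedged illustration; the function name and defaults are mine, not from the PeerJ paper). A mean of n integer-valued responses can only take values t/n for some whole-number total t, so a reported mean either matches one of those values, to the reported precision, or it cannot be right:

```python
import math

def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Could `reported_mean`, rounded to `decimals` places, arise from
    n integer-valued responses? The only achievable means are t/n for
    whole-number totals t, so test the totals bracketing mean * n."""
    for total in (math.floor(reported_mean * n), math.ceil(reported_mean * n)):
        if round(total / n, decimals) == round(reported_mean, decimals):
            return True
    return False

# With n = 28 integer responses, no whole-number total yields a mean of 5.19:
# 145/28 rounds to 5.18 and 146/28 to 5.21, so the reported mean fails the check.
```

Real use needs care with rounding conventions (half-up versus banker’s rounding) and with scales that are not integer-valued, which is why this is a sketch rather than the published procedure.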
Daniel Sarewitz in Nature
The quality problem has been widely recognized in cancer science, in which many cell lines used for research turn out to be contaminated. For example, a breast-cancer cell line used in more than 1,000 published studies actually turned out to have been a melanoma cell line. The average biomedical research paper gets cited between 10 and 20 times in 5 years, and as many as one-third of all cell lines used in research are thought to be contaminated, so the arithmetic is easy enough to do: by one estimate, 10,000 published papers a year cite work based on contaminated cancer cell lines. Metastasis has spread to the cancer literature……..That problem is likely to be worse in policy-relevant fields such as nutrition, education, epidemiology and economics, in which the science is often uncertain and the societal stakes can be high
See this great piece by Bruce Charlton
Professional science has arrived at this state in which the typical researcher feels free to indulge in unrestrained careerism, while blandly assuming that the ‘systems’ of science will somehow transmute the dross of his own contribution into the gold of truth. It does not: hence the preponderance of irreproducible publications.
Two articles about Sci-hub (here and here). No, I am not encouraging illegal downloading. But I hope we can look back in a few years with shame at the way journals, their publishers and those who have a vested interest in the mismeasure of science have hindered educational advance, and wasted public money. Some specialty journals do indeed pour money back into their subject, but it is a minority. All too often medical journals are a way of making money for publishers and specialist societies. There will be an iTunes moment (I hope).
I am not certain when I learned a little about sign language, probably from Steven Pinker’s ‘The language instinct’. But it is absolutely fascinating, and its study — it seems to me — is yet another one of the almost endless arguments for letting academics play: the world in a grain of sand. There is a short article in this week’s Science:
The use of new parts also makes language more efficient: The youngest ISL signers can express themselves much faster than the oldest—153.2 signs per minute compared with 103.5 signs per minute.
The findings also show that social interaction is essential for language evolution. When a new generation establishes a system for signing, Sandler says, it stays more or less the same as its members age. Her work has shown that when young signers enter a community, they add complexity through experimentation with their peers in what she calls “a social game.” The more players, the more innovations.
You could say the same for science in general, or any branch of human culture. Which is why many of us worry.
Here is number 3. My favourite bit of skin biology.
Here is video number 2 in the ‘clinical’ skin biology series.
I have been busy producing and updating some videos. Here is the first in a series on skin biology.
Some Neanderthals would — based on MC1R sequence — be expected to have red hair. What has always caused me confusion is the way that dates for everything to do with human paleohistory, and the various representations of our evolution, are revised based on n of 1 publications. No doubt the story will get easier, but I think silence for a while on the ‘greatest story ever told’ would be in order. At least from me.
Note added: And then…..
Venki Ramakrishnan was on the radio the other day. I cannot remember his exact words but they were something to the effect that he wanted ‘not to generate lots of data, but instead, lots of understanding’. Says it all.
It’s no secret that therapies that look promising in mice rarely work in people. But too often, experimental treatments that succeed in one mouse population do not even work in other mice, suggesting that many rodent studies may be flawed from the start. Nature
As it says, ‘no secret’. Science is usually self-correcting, but the time period may vary. What has always puzzled me is how, in the areas of biology I know something about, mouse work has been so informative; whereas in others, all it seems to be good for is publication in high-impact journals. For those interested in pigmentation, mice have been wonderfully informative, whereas for those other bits of skin biology I am familiar with (ahem), like inflammation, mice have been less helpful. A part of me wonders whether some of this is due to whether you are trying to identify potential pathways, or whether you are trying to build interventions based on particular pathways. And finally, lest there be any confusion, I am not one of those who believes we haven’t learned a lot from animals.
Remember those compare and contrast questions (UC versus Crohns; DLE versus LP etc.). Well, look at these two quotes from articles in the same edition of Nature.
The first from the tsunami of papers showing that ‘Something is rotten in the state of Denmark Science’ — essentially that the Mertonian norms for science have been well and truly trampled over.
Journals charge authors to correct others’ mistakes. For one article that we believed contained an invalidating error, our options were to post a comment in an online commenting system or pay a ‘discounted’ submission fee of US$1,716. With another journal from the same publisher, the fee was £1,470 (US$2,100) to publish a letter. Letters from the journal advised that “we are unable to take editorial considerations into account when assessing waiver requests, only the author’s documented ability to pay”.
Discrete Analysis’ [the journal] costs are only $10 per submitted paper, says Gowers; money required to make use of Scholastica, software that was developed at the University of Chicago in Illinois for managing peer review and for setting up journal websites. (The journal also relies on the continued existence of arXiv, whose running costs amount to less than $10 per paper). A grant from the University of Cambridge will cover the cost of the first 500 or so submissions, after which Gowers hopes to find additional funding or ask researchers for a submission fee.
Well done the Universities of Cambridge and Cornell (arXiv). For science, the way forward is clear. But for much clinical medicine, including much of my own field, we need to break down the barriers between publication and posting online information that others may find useful. This cannot happen until the financial costs approximate to zero.
From Alan Kay. If it comes from a Turing award winner, maybe people might take notice. Perhaps not.
Patients are willing to pay, and pay dearly: the HeartSheet treatment costs nearly ¥15 million (US$122,000). Last month, the health ministry added it to the procedures covered by national health insurance, which will help. But patients still pay 10–30% of the cost for a drug that is not known to be effective. As they do so, they basically subsidize the company’s clinical trial.
Japan has turned the drug-discovery model on its head. Usually, the investment — and thus the risk — is borne by drug companies, because they stand to gain in the long run. Now the risk is being outsourced. By the time it is clear whether a treatment works or not, the companies will have already made revenue from it.
‘Now Weinberg has added another credential to his crowded vita: historian of science. In his past writings, he had mainly concerned himself with the modern era of physics and astronomy, from the late nineteenth century to the present—a time, he says, when “the goals and standards of physical science have not materially changed.” Yet to appreciate how those goals and standards took shape, he realized he would have to dig deeper into the history of science. So, “as is natural for an academic,” he volunteered to teach a course on the subject—in this case, to undergraduates with no special background in science or mathematics. Then he immersed himself in the primary and secondary literature. The result is To Explain the World, which takes us all the way from the first glimmerings of science in ancient Greece, through the medieval world, both Christian and Islamic, and down to the Newtonian revolution and beyond.’
In a review by Jim Holt of ‘To Explain the World: The Discovery of Modern Science’, by Steven Weinberg.
The problem is that this is no longer natural or even encouraged of an academic. And if the writings of Weinberg I have read are anything to go by, this course must have been something special. I can remember the late John Ziman telling me that having been appointed to a lectureship in physics at Cambridge, he realised that there was no suitable text for his Cambridge undergraduates in the area that interested him. So, he spent two years writing such a text (which sold well for many years, he added). He observed that no longer would a UK university consider such behaviour appropriate: what about the REF! This tells us something about great thinkers, deep domain expertise, and how explanatory ability is the crux of great teaching. And about universities, and their troubled relation with teaching — and academics.