Research

Most lawyers don’t make anything except hours.

by reestheskin on 17/10/2017


This was a quote from an article by an ex-lawyer who got into tech and writing about tech. Now some of my best friends are lawyers, but it chimed with something I came across via Benedict Evans on ‘why you must pay sales people commissions’. The article is here (the video no longer plays for me).

The opening quote poses a question:

I felt a little odd writing that title [why you must pay sales people commissions]. It’s a little like asking “Why should you give engineers big monitors?” If you have to ask the question, then you probably won’t understand the answer. The short answer is: don’t, if you don’t want good engineers to work for you; and if they still do, they’ll be less productive. The same is true for sales people and commissions.

The argument is as follows:

Imagine that you are a great sales person who knows you can sell $10M worth of product in a year. Company A pays commissions and, if you do what you know you can do, you will earn $1M/year. Company B refuses to pay commissions for “cultural reasons” and offers $200K/year. Which job would you take? Now imagine that you are a horrible sales person who would be lucky to sell anything and will get fired in a performance-based commission culture, but may survive in a low-pressure, non-commission culture. Which job would you take?

But the key message for me is:

Speaking of culture, why should the sales culture be different from the engineering culture? To understand that, ask yourself the following: Do your engineers like programming? Might they even do a little programming on the side sometimes for fun? Great. I guarantee your sales people never sell enterprise software for fun. [emphasis mine].

Now why does all this matter? Well personally, it still matters a bit, but it matters less and less. I am towards the end of my career, and for the most part I have loved what I have done. Sure, the NHS is increasingly a nightmare place to work, but it has been in decline most of my life: I would not recommend it unreservedly to anybody. But I have loved my work in a university. Research was so much fun for so long, and the ability to think about how we teach and how we should teach still gives me enormous pleasure: it is, to use the cliche, still what I think about in the shower. The very idea of work-life balance was — when I was young and middle-aged at least — anathema. I viewed my job as a creative one, and building things and making things brought great pleasure. This did not mean that you had to work all the hours God made, although I often did. But it did mean that work brought so much pleasure that the boundary between my inner life and what I got paid to do was more apparent to others than to me. And in large part that is still true.

Now in one sense, this whole question matters less and less to me personally. In the clinical area, many if not most clinicians I know now feel that they resemble those on commission more than the engineers. Only they don’t get commission. Most of my med school year who became GPs will have bailed out. And I do not envy the working lives of those who follow me in many other medical specialties in hospital. Similarly, universities were once full of academics who you almost didn’t need to pay, such was their love for the job. But modern universities have become more closed and centrally managed, and less tolerant of independence of mind.

In one sense, this might go with the turf — I was 60 last week. Some introspection, perhaps. But I think there really is more going on. I think we will see more and more people bailing out as early as possible (no personal plans, here), and we will need to think and plan for the fact that many of our students will bail out of the front line of medical practice earlier than we are used to. I think you see the early stirrings of this all over: people want to work less than full-time; people limit their NHS work vis-à-vis private work; some seek administrative roles in order to minimise their face-to-face practice; and even young medics soon after graduation are looking for portfolio careers. And we need to think about how to educate our graduates for this: our obligations are to our students first and foremost.

I do not think any of these responses are necessarily bad. But working primarily in higher education has one advantage: there are lots of different institutions, and whilst in the UK there is a large degree of groupthink, there is still some diversity of approach. And if you are smart and you fall outwith the clinical guilds / extortion rackets, there is no reason to stay in the UK. Medics, and recent graduates especially, need to think more strategically. The central dilemma is that, depending on your specialty, your only choice might appear to be to work for a monopolist, one which seeks to control not so much the patients, cradle-to-grave, but those staff who fall under its spell, cradle-to-grave. But there are those making other choices — just not enough, so far.

An aside. Of course, even those who have achieved the most in research do not always want to work for nothing, post retirement. I heard the following account first hand from one of Fred Sanger’s former post-docs. The onetime post-doc was now a senior professor, charged with opening and celebrating a new research institution. Sanger — a double Laureate — would be a great catch as a speaker. All seemed well until the man who personally created much of modern biology realised the date chosen was a couple of days after he was due to retire from the LMB. He could not oblige: “the [garden] roses need me more!”

More Big data, rather than Big ideas

by reestheskin on 29/09/2017


No, I could not run, lead or guide a pharma company. Tough business. But I just hope their executives do not really believe most of what they say. So, in an FT article, the new CEO of Novartis, Vas Narasimhan, has vowed to slash drug development costs: there will be a productivity revolution; 10-25 per cent could be cut from the cost of trials if digital technology were used to carry them out more efficiently.

And of course:

(he) plans to partner with, or acquire, artificial intelligence and data analytics companies, to supplement Novartis’s strong but “scattered” data science capability.

I really think of our future as a medicines and data science company, centred on innovation and access (read that again, parenthesis mine)

And to add insult to injury:

Dr Narasimhan cites one inspiration as a visit to Disney World with his young children where he saw how efficiently people were moved around the park, constantly monitored by “an army of Massachusetts Institute of Technology-trained data scientists”.

And not that I am a lover of Excel…

No longer rely on Excel spreadsheets and PowerPoint slides, but instead “bring up a screen that has a predictive algorithm that in real time is recalculating what is the likelihood our trials enrol, what is the quality of our clinical trials”

Just recall that in 2000 it would have been genes / genetics / genomics rather than data / analytics / AI / ML etc.

So, looks to me like lots of cost cutting and optimisation. No place for a James Black, then.

Minute particulars

by reestheskin on 30/08/2017


This is from an article in Nature. The problem is resolving differences in experimental results between labs.

But subtle disparities were endless. In one particularly painful teleconference, we spent an hour debating the proper procedure for picking up worms and placing them on new agar plates. Some batches of worms lived a full day longer with gentler technicians. Because a worm’s lifespan is only about 20 days, this is a big deal. Hundreds of e-mails and many teleconferences later, we converged on a technique but still had a stupendous three-day difference in lifespan between labs. The problem, it turned out, was notation — one lab determined age on the basis of when an egg hatched, others on when it was laid.

Now my title is from Blake:

He who would do good to another must do it in Minute Particulars: general Good is the plea of the scoundrel, hypocrite, and flatterer, for Art and Science cannot exist but in minutely organized Particulars.

And yet, I think I am using the quote in a way he would have strongly disagreed with. Some of the time ‘Minute particulars’ are not the place to be if you want to change the world. Especially in biology.

It’s science, Jim, but not as we (now) know it.

by reestheskin on 28/08/2017


If you are interested in the history of science, or, put another way, in how science can work when it is allowed to, check out this wonderful site: ‘Molecular Tinkering: How Edinburgh changed the face of molecular biology’.

I first became aware of ‘Edinburgh’ and its role in molecular biology by accident. In the late 1970s I spent three months here doing a psychiatry elective, on the unit of Prof Bob Kendell. I stayed in self-catering University accommodation through the summer, knowing nobody, spending most of my free time tramping around the city on foot. Not always bored. There was a motley crew of people staying in the small hall, and one was a biologist from China. I had never met anybody from this far away and mysterious place. Of course, mostly I asked about barefoot doctors and the like, questions that you would now use your mobile phone to answer. No Alexa then. But I wondered why he was here. Molecular biology, he explained. Now, I could recite the story Jim Watson told about Watson and Crick, and even mention a few names who followed. But no more than a handful. ‘Recombinant DNA’ might have passed my lips, but please do not ask me to explain it. But from him, I learned something special had awoken. And for me it was literally a couple more years before the waves of this second revolution broke on the shores of MRC Clinical Training Fellowship applications.

A sample:

Throughout the 20th century film editors worked by identifying key scenes in their film reels, snipping precise frames and re-joining them to create a new narrative. This is exactly how the Murrays wanted to work with the long molecules of DNA. Restriction enzymes would be the editor’s scissors, making the cuts. As well as doing the cutting, the enzymes would be the editor’s eyes: recognising the precise frame at which to snip.

Or if you ever doubt that the space for discovery is infinite, listen to this from Noreen Murray:

“Looking back, it is amazing that Ken and I were the only people in the UK in the late 60s and early 70s to notice the potential of restriction enzymes.”

Or this, for a definition of what research leadership is all about:

Waddington had a real knack for selecting and recruiting interesting but often unproven biologists.

Science works! Rewrite the textbooks

by reestheskin on 01/08/2017


Little evidence is found for higher-order organization into 30- or 120-nm fibers, as would be expected from the classic textbook models based on in vitro visualization of non-native chromatin.

Well, chromatin structure might not be everybody’s cup of tea but I once shared an ‘office’ with a couple of French / Polish researchers in Strasbourg. It was all above my head, so I had to make do with the textbooks, and I stuck to my simple cloning of upstream regulatory regions of a retinoid receptor. Now, it appears from this article in Science, the textbooks will need rewriting. Science works.

RCTs and the kudos of stacking shelves

by reestheskin on 31/07/2017


Many, many years ago I wrote a few papers about — amongst other things — the statistical naivety of the EBM gang. I enjoyed writing them but I doubt they changed things very much. EBM, as Bruce Charlton pointed out many years ago, has many of the characteristics of a cult (or was it a zombie? — you cannot kill it because it is already dead). Anyway, one of the reasons I disliked a lot of EBM advocates was that I think they do not understand what RCTs are, and of course they are often indifferent to science. Now, in one sense, these two topics are not linked. Science is meant to be about producing broad ranging theories that both predict how the world works and explain what goes on. Sure, there may be lots of detail on the way, but that is why our understanding of DNA and genetics today is so different from that of 30 years ago.

By contrast, RCTs are usually a form of A/B testing. Vital, in many instances, but an activity that is often a terminal side road rather than a crossroads on the path to understanding how the world works. That is not to say they are not important, nor worthy of serious intellectual endeavour. But the latter activity is for those who are capable of thinking hard about statistics and design. Instead, the current academic space makes running or enrolling people in RCTs some sort of intellectual activity: it isn’t; rather, it is a part of professional practice, just as seeing patients is. Companies used to do it all themselves many decades ago, and they didn’t expect to get financial rewards from the RAE/REF for this sort of thing. There are optimal ways to stack shelves that maths geeks get excited about, but those who do the stacking do not share in the kudos — as in the cudos [1] — of discovery.
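To make the point concrete, here is a minimal sketch, with invented numbers, of a two-arm trial analysed exactly as an A/B test: compare two event rates and ask whether the difference is bigger than chance. The figures, and the use of numpy and scipy, are my own illustration.

```python
import numpy as np
from scipy.stats import norm

# Toy two-arm comparison: the A/B-testing skeleton of a simple RCT.
# All numbers are invented for illustration.
n_a, x_a = 500, 60    # control arm: participants, events
n_b, x_b = 500, 40    # treatment arm: participants, events

p_a, p_b = x_a / n_a, x_b / n_b
p_pool = (x_a + x_b) / (n_a + n_b)                        # pooled event rate
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # SE under the null
z = (p_a - p_b) / se
p_value = 2 * norm.sf(abs(z))                             # two-sided p-value
print(f"risk difference {p_a - p_b:.3f}, p = {p_value:.3f}")
```

Useful, certainly; but note how little of the world it explains.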

Anyway, this is by way of highlighting a post I came across by Frank Harrell, with the title:

Randomized Clinical Trials Do Not Mimic Clinical Practice, Thank Goodness

Harrell is the author of one of those classic books… But I think the post speaks to something basic. RCTs are not facsimiles of clinical practice, but some sort of bioassay to guide what might go on in the clinic. Metaphors if you will, but acts of persuasion, not brittle mandates. This all leaves aside worthy debates on the corruption that has overtaken many areas of clinical measurement, but others can speak better to that than me.

[1] I really couldn’t resist.

A .. deep thinking, highly rigorous, no-BS [bullshit] research scientist..

by reestheskin on 26/07/2017


Hours later, with Blackburn’s approval, the institute issued comments on the scientific records of the two women. It had “invested millions of dollars” in each scientist, Salk stated, but a “rigorous analysis” showed each “consistently ranking below her peers in producing high quality research and attracting” grants. Neither has published in Cell, Nature, or Science in the last 10 years, it said. Lundblad’s salary “is well above the median for Salk full professors ($250,000) … yet her performance has long remained within the bottom quartile of her peers.” The institute wrote that Jones’s salary, in the low $200,000 range, “aligns” with salaries at top universities, although she “has long remained within the bottom quartile of her peers.”

This is from an article in Science (Gender discrimination lawsuit at Salk ignites controversy). The context is a sex discrimination case, but the account is about an astonishing lack of vision. The short-termism of stock markets is not the only way value is being destroyed by a cash-in-hand mentality. The best rugby coaches have rarely been the greatest players. Nobel laureates may not be the best leaders. No (bull) shit here…

All about shifting units

by reestheskin on 05/07/2017


Of all the titles submitted to the 2014 research excellence framework, only “around a half in most subjects achieved at least one retail sale in the UK in the years 2008-14”. 

Worst sellers: warning of existential crisis for academic books | THE News

Consent for whom?

by reestheskin on 09/06/2017


Interesting piece in today’s NEJM on data sharing and clinical trials and how meaningfully patients are involved.

Patients also said they wanted trial results to be shared with participants themselves, along with an explanation of what the results mean for them — something that generally doesn’t happen now. In addition, most participants said they had entered a trial not to advance knowledge, but to obtain what their doctor thought was the best treatment option for them. All of which raises some questions about how well patients understand consent forms.

Which reminded me of a powerful paper by the late David Horrobin in the Lancet in which, from the position of being a patient with a terminal illness, he challenged what many say:

The idea that altruism is an important consideration for most patients with cancer is a figment of the ethicist’s and statistician’s imagination. Of course, many people, when confronted with the certainty of their own death, will say that even if they cannot live they would like to contribute to understanding so that others will have a chance of survival. But the idea that this is a really important issue for most patients is nonsense. What they want is to survive, and to do that they want the best treatment.

I have always been suspicious of the ‘equipoise’ argument and terrified when I see participation rates in clinical trials as an NHS performance measure. It is bad enough that doctors might end up acting as agents of the state. But this is worse than shilling for pharma.

The NEJM piece also draws attention to people’s reluctance to share with commercial entities. What this tells you is that many people view some corporations — pharma, in this instance — as pirates. Or worse. This topic is not going away. Nor is the need for (commercial) pharma to finance and develop new drugs.


An age of optimisation rather than optimism for a different future

by reestheskin on 08/06/2017


Institutions with histories matter. It is just that innovation often comes from the periphery. I think this is true in many fields: science, music, even medical education. It is not always this way, but often enough to make me suspicious of the ‘centre’. The centre of course gets to write the history books.

An article by Mark Mazower in the NYRB, praising Richard Evans, the historian of the Third Reich, caught my attention. It seems that nobody in the centre was too excited about understanding the event that changed much of the world forever. Mazower writes:

If you wanted to do research on Saint Anselm or Cromwell, there were numerous supervisors to choose from at leading universities; if you wanted to write about Erich Ludendorff or Hitler, there was almost no one. The study of modern Europe was a backwater, dominated by historians with good wartime records and helpful Whitehall connections—old Bletchley Park hands and former intelligence officials, some of whom had broken off university careers to take part in the war and then returned.

Forward-looking, encouraging of the social sciences, open to international scholarship from the moment of its establishment, St. Antony’s is the college famously written off by the snobbish Roddy Martindale in John le Carré’s Tinker, Tailor, Soldier, Spy as “redbrick.” The truth is that it was indeed the redbrick universities, the creations of the 1950s and 1960s, that gave Evans and others their chance and shaped historical consciousness as a result. The Evans generation, if we can call them that, men (and only a very few women) born between 1943 and 1950, came mostly from the English provinces and usually got their first jobs in the provinces, too.

It is interesting how academics who had had career breaks were important. And how you often will need new institutions to change accepted practice. All those boffins whose careers were interrupted by the war led to the flowering of invention we saw after the second world war. You have to continually recreate new types of ivory towers. But I see little of this today. Instead, we live in an age of optimisation, rather than of optimism that things can be different. The future is being captured by the present more than it once was. At least in much of the academy.

The exponential (sic) phase

by reestheskin on 26/05/2017


During the 7-year period between the introduction of tacrolimus in preclinical studies in 1987 and the FDA approval of tacrolimus in 1994, the transplant program at the University of Pittsburgh produced one peer-reviewed article every 2.7 days, while transplanting an organ every 14.2 hours.

Always thought these surgeons needed to spend more time in theatre.

Science. Obituary of Thomas Starzl.

Disturbing the universe. How to do it.

by reestheskin on 17/05/2017


Alan Kay on discovery and invention: the scientists find the problems not the funders. Here. (Via Benedict Evans)

It’s hard to get a fix

by reestheskin on 03/05/2017


I always recommend that people read David Healy’s Psychopharmacology 1, 2, and 3, together with Jack Scannell’s articles (here and here), to get a feel for exactly what drug discovery means. What is beyond doubt is that we are not as efficient at it as we once were. There is lots of blame to go around. The following gives a flavour of some of the issues (or at least one take on the core issues).

From a review by Christopher-Paul Milne in ‘Health Affairs’ of A Prescription For Change: The Looming Crisis In Drug Development, by Michael S. Kinch (Chapel Hill, NC: University of North Carolina Press, 2016).

He chronicles these industries’ long, strange trip from being the darling of the investor world and beneficiary of munificent government funding to standing on the brink of extinction, and he details the “slow-motion dismantlement” of their R&D capacity with cold, hard numbers because “the data will lead us to the truth.” There are many smaller truths, too: Overall, National Institutes of Health (NIH) funding has fallen by 25 percent in relative terms since a funding surge ended in 2003; venture capital is no longer willing to invest in product cycles that are eleven or twelve years long; and biotech companies may have to pay licensing fees on as many as forty patents for a decade before they even get to the point of animal testing and human trials….

In an effort to survive in such a costly and competitive environment, pharmaceutical companies have shed their high-maintenance R&D infrastructure, maintaining their pipelines instead by acquiring smaller (mostly biotech) companies, focusing on the less expensive development of me-too drugs, and buying the rights to promising products in late-stage development. As a consequence, biotech companies are disappearing (down from a peak of 140 in 2000 to about 60 in 2017), and the survivors must expend an increasing proportion of their resources on animal and human testing instead of the more innovative (and less costly) identification of promising leads and platform technologies. Similarly, some of academia’s R&D capacity, overbuilt in response to the NIH funding surge, now lies fallow, while seasoned experts and their promising protégés have moved on to other fields.

Big ideas rather than big data, please

by reestheskin on 28/04/2017


With many powerful academicians, lobbyists, professional societies, funding agencies, and perhaps even regulators shifting away from trials to observational data, even for licensing purposes, clinical medicine may be marching headlong to a massive suicide of its scientific evidence basis. We may experience a return to the 18th century, before the first controlled trial on scurvy. Yet, there is also a major difference compared with the 18th century: now we have more observational data, which means mostly that we can have many more misleading results.

John P.A. Ioannidis

I think the situation is even worse. Indeed, we can only grasp the nature of reality with action, not with contemplation (pace Ioannidis). But experiments (sic), as in RCTs, are also part of the problem: we only understand the world by testing ideas that appear to bring coherency to the natural world. A/B testing is inadequate for this task — although it may well be all we have left.

Why science is important

by reestheskin on 18/04/2017


Q: What’s at stake when scientists fib?

A: Science is the last institution where being honest is a quintessential part of what you’re doing. You can do banking and cheat, and you’ll make more money, and that money will still buy you the fast cars and the yachts. If you cheat in science, you’re not making more facts, you’re producing nonfacts, and that is not science. Science still has this chance of giving a lead to democratic societies because scientific values overlap strongly with democratic values.

Interview with Harry Collins about his book Gravity’s Kiss: The Detection of Gravitational Waves (MIT Press, 2017, 414 pp.).

Software is eating the clinic

by reestheskin on 07/04/2017


There was an interesting paper published in Nature recently on the topic of automated skin cancer diagnosis. Readers of my online work will know it is a topic close to my heart.

Here is the text of a guest editorial I wrote for Acta about the paper. Acta is a ‘legacy’ journal that made the leap to full OA under Anders Vahlquist’s supervision a few years back — it is therefore my favourite skin journal. This month’s edition is the first without a paper copy, existing only online. The link to the edited paper and references is here. I think this is the first paper in their first online-only edition :-). Software is indeed eating the world.



When I was a medical student close to graduation, Sam Shuster, then Professor of Dermatology in Newcastle, drew my attention to a paper that had just been published in Nature. The paper, from the laboratory of Robert Weinberg, described how DNA from human cancers could transform cells in culture (1). I tried reading the paper, but made little headway because the experimental methods were alien to me. Sam did better, because he could distinguish the underlying melody from the supporting orchestration. He told me that whilst there were often good papers in Nature, perhaps only once every ten years or so would you read a paper that would change both a field and the professional careers of many scientists. He was right. The paper by Weinberg was one of perhaps fewer than a dozen that defined an approach to the biology of human cancer that still resonates forty years later.

Revolutionary papers in science have one of two characteristics. They are either conceptual, offering a theory that is generative of future discovery — think DNA, and Watson and Crick. Or they are methodological, allowing what was once impossible to become almost trivial — think DNA sequencing or CRISPR technology. Revolutions in medicine are slightly different, however. Yes, of course, scientific advance changes medical practice, but to fully understand clinical medicine we need to add a third category of revolution. This third category comes from papers that change the everyday lives of what doctors do and how they work. Examples would include fibreoptic instrumentation and modern imaging technology. To date, dermatology has escaped such revolutions, but a paper recently published in Nature suggests that our time may have come (2).

The core clinical skill of the dermatologist is categorising morphological states in a way that informs prognosis with, or without, a therapeutic intervention. Dermatologists are rightly proud of these perceptual skills, although we have little insight as to how this expertise is encoded in the human brain. Nor should we be smug about our abilities as, although the domains are different, the ability to classify objects in the natural world is shared by many animals, and often appears effortless. Formal systems of education may be human-specific, but the cortical machinery that allows such learning is widespread in nature.

There have been two broad approaches to trying to imitate these skills in silico. In the first, particular properties (shape, colour, texture etc.) are explicitly identified and, much as we might add variables in a linear regression equation, the information is used to try and discriminate between lesions in an explicit way. Think of the many papers using rule-based strategies such as the ABCD system (3). This is obviously not the way the human brain works: a moment’s reflection about how fast an expert can diagnose skin cancers, and how limited we are in being able to handle formal mathematics, tells us that human perceptual skills do not work like this.
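To make the contrast concrete, here is a minimal sketch of the explicit-feature approach. The ABCD-style values, the labels, and the use of scikit-learn are my own invented illustration, not anything taken from the Nature paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Explicit-feature approach: hand-crafted ABCD-style variables fed to a
# linear classifier. Feature values and labels are invented.
# Columns: asymmetry, border irregularity, colour variation, diameter (mm)
X = np.array([[0.8, 0.6, 0.9, 7.1],
              [0.2, 0.1, 0.3, 3.0],
              [0.9, 0.7, 0.8, 6.5],
              [0.1, 0.2, 0.2, 2.5]])
y = np.array([1, 0, 1, 0])            # 1 = melanoma, 0 = benign

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[0.7, 0.5, 0.6, 5.0]]))  # [P(benign), P(melanoma)]
```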

There is an alternative approach, one that to some extent almost seems like magic. The underlying metaphor is as follows. When a young child learns to distinguish between cats and dogs, we know the language of explicit rules is not used: children cannot handle multidimensional mathematical space or complicated symbolic logic. But feedback, in terms of what the child thinks, allows the child to build up his or her own model of the two categories (cats versus dogs). With time, and with positive and negative feedback, the accuracy of the perceptual skills increases — but without any formal rules that the child could write down or share. And of course, since it is a human being we are talking about, we know all of this process takes place within and between neurons.

Computing scientists started to model the way they believed collections of neurons worked over four decades ago. In particular, it became clear that groups of in silico neurons could order the world based on positive and negative feedback. The magic is that we do not have to explicitly program their behaviour, rather they just learn, but — since this is not magic after all — we have got much better at building such self-learning machines. (I am skipping any detailed explanation of such ‘deep learning’ strategies here.) What gives this field its current immediacy is a combination of increases in computing power, previously unimaginable large data sets (for training), advances in how to encode such ‘deep learning’, and wide potential applicability — from email spam filtering, terrorist identification, and online recommendation systems, to self-driving cars. And medical imaging along the way.
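A toy illustration of the idea, and nothing more: a single artificial neuron that learns a hidden category purely from feedback on its errors. The data, the hidden rule, and the learning rate are all invented.

```python
import numpy as np

# A single artificial neuron learning a category from feedback alone:
# no explicit rules, just repeated nudges away from its errors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # the hidden 'true' category

w, b = np.zeros(2), 0.0
for _ in range(1000):
    p = 1 / (1 + np.exp(-(X @ w + b)))      # current beliefs
    err = p - y                             # feedback, positive and negative
    w -= 0.1 * X.T @ err / len(y)           # adjust weights to reduce error
    b -= 0.1 * err.mean()

p = 1 / (1 + np.exp(-(X @ w + b)))
print(f"accuracy after feedback: {((p > 0.5) == y).mean():.2f}")
```

Real ‘deep learning’ stacks many such units in layers, but the principle, learning from feedback without explicit rules, is the same.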

In the Nature paper by Thrun and colleagues (2), such ‘deep learning’ approaches were used to train computers on over 100,000 medical images of skin cancer or mimics of skin cancer. The inputs were therefore ‘pixels’ and the diagnostic category (only). If this last sentence does not shock you, you are either an expert in machine learning, or you are not paying attention. The ‘machine’ was then tested on a new sample of images and — since modesty is not a characteristic of a young science — its performance compared with that of over twenty board-certified dermatologists. If we use standard receiver operating characteristic (ROC) curves to assess performance, the machine equalled, if not out-performed, the humans.
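For readers who want the mechanics, a minimal sketch of such a comparison, with invented labels and scores (scikit-learn assumed): the machine’s probabilities trace out a full ROC curve, whereas a clinician’s yes/no call is a single operating point in ROC space.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Invented data: 1 = malignant, 0 = benign.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
machine_scores = np.array([0.9, 0.2, 0.7, 0.8, 0.4, 0.1, 0.6, 0.3])
human_calls = np.array([1, 0, 1, 0, 0, 0, 1, 1])   # binary decisions

print(roc_auc_score(y_true, machine_scores))   # machine: AUC over all thresholds
sens = (human_calls[y_true == 1] == 1).mean()  # human sensitivity
spec = (human_calls[y_true == 0] == 0).mean()  # human specificity
print(sens, spec)                              # one point in ROC space
```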

There are of course some caveats. The dermatologists were only looking at single photographic images, not the patients (4); the images are possibly not representative of the real world; and some of us would like to know more about the exact comparisons used. However, I would argue that there are also many reasons for imagining that the paper may underestimate the power of this approach: it is striking that the machine was learning from images that were relatively unstandardised and perhaps noisy in many ways. And if 100,000 seems large, it is still only a fraction of the digital images that are acquired daily in clinical practice.

It is no surprise that the authors mention the possibilities of their approach when coupled with the most ubiquitous computing device on this planet — the mobile phone. Thinking about the impact this will have on dermatology and dermatologists would require a different sort of paper from the present one but, as Marc Andreessen once said (4), ‘software is eating the world’. Dermatology will survive, but dermatologists may be on the menu.



Full paper with references on Acta is  here.

No RAE/REF. The view from Ireland

by reestheskin on 05/04/2017


‘I think we’re seeing the benefits of a good funding environment, and – to be frank – no research excellence framework’

Brexit and the Emerald Isle. Your mileage may vary. Here.

Teaching and / or research

by reestheskin on 27/02/2017


There is a good piece on Wonkhe by David Morris, dealing with the issue of how research and teaching are related, and the dearth of empirical support for any positive relation between the two. R & T are related at the highest level — some universities can do doctoral research and teaching well — and although I have little direct experience, the same can apply at Masters level. The problems arise at undergraduate level, the level at which most universities compete, and which accounts for the majority of teaching income. As ever, I think we have to think ecology, variation and the long now. What seems clear to me is that research is indeed often at the expense of teaching, and that the status quo needs to be changed if universities are to continue to attract public (and political) support. Cross-subsidies and the empty rhetoric of ‘research-led teaching’ do not address what are structural issues in Higher Ed, issues that have been getting worse, driven by poor leadership over many decades.

For many universities this is a pizza and / or pasta issue: some of us like both. Just because the two show little covariation in ecological data does not mean that they shouldn’t inform each other much better than they have over the recent past. On the other hand, scale and education are unhappy bedfellows, and staff time and attention matter. Do you really think about teaching the same way you approach research? If T & R do not covary, then are your students in the best place, and why did you admit them? Honest answers, please.

Employers outside academia place no financial value on skills or training acquired through a postdoc position, the study says.

Quoted in Nature

’Doubling boomers’ and the problem of scale

by reestheskin on 16/01/2017


From an article in Nature describing how the US biomedical workforce has changed over recent times.

Our analysis of IPUMS-USA data reveals a cohort that entered the laboratory workforce as NIH funding grew from US$13.7 billion in 1998 to $28.1 billion in 2004. These ‘doubling boomers’ arguably suffered most as funds subsequently decreased (when adjusted for inflation). In 2004, there were nearly 26,000 individuals under 40 with PhDs working as biomedical scientists. By 2011, there were nearly 36,000. Over this period, the number of faculty jobs did not increase. Indeed, the number of openings expected as a result of academics retiring has declined since 1995, when federal law made it illegal for universities to mandate retirement at age 65 (ref. 3).

…

The work environment that this cohort faces is unlike anything seen before, despite previous booms and busts (ref. 4). Today in the United States, four out of five PhD biomedical researchers work outside academia — a record high (see ‘Lab labour’). They earn, on average, almost $30,000 more a year than their academic counterparts, and feel less pressure to produce scientific publications.

There are some obvious points. First, the doubling was crazy at the time (some of us said so, then), and even more so in hindsight. Universities rushed for the gold, with little wider thought. Second, careers are the ‘long now’ and getting longer. Personal investment relies on a certain degree of continuity and stability, and there will be a hangover that the universities will now have to deal with. Finally, the obsession with growth by universities is dangerous. Haldane’s essay, ‘On Being the Right Size’, comes to mind. Scaling matters, as does thinking about long-term rather than short-term success.

Physics works!

by reestheskin on 04/01/2017


“Throughout her career, Gonzalez has done “a bit of everything” at LIGO, she says. For a while, she took on the crucial task of diagnosing the performance of the interferometers to make sure that they achieved unparalleled sensitivity — which is now enough to detect length changes in the 4-kilometre-long arms of the interferometers to within one part in 10²¹, roughly equivalent to the width of DNA compared with the orbit of Saturn.”

It ain’t biology, then. Nature
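A back-of-the-envelope check of the analogy; the figures below are rough values I have assumed, not numbers from the article.

```python
# Rough check of the LIGO analogy (all figures assumed approximations).
arm = 4e3               # interferometer arm length, metres
strain = 1e-21          # quoted sensitivity: one part in 10**21
print(arm * strain)     # detectable length change, ~4e-18 m

dna = 2e-9              # width of DNA, metres (assumed)
saturn = 1.4e12         # radius of Saturn's orbit, metres (assumed)
print(dna / saturn)     # ~1.4e-21, the same order as the strain
```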

A New Year’s resolution (to avoid)

by reestheskin on 02/01/2017


“…professors fixated on crawling along the frontiers of knowledge with a magnifying glass.”

Quoted in the Economist 10/12/2011

Mega-silliness

by reestheskin on 25/11/2016


In 1978, the distinguished professor of psychology Hans Eysenck delivered a scathing critique of what was then a new method, that of meta-analysis, which he described as “an exercise in mega-silliness.”

Matthew Page and David Moher, here, in a commentary on a paper by the ever ‘troublesome’ John Ioannidis, titled “The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses”.

To which some of us would say: this was all predictable when the EBM bandwagon jumped on the idea that collating some information, and ignoring other information, was ‘novel’. Science advances by creating and testing coherent theories of how the world works. Adding together ‘treatment effects’ is messier, and more prone to error. Just because you can enter data in a spreadsheet doesn’t mean you should.
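For the record, the ‘adding together’ usually means inverse-variance weighting, as in this minimal sketch of a fixed-effect meta-analysis. The effect sizes and standard errors are invented, and numpy is assumed.

```python
import numpy as np

# Fixed-effect meta-analysis: pool per-trial effects, weighting each
# trial by its precision (1 / SE^2). All numbers invented.
effects = np.array([0.30, 0.45, 0.10])    # per-trial effect estimates
ses = np.array([0.10, 0.20, 0.15])        # per-trial standard errors

w = 1 / ses**2                            # precision weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
print(f"pooled effect {pooled:.3f} (SE {pooled_se:.3f})")
```

The arithmetic is trivial; the argument above is about whether the inputs cohere, not whether the weights sum.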

Making new ideas

by reestheskin on 22/11/2016


Another great video of Alan Kay, explaining how intellectual revolutions occur (‘appoint people who are not amenable to management’).

Skip the post-doc?

by reestheskin on 04/11/2016


Interesting story in Nature highlighting instances where, instead of doing post-docs, young biologists have raised funding to set up their own companies. Of course, most start-ups fail, but then most really interesting research projects should fail. Y Combinator is getting into this area, which surprised me. As the age of getting your first grant gets higher, and with the increasingly dysfunctional nature of much academic (medical) science, the attractions are obvious. I was sceptical (and still am) that the ‘software’ model would work in this area.

This diagnosis is a forgery!

by reestheskin on 02/11/2016


There was a story in the FT a few weeks back (paywall). It concerned the painting ‘Portrait of a Man’, by the Dutch artist Frans Hals. Apparently, the Louvre had wanted to buy the painting some time back, but were unable to raise the funds. However, a few weeks ago, the painting was declared a “modern forgery” by Sotheby’s — trace elements of synthetic 20th-century materials have been discovered in it. The story has a wider resonance however. The FT writes:

But if anything the fake Hals merely highlights an existing problem in how we determine attribution. In their quest to confirm attributions, dealers and auction houses seek the imprimatur of independent, usually academic, experts. Often that person’s “expertise” is deduced by whether they have published anything on a particular artist. But the skills required to publish a book are different to those needed to recognise whether a painting is genuine. Many academics are also fine connoisseurs. One of the few to doubt the attribution to Parmigianino of the St Jerome allegedly connected to Ruffini was the English scholar, David Ekserdjian. But too often the market values being a published writer over having a good “eye”.

Here is a non-trivial problem: how can we designate expertise, and to what extent can you formalise it? In some domains — research, for example — it is easier than in others. But as anybody who reads Nature or the broadsheets knows, research publication is increasingly dysfunctional: partly because of the scale of modern science; partly because ‘personal knowledge’ and community have been exiled; and partly because it has become subjugated to academic accountancy, because the people running universities cannot admit that they do not possess the necessary judgment to predict the future. To use George Steiner’s tidy phrase, there is also the ‘stench of money’.

But the real danger is when the ‘research model’ is used in areas where it not only does not work, but does active harm. I wrote some time back in a paper in PLoS Medicine:

Herbert Simon, the polymath and Nobel laureate in economics, observed many years ago that medical schools resembled schools of molecular biology rather than of medicine. He drew parallels with what had happened to business schools. The art and science of design, be it of companies or health care, or even the type of design that we call engineering, lost out to the kudos of pure science. Producing an economics paper densely laden with mathematical symbols, with its patently mistaken assumptions about rational man, was a more secure way to gain tenure than studying the mess of how real people make decisions.

Many of the important problems that face us cannot be solved using the paradigm that has come to dominate institutional science (or, I fear, the structures of many universities). For many areas (think: teaching or clinical expertise), we need to think in ‘design’ mode. We are concerned more with engineering and practice than is normal in the world of science. I do not know to what extent this expertise can be formalised — it certainly isn’t going to be as easy as whether you published in ‘glossy’ or ’non-glossy’ cover journals, but reputations existed long before the digital age, and the digital age offers new opportunities. Publishing science is one skill, diagnosing is another, but there is a lot of dark matter linking the two activities. What seems certain to me is that we have got it wrong, and we are accelerating in the wrong direction.

TIJABP. Malthus, research funding, and despair. Institutions matter.

by reestheskin on 27/10/2016


TIJABP

I try to avoid writing on this topic, finding it too depressing — although not as depressing as I once did, now that I am closer to the end than the beginning. And there are signs of hope, just not where they once were.

There is an editorial in Nature titled ‘Early-career researchers need fewer burdens and more support’. It makes depressing reading. The contrast is with a talk on YouTube I listened to a few days back by the legendary computer engineer (and Turing award winner and much else) Alan Kay, in which he points out that things were really much better in the 1960s, and people at the time knew they were much better. Even within my short career, things were much better in 1990 than 2000, 2000 than 2010, and so on. When people ask me whether it is sensible to pursue a career in science, I am nervous about offering advice. Science is great. Academia, in many places, is great. But you can only do most science or academia in a particular environment, and there are few places that I would want to work in if I were starting out. And I might not get into any of them, anyway (Michael Eisen’s comment: ‘never a better time to do science, never a worse time to be a scientist’). I will share a few anecdotes.

Maybe 10-15 years ago I was talking to somebody who — with no exaggeration — I would describe as one of the UK’s leading biologists. This person described how one of their offspring was at university and had, for the first few years, not taken their studies too seriously. Then things changed, and they wondered about doing a PhD and following a ‘classical’ scientific career. The senior biologist expressed concern, worried that there was now no sensible career in science, and that, much as they had enjoyed their own career, they could no longer recommend it. There was some guilt, but your children are your children.

The second was a brief conversation with the late physicist John Ziman. I had read some of Ziman’s work — his ‘Real Science’ is for me essential reading for anybody who wants to understand what has happened to the Mertonian norms, and why science is often increasingly dysfunctional — but he shared a bit of his life history with me. When he was appointed as a lecturer in physics at Cambridge, the topic of his lectures was ‘new’ and there were no established books. So he set out to remedy the situation and spent the first two years writing such a book (still available, I think), and after that turned his attention back to physics research, and later much more (‘you have to retire to have the time to do serious work’). He commented that this would simply be impossible now.

With respect to medicine, there have been attempts for most of my life to develop schemes to encourage and support young trainees. I benefited from them, but I question whether they target the real problem. There are a number of issues.

First, the model of training of clinical academics in medicine is unusual. Universities tend to want external funders to support the research training of clinical academics (Fellowships), but that is a model with severe limitations. Nurturing talent is a core business of the universities, and they need to devote resource to it. It is their responsibility. Of course, they need to train and support academics, not just researchers. This is what career progression within academia is about: lecturer, reader, professor etc. What medical schools want to do is to offload the risk onto the person, and then only buy when the goods have been tasted. In a competitive world, where other career options are open, this might not work well. Worst of all, it funnels a large number of institutions — institutions that should show diversity of approaches — into the lowest common denominator of what is likely to be funded by the few central funders. Until you have independence of mind and action, you cut your chances of changing the world. (Yes, I hear you say, there is not enough money, but most universities need to cut back on ‘volume’.)

The second issue is whether the focus should be on schemes encouraging young people into science. I know I may sound rather curmudgeonly, but I worry that much activity relating to pursuing certain careers is reminiscent of ‘wonga-like’ business models. I think we should do better. If youngsters look at what life is like at 40, 50 and 60 or beyond, and like it, they might move in that direction. You would not need to encourage them — we are dealing with bright people. A real problem for science funding is that, for many individuals, it resembles a subsistence society, with little confidence about long-term secure funding, and little resilience against changes in political will. Just look at Brexit. I remember once hearing somebody who had considered a science career telling me that it seemed to him that most academics spent their life writing grants, and feeling uncomfortable about replacing what they wanted to do with what might be funded. Conversations about funding occupied more time than serious thinking. I listened nervously.

Finally, I take no pleasure in making the point, but I do not see any reason to imagine that things will get better over a ten or twenty year period. One of my favourite quotes from the economist Kenneth Galbraith is to the effect that the denigration of value judgement is one of the ways the scientific establishment maintains its irrelevance. I think there is a lot in that phrase. If we were to ask the question, what is more critical: understanding genetics, or understanding how institutions work, I know where my focus would be. I suspect there is more fun there too, just that much of the intellectual work might not be within academia’s walls.

TIJABP

Note: After writing this I worried that people would think that I was opposing schemes to encourage young people, or that I failed to understand that we have to treat those with new ideas differently. That was not my intention. Elsewhere I have quoted Christos Papadimitriou, and he gets my world view, too.

“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”

Intellectual friction and remix: ‘this was killing us’

by reestheskin on 17/09/2016


The goal of the new CFF [Cystic Fibrosis Foundation, a US patient charity] Therapeutics Lab, says Preston W. Campbell III, the foundation’s CEO and president, is to generate and share tools, assays, and lead compounds, boosting its partners’ chances of finding treatments. Frustration with academic technology transfer agreements was a key motivation, he notes. University-based researchers funded by the foundation have to seek approval from their institution’s legal department before sharing assays, cells, or any intellectual property, a hurdle that can take a year to negotiate. “This was killing us,” Campbell says, “but if we created our own laboratory, we could not only focus on the things we wanted to focus on, we could also share them freely.” Science

Stupid patent of the month

by reestheskin on 13/09/2016


Well you really could not make this up. From the EFF:

On August 30, 2016, the Patent Office issued U.S. Patent No. 9,430,468, titled “Online peer review and method.” The owner of this patent is none other than Elsevier, the giant academic publisher. When it first applied for the patent, Elsevier sought very broad claims that could have covered a wide range of online peer review. Fortunately, by the time the patent actually issued, its claims had been narrowed significantly. So, as a practical matter, the patent will be difficult to enforce. But we still think the patent is stupid, invalid, and an indictment of the system…

Before discussing the patent, it is worth considering why Elsevier might want a government granted monopoly on methods of peer review. Elsevier owns more than 2000 academic journals. It charges huge fees and sometimes imposes bundling requirements whereby universities that want certain high profile journals must buy a package including other publications. Universities, libraries, and researchers are increasingly questioning whether this model makes sense.

Avoid Elsevier. This is a world that should no longer exist.

The (medical) future is here, just unevenly distributed.

by reestheskin on 02/08/2016


The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist