Science

On diagnosis

by reestheskin on 19/09/2016

Comments are disabled

Abraham Verghese, an infectious disease physician, gave a talk here in Edinburgh last week. It was a very mixed audience, but I suspect the many students who were there enjoyed it. I have not read any of his books — nor looked at his TED talk — but his Wikipedia entry gives you a flavour of how interesting he is, and how varied a career can be — when you have courage.

One issue that came up tangentially was the history of diagnosis, and some opinions were ventured by the audience about when diagnosis was historically established. I may have missed key points, but I found it hard to accept that the idea of diagnosis was something you could date except in very broad terms, even less that you could associate it with the 1870s or with the idea of stethoscopes being a key marker of when modern ideas of diagnosis were established. For instance — and since the lecturer was an ID physician — my first thoughts turned to scabies. The scabies mite was identified in the 1690s, and it was recognised as the cause of the disease (I am not quoting primary sources so let me know if……). So here we have a clear linking of symptoms, signs, causality, a causal agent, and a broader theory about pathogenesis and epidemiology. All of which got me thinking about how I view the topic of diagnosis.

Diagnosis is the mapping of one state onto another, with the two states linked by a network of attributes. Diagnosis is a suitcase term: it may contain lots of different tools, tools suited to various purposes, and tools for which we may find different purposes over time. Diagnosis represents an attempt to classify the world into particular states, often with the goal of making predictions about some other state. Most of the time, we think in terms of prediction, about what might happen to that person with or without some intervention. If you see these physical signs (burrows) and the patient describes particular symptoms (itch), then the ‘state’ is scabies. If the diagnosis is correct, you can say something about what causes the state, what might happen, and what effect a particular intervention (permethrin / malathion etc) might have. If you are lucky, you can feel happy with causal arrows linking much of what you say and think. Prediction is important, but it is of course not the only quality we want in a theory. We tend to prefer some theories to others, even when they make similar predictions. Think of Copernicus. We tend to prefer one of the following, irrespective of whether both allow the same quantitative clinical predictions:

  1. Sunbathing causes skin cancer: if you increase exposure by X then incidence goes up by Y
  2. Sunbathing increases the dose of UVR; UVR is mutagenic and in particular causes very specific types of mutation; cancer is a result of the accumulation of mutations; and therefore we will see particular mutational spectra in skin cancers

Our suitcase of diagnostic concepts has changed over time, however. For instance, even in modern medicine, causality is often lacking. We may use proxy or associated factors to define particular states. We may use simple heuristics as our guide to action, even though we have little idea of which way the causal arrows are going. Think of much of psychiatry. This does not mean we are powerless, just that we are more ignorant than we would like. We are of course wedded to particular metaphysical systems.

Diagnosis might have been used, in the absence of knowledge about particular interventions, to attribute blame, as an explanation. If a patient behaved in this way or suffered some state, it was a divine punishment for some behaviour. Now, I may not agree with this world view, but this too is diagnosis. The theory may seem wrong, it may seem primitive, but then my ideas of physics are primitive too if they are applied to the world of the very small.

Galen thought in terms of the mean, and of treatment by opposites (hot treatments for cold diseases; moist treatments for drying diseases, etc). This all sounds slightly crazy to modern ears (although the dermatologists among you will point out that the latter has definite therapeutic merit within very particular skin states). Or how about the idea of therapeutic ‘signatures’? This is from Ian Hacking [1]:

Syphilis is signed by the market place where it is caught; the planet Mercury has signed the market place; the metal mercury, which bears the same name, is therefore the cure for syphilis.

As Hacking points out, this allowed Paracelsus to kill lots of people simply because he knew that mercury worked. But whatever the metaphysical system linking two states, the idea of diagnosis was firmly established. Just as with Newton, who got most things right in his physics: most of us ignore what came after — except when we use GPS.

Diagnosis was not limited to medicine. Our ancestors spent their lives making diagnoses about what to eat and what not to eat, and about what particular weather states would do to crops. Plumbers make diagnoses, as do any humans trying to make sense of an environment that is not static, and where we value intervention.

What may have been specific to medicine was our hang-ups about whether there was something special about humans, and whether the simple rules, experimentation and demonstrations of efficacy that allowed other types of human technological progress, or indeed much of everyday life, applied in the domain of disease. Successful interventions or demonstrations will have had an effect on metaphysical beliefs in the long term. And of course much of this story is tied up with the growth of that particular branch of formal knowledge we call science. 1870 is just a little late.

[1] Hacking I. The emergence of probability: A philosophical study of early ideas about probability, induction and statistical inference. Cambridge: Cambridge University Press; 1984.

[2] Jonathan Rees: Why we should let EBM rest in peace. Clinics in Dermatology (2013) 31, 806–810

Intellectual friction and remix: ‘this was killing us’

by reestheskin on 17/09/2016

Comments are disabled

The goal of the new CFF [Cystic Fibrosis Foundation, a US patient charity] Therapeutics Lab, says Preston W. Campbell III, the foundation’s CEO and president, is to generate and share tools, assays, and lead compounds, boosting its partners’ chances of finding treatments. Frustration with academic technology transfer agreements was a key motivation, he notes. University-based researchers funded by the foundation have to seek approval from their institution’s legal department before sharing assays, cells, or any intellectual property, a hurdle that can take a year to negotiate. “This was killing us,” Campbell says, “but if we created our own laboratory, we could not only focus on the things we wanted to focus on, we could also share them freely.” Science

Stupid patent of the month

by reestheskin on 13/09/2016

Comments are disabled

Well you really could not make this up. From the EFF:

On August 30, 2016, the Patent Office issued U.S. Patent No. 9,430,468, titled “Online peer review and method.” The owner of this patent is none other than Elsevier, the giant academic publisher. When it first applied for the patent, Elsevier sought very broad claims that could have covered a wide range of online peer review. Fortunately, by the time the patent actually issued, its claims had been narrowed significantly. So, as a practical matter, the patent will be difficult to enforce. But we still think the patent is stupid, invalid, and an indictment of the system….

Before discussing the patent, it is worth considering why Elsevier might want a government granted monopoly on methods of peer review. Elsevier owns more than 2000 academic journals. It charges huge fees and sometimes imposes bundling requirements whereby universities that want certain high profile journals must buy a package including other publications. Universities, libraries, and researchers are increasingly questioning whether this model makes sense.

Avoid Elsevier. This is a world that should no longer exist.

Two strikes and striving for the new

by reestheskin on 25/08/2016

Comments are disabled

Biology is short of theory compared with physics, and medicine more so. More dull trials, and less and less insight. Busyness and project management, directed by chief executives wielding Excel spreadsheets. Alfred G. Knudson has just died, and Nature’s obituary tells the story of somebody who could play at natural history and then form a majestic and testable hypothesis. The penultimate sentence reads: “[his] lack of patience for science that merely repeated the work of others kept everyone in his sphere striving for the new”.

The Enquiry in England is not whether a Man has Talents & Genius

by reestheskin on 18/08/2016

Comments are disabled

Two articles, from different areas. The first is from an interview with Paul Greengrass (he of ‘Bloody Sunday’ and the Bourne films).

“Youngsters starting out probably aren’t going to be supported and developed like I was in my early career, they’re much more likely be chewed up,” he said. “This places a greater weight on universities like Kingston, which is a breeding ground for talent, to educate kids about the importance of point of view – it’s the easiest thing to lose but the most important thing to hold on to.”

The second in Science, about a likely Nobel prize winner, Rainer Weiss.

Then, in his junior year, Weiss flunked out of school entirely. He fell for a woman he met on a ferry from Nantucket to Boston. “She taught me about folk dancing and playing the piano,” he says. Weiss followed her when she moved to Evanston, Illinois, abandoning his classes in midterm. But the affair fizzled. “I fell in love and went crazy,” he says, “and of course she couldn’t stand to be around a crazy man.” Weiss returned to MIT hoping to take his finals only to find he’d flunked out.

Weiss says he was unfazed. “People say, ‘I failed out of college! My life is over!’ Well, it’s not over. It depends on what you do with it.” He took a job as a technician in MIT’s legendary Building 20, a temporary structure erected during the war, working for Jerrold Zacharias, who studied beams of atoms and molecules with light and microwaves and developed the first commercial atomic clock. Under Zacharias’s tutelage, Weiss finished his bachelor’s degree in 1955 and earned his Ph.D. in 1962.

A later quote from the same article:

After a postdoc at Princeton University developing experimental tests of gravity under physicist Robert Dicke, Weiss returned to MIT in 1964. As a junior faculty member, he says, he published little and didn’t worry about advancing his career. MIT’s Shoemaker says Weiss probably got tenure only for his teaching—and wouldn’t get it today. Bernard Burke, an emeritus physicist at MIT, agrees that early on Weiss was a “happy gadgeteer” who “wasn’t likely to get tenure unless he did something that did something.” 

The echo of how he has lived some of his life is provided by one of his protégés, David Shoemaker:

Shoemaker adds that Weiss’s foremost quality is empathy. A college dropout, Shoemaker credits Weiss with getting him into graduate school at MIT without an undergraduate degree. “He sought ways to bring out the best in me,” Shoemaker says. “He also took a rather irregular path, and I think because of that and just his nature, he is really interested in helping people.”

Now, none of this is too surprising. Science and any serious intellectual or cultural endeavour is a way of constructively catching dissent. And dissent clusters: it is not uniform across society, but found on the fringes or boundaries of good sense. But we are no longer focussed on diversity or providing a garden for play. Instead, we are obsessed with homogeneity and forcing all to the mean.

Blake got it right:

The Enquiry in England is not whether a Man has Talents & Genius, But whether he is Passive & Polite & a Virtuous Ass & obedient to Noblemen’s Opinions in Art & Science. If he is, he is a Good Man. If not he must be Starved.

Science versus statistics

by reestheskin on 10/08/2016

Comments are disabled

A comment in Science

A well-stated hypothesis describes a state of nature. It is either true or not true, not subject to probability. The phrase “probability the hypothesis is true” is meaningless. One can only say, “likelihood that the observed data came from a population characterized by the hypothesis.”

I only post because I seem to spend my life trying to argue that the dismal null hypothesis is a tool for doing one type of statistics, and has a limited role in science. It has little to do with what we mean by a scientific hypothesis. There are not an infinite number of scientific hypotheses: there is not a probability distribution in the way we use this term in statistics.

A learning objective I approve of

by reestheskin on 08/08/2016

Comments are disabled

“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”

Reading the classics by Christos Papadimitriou. [As in maths / computing classics, so includes Euler’s paper on the Konigsberg bridges etc, but also Vannevar Bush’s ‘As we may think’.]

The (medical) future is here, just unevenly distributed.

by reestheskin on 02/08/2016

Comments are disabled

The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist

On the failure of self-regulation in science

by reestheskin on 29/07/2016

Comments are disabled

A comment which I will repost here, on a superb essay by Bruce Charlton over on the Winnower.

“This is a terrific essay. The keystone of science’s power and the continued survival of a civilisation based on — and at the mercy of — science, is contained in the following:

‘As Jacob Bronowski (1956) said – in science truth is all-of-a-piece: either we are truthful always and about everything; or else the dishonesty ramifies, the rot spreads, and rapidly we are being honest about nothing.’

External audit, as we have seen over the last quarter century in many human domains, does not work. All too often it is merely a tool for rendering deceit invisible. Integrity is not a bolt on for our survival, but a bit of our biological machinery that is struggling against the loss of the ‘personal’.

If we look back to the writings of Merton, Lewis Thomas, Peter Medawar, John Ziman, and the like, it is clear we lack a coherent and deep view of what has happened to modern science and — because science is integral to the modern world — our civilisation. This essay sets the tone for what must follow.”

Morality 101

by reestheskin on 21/07/2016

Comments are disabled

Derek Bok states that some of those found guilty of criminal acts in the recent waves of corporate malfeasance in the US scored very well on their ethics modules at Harvard. It is easy (and facile) to imagine that somehow doing a ‘course’ on a particular topic will produce a change in behaviour that is permanent and withstands countervailing forces (culture eats strategy, and culture eats morality, etc, I hear you say). Those in universities should of course know better — producing changes in behaviour in response to an environmental stimulus is a paraphrase of one definition of learning. But the message doesn’t get through, largely because the academy has increasingly chosen to turn its professional tools away from examination of its own purpose. It is deemed rude to ask for evidence when everybody knows the sun goes round the earth.

Nor, if we are to believe Timothy Wilson, should we go in with the ‘null’ hypothesis that courses wishing to eradicate ‘isms’ can only be beneficial. The evidence points in a different direction: they make some people’s behaviour worse. I sometimes wonder if anybody is really too worried about whether these interventions work — they just want to tick boxes to comply with yet more rituals of verification (to use Michael Power’s phrase from The Audit Society).

Anyway, these ramblings were by way of introduction to what is, for me, one of the clearest expositions of morality and the human condition. I have no idea why I cannot keep it out of my mind, but maybe putting it down in writing might help. It comes from a short article by Jacob Bronowski, in a posthumous collection of his essays, ‘A sense of the future’. The article is “A moral for an age of plenty” and it includes an account of the death of the physicist Louis Slotin.

Louis Slotin was a physicist in his mid-thirties, working at Los Alamos in 1946. Bronowski described him so: ‘Slotin was good with his hands; he liked using his head; he was bright and a little daring — in short, he was like any other man anywhere who is happy in his work’. Just so.

Slotin was moving bits of plutonium closer together, but for obvious reasons, not too close. And as experts are tempted to do, he was using a screwdriver. His hand slipped. The monitors went through the roof. He immediately pulled the pieces of plutonium apart, and asked everybody to mark their precise positions at the time of the accident. The former meant he would die (9 days later, as it turned out); the latter allowed him to prognosticate on what would happen to the others (they survived).

Bronowski writes:

There are two things that make up morality. One is the sense that other people matter: the sense of common loyalty….The other is a clear judgement of what is at stake: a cold knowledge, without a trace of deception, of precisely what will happen to oneself and to others if one plays the hero or the coward. This is the highest morality: to combine human love with an unflinching, a scientific judgement.

I actually think we are more lacking in the second than the first. Worse still, we are less tolerant of evidence than we once were: we prefer to wallow smugly in our self-congratulatory goodness. We have been here before. Medicine only became useful when physicians learned this lesson.

[ And yes, people remarked that Slotin hadn’t followed protocol…]

Obituary of Jerome Bruner

by reestheskin on 16/07/2016

Comments are disabled

A nice story in Nature about two giants: Jerome Bruner and the Turing award winner, Alan Kay.

Jerry made seminal contributions to an astonishing number of fields — each a stop on the road to finding out what makes us human. Beginning in the 1960s, computer simulations became the model of the human mind in cognitive psychology, with researchers trying to simulate how humans solve problems, form concepts, comprehend language and learn. But reducing humans to computers was antithetical to Jerry’s humanistic perspective.

Given this, it was surprising that computer scientist Alan Kay, the designer of what became the Macintosh graphical user interface, turned up more than 30 years ago on Bruner’s Manhattan doorstep with a gift of a Macintosh computer. Jerry’s ideas of representing information through actions, icons and symbols, central to his theory of cognitive development, had inspired Kay to get users (even children) to act (through a computer mouse) on icons, enabling the use of an abstract set of symbols (computer program). This was the foundation for what became the Macintosh interface.

One other line in the obituary by Patricia Marks Greenfield stood out:

In 1972, Bruner sailed his boat across the Atlantic to take up the first Watts Professorship of Psychology at the University of Oxford, UK. 

I guess the removal expenses were as stingy then as now.

Minute particulars

by reestheskin on 15/07/2016

Comments are disabled

“I have never kept count of the many inventions I made but it must run into the hundreds. Most of them were trivial, such as a wax pencil that would write clearly on cold wet glassware straight from a refrigerator. It was published as one of my first letters to Nature in 1945.”

From “A Rough Ride to the Future” by James Lovelock. Blake said it: it is all about ‘minute particulars’. Of a piece.

The GRIM test for honesty

by reestheskin on 19/06/2016

Comments are disabled

No, not ‘up North’, but a neat way to check whether people have been sloppy or dishonest. The following is from the Economist:

The GRIM test, short for granularity-related inconsistency of means, is a simple way of checking whether the results of small studies of the sort beloved of psychologists (those with fewer than 100 participants) could be correct, even in principle.

Full PeerJ reprint here.
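The arithmetic behind the test is simple enough to sketch. The sum of n integer-valued responses is itself an integer, so the reported mean must be k/n for some whole number k. A minimal illustration (my own, not the authors’ code; the function name and the nearest-integer convention are assumptions):

```python
def grim_consistent(reported_mean, n, decimals=2):
    """Return True if a mean reported to `decimals` places could have
    arisen from n integer-valued responses (the GRIM idea).

    Since the sum of n integers is an integer, the true mean must be
    k/n for some integer k. Take the k nearest the reported mean and
    check whether it reproduces the reported value after rounding.
    """
    k = round(reported_mean * n)
    return round(k / n, decimals) == round(reported_mean, decimals)

# A mean of 5.19 from 28 integer responses is impossible:
# 145/28 rounds to 5.18 and 146/28 to 5.21, so 5.19 cannot occur.
assert grim_consistent(5.18, 28)      # plausible
assert not grim_consistent(5.19, 28)  # GRIM-inconsistent
```

Hence the restriction to small studies: as n grows, the grid of possible means becomes fine enough that almost any rounded value is consistent, and the test loses its bite.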

 

Metastasis has spread to the cancer literature

by reestheskin on 12/05/2016

Comments are disabled

Daniel Sarewitz in Nature

The quality problem has been widely recognized in cancer science, in which many cell lines used for research turn out to be contaminated. For example, a breast-cancer cell line used in more than 1,000 published studies actually turned out to have been a melanoma cell line. The average biomedical research paper gets cited between 10 and 20 times in 5 years, and as many as one-third of all cell lines used in research are thought to be contaminated, so the arithmetic is easy enough to do: by one estimate, 10,000 published papers a year cite work based on contaminated cancer cell lines. Metastasis has spread to the cancer literature… That problem is likely to be worse in policy-relevant fields such as nutrition, education, epidemiology and economics, in which the science is often uncertain and the societal stakes can be high

See this great piece by Bruce Charlton

Professional science has arrived at this state in which the typical researcher feels free to indulge in unrestrained careerism, while blandly assuming that the ‘systems’ of science will somehow transmute the dross of his own contribution into the gold of truth. It does not: hence the preponderance of irreproducible publications.

Who’s downloading pirated papers? Everyone.

by reestheskin on 09/05/2016

Comments are disabled

Two articles about Sci-hub (here and here). No, I am not encouraging illegal downloading. But I hope we can look back in a few years with shame at the way journals, their publishers and those who have a vested interest in the mismeasure of science have hindered educational advance, and wasted public money. Some specialty journals do indeed pour money back into their subject, but it is a minority. All too often medical journals are a way of making money for publishers and specialist societies. There will be an iTunes moment (I hope).

Signing

by reestheskin on 22/04/2016

Comments are disabled

I am not certain when I learned a little about sign language, probably from Steven Pinker’s ‘The language instinct’. But it is absolutely fascinating, and its study — it seems to me — is yet another one of the almost endless arguments for letting academics play: the world in a grain of sand. There is a short article in this week’s Science:

The use of new parts also makes language more efficient: The youngest ISL signers can express themselves much faster than the oldest—153.2 signs per minute compared with 103.5 signs per minute.

The findings also show that social interaction is essential for language evolution. When a new generation establishes a system for signing, Sandler says, it stays more or less the same as its members age. Her work has shown that when young signers enter a community, they add complexity through experimentation with their peers in what she calls “a social game.” The more players, the more innovations.

You could say the same for science in general or any branch of human culture. Which is why many of us worry.

 

Skin biology 03: skin pigmentation

by reestheskin on 16/04/2016

Comments are disabled

Here is number 3. My favourite bit of skin biology.

Skin biology 02: from sunburn, to DNA and skin cancer.

by reestheskin on 15/04/2016

Comments are disabled

Here is video number 2 in the ‘clinical’ skin biology series.

Skin biology 01: the basics

by reestheskin on 13/04/2016

Comments are disabled

I have been busy producing and updating some videos. Here is the first in a series on skin biology.

No time for Neanderthals

by reestheskin on 17/03/2016

Comments are disabled

Some Neanderthals would — based on MC1R sequence — be expected to have red hair. What has always caused me confusion is the way that dates for everything to do with human paleohistory, and the various representations of our evolution, are revised based on n of 1 publications. No doubt the story will get easier, but I think silence for a while on the ‘greatest story ever told’ would be in order. At least from me.

Note added: And then…..

‘not to generate lots of data, but instead, lots of understanding’

by reestheskin on 16/03/2016

Comments are disabled

Venki Ramakrishnan was on the radio the other day. I cannot remember his exact words but they were something to the effect that he wanted ‘not to generate lots of data, but instead, lots of understanding’. Says it all.

Of mice and men

by reestheskin on 18/02/2016

Comments are disabled

It’s no secret that therapies that look promising in mice rarely work in people. But too often, experimental treatments that succeed in one mouse population do not even work in other mice, suggesting that many rodent studies may be flawed from the start. Nature

As it says, ‘no secret’. Science is usually self-correcting, but the time period may vary. What has always puzzled me is how, in the areas of biology I know something about, mouse work has been so informative; whereas in others, all it seems to be good for is publication in high impact journals. For those interested in pigmentation, mice have been wonderfully informative, whereas for those other bits of skin biology I am familiar with (ahem), like inflammation, mice have been less helpful. A part of me wonders whether some of this is due to whether you are trying to identify potential pathways, or whether you are trying to build interventions based on particular pathways. And finally, lest there be any confusion, I am not one of those who believes we haven’t learned a lot from animals.

Defence expenditure

by reestheskin on 10/02/2016

Comments are disabled

With a research budget increasing to $71bn in 2017, at a time when the overall defence budget is shrinking, new technology is a key part of US efforts to maintain its military edge over China and Russia.

FT [direct link to this aside] 

Publishing: compare and contrast

by reestheskin on 09/02/2016

Comments are disabled

Remember those compare-and-contrast questions (UC versus Crohn’s; DLE versus LP, etc)? Well, look at these two quotes from articles in the same edition of Nature.

The first is from the tsunami of papers showing that ‘Something is rotten in the state of Denmark Science’ — essentially that the Mertonian norms for science have been well and truly trampled over.

Journals charge authors to correct others’ mistakes. For one article that we believed contained an invalidating error, our options were to post a comment in an online commenting system or pay a ‘discounted’ submission fee of US$1,716. With another journal from the same publisher, the fee was £1,470 (US$2,100) to publish a letter. Letters from the journal advised that “we are unable to take editorial considerations into account when assessing waiver requests, only the author’s documented ability to pay”. 

The second is about attempts to piggyback on the arXiv server. An earlier article and blog post from Timothy Gowers says more.

Discrete Analysis’ [the journal] costs are only $10 per submitted paper, says Gowers; money required to make use of Scholastica, software that was developed at the University of Chicago in Illinois for managing peer review and for setting up journal websites. (The journal also relies on the continued existence of arXiv, whose running costs amount to less than $10 per paper.) A grant from the University of Cambridge will cover the cost of the first 500 or so submissions, after which Gowers hopes to find additional funding or ask researchers for a submission fee.

Well done the Universities of Cambridge and Cornell (arXiv). For science, the way forward is clear. But for much clinical medicine, including much of my own field, we need to break down the barriers between publication and posting online information that others may find useful. This cannot happen until the financial costs approximate to zero.

Rarer than the Dodo

by reestheskin on 29/01/2016

Comments are disabled

From Alan Kay. If it comes from a Turing award winner, maybe people might take notice. Perhaps not.

“Big meaning, not big data.” Alan Kay

 

Public risk, private profit.

by reestheskin on 15/12/2015

Comments are disabled

Nature:

Patients are willing to pay, and pay dearly: the HeartSheet treatment costs nearly ¥15 million (US$122,000). Last month, the health ministry added it to the procedures covered by national health insurance, which will help. But patients still pay 10–30% of the cost for a drug that is not known to be effective. As they do so, they basically subsidize the company’s clinical trial.

Japan has turned the drug-discovery model on its head. Usually, the investment — and thus the risk — is borne by drug companies, because they stand to gain in the long run. Now the risk is being outsourced. By the time it is clear whether a treatment works or not, the companies will have already made revenue from it.

‘As is natural of an academic’

by reestheskin on 11/12/2015

Comments are disabled

‘Now Weinberg has added another credential to his crowded vita: historian of science. In his past writings, he had mainly concerned himself with the modern era of physics and astronomy, from the late nineteenth century to the present—a time, he says, when “the goals and standards of physical science have not materially changed.” Yet to appreciate how those goals and standards took shape, he realized he would have to dig deeper into the history of science. So, “as is natural for an academic,” he volunteered to teach a course on the subject—in this case, to undergraduates with no special background in science or mathematics. Then he immersed himself in the primary and secondary literature. The result is To Explain the World, which takes us all the way from the first glimmerings of science in ancient Greece, through the medieval world, both Christian and Islamic, and down to the Newtonian revolution and beyond.’

In a review by Jim Holt of ‘To Explain the World: The Discovery of Modern Science’, by Steven Weinberg.

The problem is that this is no longer natural, or even encouraged, of an academic. And if the writings of Weinberg I have read are anything to go by, this course must have been something special. I can remember the late John Ziman telling me that, having been appointed to a lectureship in physics at Cambridge, he realised that there was no suitable text for his Cambridge undergraduates in the area that interested him. So, he spent two years writing such a text (which sold well for many years, he added). He observed that no UK university would now consider such behaviour appropriate: what about the REF! This tells us something about great thinkers, deep domain expertise, and how explanatory ability is the crux of great teaching. And about universities, and their troubled relation with teaching — and academics.

100 years ago: the most beautiful theory

by reestheskin on 01/12/2015

Comments are disabled

Wonderful retrospective in the Economist on one of the intellectual triumphs of the twentieth century. A few quotes give the flavour of real advance without an impact statement in sight.

General relativity was presented to the Prussian Academy of Sciences over the course of four lectures in November 1915; it was published on December 2nd that year. The theory explained, to begin with, remarkably little, and unlike quantum theory, the only comparable revolution in 20th-century physics, it offered no insights into the issues that physicists of the time cared about most.

Some people might be satisfied just to let each theory be used for what it is good for and to worry no further. But people like that do not become theoretical physicists.

A century ago general relativity answered no-one’s questions except its creator’s

Fire the experts

by reestheskin on 26/11/2015

Comments are disabled

There is a touching video of Marvin Minsky here. Steven Levy’s wonderful book on how some of this revolution took place is compelling reading (as in the Model Railroad Club). You have to wonder how and why so much fundamental and successful (‘an important discovery every few days’) work was done in such a short period of time, with so little money. And across the pond, and elsewhere, the biological revolution that dominated the second half of the 20th century was being laid out with even less resource. Not so much ‘events, dear boy, events’ but ‘ideas, dear kids, ideas’.

Statistics teaching has been a massive educational and ergonomic failure.

by reestheskin on 14/10/2015

Comments are disabled

My intercalated degree (1980) was centred around epidemiology, and analysis of what was then a large dataset. So, I had to learn some FORTRAN and file editing on an IBM 360, and find my way around an MTS system, all in the days of punchcards and line printers, with only one ugly green-on-black monitor to share across a unit. Mostly, I wasted large amounts of printout and got used to retrieving my batch-processed requests early in the morning, with the comment ‘run aborted’, ‘system error’ etc. I then had to wait another 24 hours to find out if I had corrected the errors in my programs. I remember the first time I looked at the GLIM manuals, wondering if I was attempting to read them upside down.

Years later, my interest in statistics resurfaced, mainly as a response to the banality of much of the EBM crowd, with their NNTs and their apparent lack of understanding of what I disparagingly call ‘probability management’. Most EBM merchants are frustrated chartered accountants. Real science is involved with understanding reality by creating models of how the world works: if you don’t do that, you are not doing science, but rather the D of R+D, or ‘technology assessment’. The theory of how you should do the latter is a proper academic pursuit, but using these ‘products’ is not a matter for the academy — although it is important for many businesses or professions. Using what others invented is what doctors do in the clinic, but it does not count as research; it is honest professional toil (and very valuable work at that, compared with, say, much business).
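For readers who have not met the NNT: it is simply the reciprocal of the absolute risk reduction. A minimal sketch, with entirely hypothetical event rates, shows how little arithmetic is involved:

```python
def number_needed_to_treat(control_event_rate, treated_event_rate):
    """NNT = 1 / absolute risk reduction (ARR)."""
    arr = control_event_rate - treated_event_rate
    if arr <= 0:
        raise ValueError("No absolute risk reduction; NNT is undefined.")
    return 1 / arr

# Hypothetical trial: events fall from 20% to 15% with treatment,
# so ARR = 0.05 and roughly 20 patients must be treated per event avoided.
print(round(number_needed_to_treat(0.20, 0.15)))
```

The point is not that the number is hard to compute, but that on its own it models nothing about how the world works.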

But statistics is hard, and frequently counterintuitive. We do not teach it well to medical students, and for all the mantra about doctors’ communication skills, fluency with statistics is a core medical skill and, in many situations, the key communication skill doctors should possess. If you want your students to communicate well, do not stray far without mentioning the binomial, Poisson, Bayes etc.
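One standard, counterintuitive example of the kind of fluency meant here is Bayes’ theorem applied to a diagnostic test. A brief sketch (the sensitivity, specificity and prevalence figures below are hypothetical, chosen only for illustration):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence            # diseased and test-positive
    false_pos = (1 - specificity) * (1 - prevalence)  # healthy but test-positive
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 95% specificity, used where only 1% of
# patients have the disease, still yields mostly false positives:
ppv = positive_predictive_value(0.90, 0.95, 0.01)
print(f"{ppv:.2f}")  # roughly 0.15
```

A positive result here means only about a one-in-seven chance of disease — exactly the sort of thing a doctor must be able to explain to a patient.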

What follows is a comment from Sander Greenland, on Deborah Mayo’s excellent site Error Statistics. I do not know Greenland (although we have emailed each other in the distant past), but I think he is somebody who is always worth listening to. There are a couple of points he makes that chime with me, and they relate both to teaching and to the ‘crisis’ in much medical (and scientific) research. He writes:

“My view is that stats in soft sciences (medicine, health, social sciences among others) has been a massive educational and ergonomic failure, often self-blinded to the limits on time and capabilities of most teachers, students, and users. I suspect the reason may be that modern stats was developed and promulgated by a British elite whose students and colleagues were selected by their system to be the very best and brightest, a tiny fraction of a percent of the population. Furthermore, it was developed for fields where sequences of experiments leading to unambiguous answers could be carried out relatively quickly (over several years at most, not decades) so that the most serious errors could be detected and controlled, not left as part of the uncertainty surrounding the topic.”

First, we have to accept that we have failed. Second, all too often we are ‘self-blinded to the limits on time and capabilities of most teachers, students, and users’. This is a widespread problem in the undergraduate medical curriculum more generally. Students must be researchers, teachers, scholars etc. All too often this is one giant GMC-inspired delusion, fuelled by either the NHS (yes, I know there is no longer one NHS) or speciality groups (my organ is bigger than your organ….). Third, much of higher education has not caught up with its audience (pace, elite higher education).

Finally, the sort of experimental science he is talking about (final para) is exactly the sort I love. This is the sort of work Brenner and Crick described when they, and a handful of others, invented molecular biology. Or, for an example from another field, look at how David Hubel and Torsten Wiesel described their work. But sadly, most medical research is no longer like this. It is much, much duller, and much less intellectually secure, because many built-in tests of veracity, through experimental design and approach, have been replaced by audit of process. Does it have to be this way? A silent, bleak voice tells me yes. As for the teaching, that is a problem we can do something about.

[I like his use of ergonomic, too]