This is from an article in Nature; the problem is resolving differences in experimental results between labs.
But subtle disparities were endless. In one particularly painful teleconference, we spent an hour debating the proper procedure for picking up worms and placing them on new agar plates. Some batches of worms lived a full day longer with gentler technicians. Because a worm’s lifespan is only about 20 days, this is a big deal. Hundreds of e-mails and many teleconferences later, we converged on a technique but still had a stupendous three-day difference in lifespan between labs. The problem, it turned out, was notation — one lab determined age on the basis of when an egg hatched, others on when it was laid.
Now my title is from Blake:
He who would do good to another must do it in Minute Particulars: general Good is the plea of the scoundrel, hypocrite, and flatterer, for Art and Science cannot exist but in minutely organized Particulars.
And yet, I think I am using the quote in a way he would have strongly disagreed with. Some of the time ‘Minute particulars’ are not the place to be if you want to change the world. Especially in biology.
Little evidence is found for higher-order organization into 30- or 120-nm fibers, as would be expected from the classic textbook models based on in vitro visualization of non-native chromatin.
Well, chromatin structure might not be everybody’s cup of tea but I once shared an ‘office’ with a couple of French / Polish researchers in Strasbourg. It was all above my head, so I had to make do with the textbooks, and I stuck to my simple cloning of upstream regulatory regions of a retinoid receptor. Now, it appears from this article in Science, the textbooks will need rewriting. Science works.
Hours later, with Blackburn’s approval, the institute issued comments on the scientific records of the two women. It had “invested millions of dollars” in each scientist, Salk stated, but a “rigorous analysis” showed each “consistently ranking below her peers in producing high quality research and attracting” grants. Neither has published in Cell, Nature, or Science in the last 10 years, it said. Lundblad’s salary “is well above the median for Salk full professors ($250,000) … yet her performance has long remained within the bottom quartile of her peers.” The institute wrote that Jones’s salary, in the low $200,000 range, “aligns” with salaries at top universities, although she “has long remained within the bottom quartile of her peers.”
This is from an article in Science (Gender discrimination lawsuit at Salk ignites controversy). The context is a sex discrimination case, but the account is about an astonishing lack of vision. Short-termism of stock markets is not the only way value is being destroyed by a cash-in-hand mentality. The best rugby coaches have rarely been the greatest players. Nobel laureates may not be the best leaders. No (bull) shit here…
Terrific interview with Alan Kay. Familiar memes, but I do not tire of them.
The business of a university is to help students learn contexts that they were unaware of when they were in high school.
His use of the word context encompasses intellectual creations such as reading, writing, printing and so on. His oft-quoted quip: a change in context is worth 80 IQ points.
Or so says an article in Nature. No we don’t, is my response.
Philanthropists are flying blind because little is known about how to donate money well. Facebook co-founder Mark Zuckerberg’s US$100-million gift to schools in Newark, New Jersey, reportedly achieved nothing. Some grants to academic scientists create so much administration that researchers are better off without them. And some funders’ decisions seem to be no better than if awardees were chosen at random, with the funded work achieving no more than the rejected.
There is no science to philanthropy. You can study it, you can come up with ideas about it, and try to meld systems of rationality about it. But this is just an abuse of the word science, an abuse meant to demarcate this area of activity from things that are non-science and are, by implication, less robust or rigorous. This is one of the ways the science (and STEM) lobby misunderstands the world. But the quoted paragraph does, of course, say something meaningful.
Institutions with histories matter. It is just that in many instances innovation often comes from the periphery. I think this is often true in many fields: science, music, even medical education. It is not always this way, but often enough to make me suspicious of the ‘centre’. The centre of course gets to write the history books.
An article by Mark Mazower in the NYRB, praising Richard Evans, the historian of the Third Reich, caught my attention. It seems that nobody in the centre was too excited about understanding the event that changed much of the world forever. Mazower writes:
If you wanted to do research on Saint Anselm or Cromwell, there were numerous supervisors to choose from at leading universities; if you wanted to write about Erich Ludendorff or Hitler, there was almost no one. The study of modern Europe was a backwater, dominated by historians with good wartime records and helpful Whitehall connections—old Bletchley Park hands and former intelligence officials, some of whom had broken off university careers to take part in the war and then returned.
Forward-looking, encouraging of the social sciences, open to international scholarship from the moment of its establishment, St. Antony’s is the college famously written off by the snobbish Roddy Martindale in John le Carré’s Tinker, Tailor, Soldier, Spy as “redbrick.” The truth is that it was indeed the redbrick universities, the creations of the 1950s and 1960s, that gave Evans and others their chance and shaped historical consciousness as a result. The Evans generation, if we can call them that, men (and only a very few women) born between 1943 and 1950, came mostly from the English provinces and usually got their first jobs in the provinces, too.
It is interesting how academics who had had career breaks were important. And how you often will need new institutions to change accepted practice. All those boffins whose careers were interrupted by the war led to the flowering of invention we saw after the second world war. You have to continually recreate new types of ivory towers. But I see little of this today. Instead, we live in an age of optimisation, rather than of optimism that things can be different. The future is being captured by the present ever more than it once was. At least in much of the academy.
Spectral authors also haunt the scientific canon. One physicist, frustrated at having his paper repeatedly rejected, finally saw it published after changing the title and adding a fictitious co-author, Stronzo Bestiale. It means “total asshole” in Italian.
Seriously, if you suggested the world we have now of predatory journals and the tyranny of metrics, would any sane scientist in 1960 think it possible? Uncle Syd once remarked that people no longer read papers they just xeroxed them. Now we do not even do that: metadata is all.
I thought I would have read this before, but maybe I put it to one side and foolishly forgot. It is a fitting description of Jacob Bronowski by his wife, Rita. One thing — amongst many — caught my eye.
As a very young man he would travel miles every week to outlying villages in England to give what were called Workers’ Educational Association lectures. Quite literally he would travel through snow and fog to village halls to speak to 8 or 10 people who had equally braved the elements. I sometimes would think it a pity there were not hundreds there to hear him. Little did I imagine that with radio and then television he would in fact finally reach millions.
And I would respond: you have to want to learn, and you have to want to educate.
But I can’t stop here. One bit of the jigsaw I didn’t know:
After receiving his Ph.D. and conducting 3 years of research, it became clear that, being a Jew, Bruno would not be made a Fellow of his college (Jesus College, Cambridge). He decided to ‘drop out’. Like so many young students (hippies, 30 years later), bearded and down-at-heel, he went to Paris to write. There he met, among others, Samuel Beckett, and they jointly edited an anthology called European Caravan.
It ends with his own words
What makes the biological machinery of man so powerful is that it modifies his actions through his imagination. It makes him able to symbolize, to project himself into the consequences of his acts, to conceptualize his plans, to weigh them, one against another, as a system of values… We, as men, are unique. We are the social solitaries … We are the creatures who have to create values in order to elucidate our own conduct, so that we learn from it and can direct it into the future (emphasis, mine)
In LEONARDO, Vol. 18, No. 4, pp. 223–225, 1985.
Q: What’s at stake when scientists fib?
A: Science is the last institution where being honest is a quintessential part of what you’re doing. You can do banking and cheat, and you’ll make more money, and that money will still buy you the fast cars and the yachts. If you cheat in science, you’re not making more facts, you’re producing nonfacts, and that is not science. Science still has this chance of giving a lead to democratic societies because scientific values overlap strongly with democratic values.
Interview with Harry Collins about his book Gravity’s Kiss: The Detection of Gravitational Waves (MIT Press, 2017; 414 pp.).
Bruce Alberts talks a lot of sense about science education and education in general. And of course he produced a book that ‘educated’ a whole generation (or more) of people like me. But in this recent Science piece he is taking on some of the big questions, questions that have been asked before, but for which few have managed to follow through on. As ever, the emphases are mine.
In previous commentaries on this page, I have argued that “less is more” in science education, and that learning how to think like a scientist—with an insistence on using evidence and logic for decision-making—should become the central goal of all science educators. I have also pointed out that, because introductory science courses taught at universities define what is meant by “science education,” college science faculty are the rate-limiting factor for dramatically improving science education at lower levels.
For example, there is a long-standing belief that every introductory college biology course must “cover” a staggering amount of knowledge. There is no time to focus on a much more important goal—insisting that every student understand exactly how scientific knowledge is generated. Science is not a belief system; it is, instead, a very special way of learning about the true nature of the observable world.
His phrase, “college science faculty are the rate-limiting factor for dramatically improving science education at lower levels”, could equally apply to medicine and medical teachers. It is not hyperbole to say these are some of the central problems of our time. And it is not just science education that is the issue.
Universities are idea factories. Current corporatization approaches emphasize the factory rather than the ideas.
Ralf Buckley in Nature. I would say — for the short term at least — unless somebody finds a way to create new ‘dissenting academies’, things in UK higher ed will get worse.
As if by a miracle, once up and running, the Mark 1 telescope was the only instrument that could both detect the first Soviet and American satellites and transmit instructions to them. Amazing as it now seems, the need for such a telescope had escaped both the telecommunications industry and the military leaders of both superpowers.
Despite its spectacular success, which included tracking the Sputnik 1 satellite mission in 1957, the government did nothing to alter the remaining debt, being bound by the iron restraint of Treasury rules. It was Lord Nuffield who did so, thereby demonstrating the superiority of aristocratic, rather than state support, to science – and indeed to all intellectual activity, a view which Lovell expressed frequently and forcefully to the end of his life.
Sir Fred Hoyle’s obituary of Sir Bernard Lovell. I fear Hoyle is right — at least if we realise we need more Fred Hoyles. Now, they are not aristocrats, but US philanthropists.
Interesting editorial in Nature. And unexpected. The issue is support for science and the state of politics in the US.
Just telling the same old stories won’t cut it. The most seductive of these stories — and certainly the one that scientists like to tell themselves and each other — is the simple narrative that investment in research feeds innovation and promotes economic growth. ‘It’s the economy, stupid’, so the saying goes, and as nations become a little less stupid by pushing against the frontiers of knowledge, so the benefits of all this new insight spread from the laboratory to the wider population, as improvements in the standard of living and quality of life. This comfortable story has all the hallmarks of a bubble waiting to pop.
The article goes on:
It is right that more scientists should tell stories of the good their research can do. But it is more important and urgent than ever that researchers should question how these stories really end — and whether too many of the people they claim to act for don’t really get to live happily ever after.
Much science is in a vacuous bubble, and arguments for more funding from its practitioners are increasingly viewed as self-serving. Universities share some or much of this blame, all too happy to ‘shift more units’. This lack of intellectual honesty will harm academia in the long term. The one uniting feature that justifies higher education is the pursuit of truth in whichever direction enquiry moves. Universities are not businesses, profit centres, or corporations. They have a different set of norms that are distinct from those advertised by much of the rest of the corporate world (or government). STEM has never been enough, and truthfulness is not something you can opt in or out of, like you can some undergraduate modules. The role for universities — and science — is greater than ever: the issue is whether the universities have the necessary leadership. Even with the right leaders, it is a tough ask.
In 1924, a 30-year old journalist on the Daily Express came to Cambridge aiming to interview Haldane. She was Charlotte Burghes, née Franken, and she had a young son, Ronnie. Haldane and Charlotte became lovers, but before they could marry she had to seek a divorce, a procedure that carried substantial social stigma at that time. A university committee resolved to strip Haldane of his readership, which was only restored by successful legal action.
There are many worthwhile insights on show in the THE interview with the Nobel physicist Saul Perlmutter, ‘You can’t order up a breakthrough’. But this caught my eye:
“I think for students it’s never too early…to realise that they should be helping…to figure out the world together, not just learning the received facts and the things we already know,” he says. “We find ourselves looking at a world where I don’t think almost any of the problems I see today would worry me, if we knew how to work together and how to think through problems together in a rational way that wove together fears and needs with a rational understanding of the world…Maybe one of the best ways into that is to start teaching.” (emphasis mine)
I think this is the kernel of the problem we face, and the trite ‘we need more STEM’ or ‘teach all students to code’ is missing the key issue. We are, as has been said before, a ‘civilisation that is face to face with its own implications’.
In June 2016, data released by the UK’s Office for National Statistics revealed that there had been 52,400 more deaths in the year following June 2015 compared with the same period a year before; an annual rise of 9%. These rises in mortality rates are unprecedented in post-war times. In England and Wales, the increase was almost entirely in the population aged over 55 years, predominantly in those aged over 75, and was largely attributed to dementia and Alzheimer’s disease, with influenza being suggested as a minor contributory factor. It was mostly those with long-term care needs who were dying earlier.
In addition, in late 2016, official data was released for Scotland showing no rise in life expectancy for men and women for the first time in 160 years.
These are two quotes from an article in yet another new journal, Nature Human Behaviour. The title of the piece, ‘Policymakers should not act like scientists’, is worth thinking over. There is always a bias built into ‘tests of departure from..’ and those who control the funding can control what counts as legitimate evidence. And the null hypothesis of the ‘classical statistical paradigm’ is not appropriate for many types of problems. From somewhere, I remember a comment from Paul Janssen along the lines of, ‘in those days the idea of obviousness still existed’. As we saw last week, in the debate about the NHS, in the mouths of politicians a handful can mean upwards of 10,000. As is often misquoted: the plural of anecdote is data.
From an article in Nature describing how the US biomedical workforce has changed over recent times.
Our analysis of IPUMS-USA data reveals a cohort that entered the laboratory workforce as NIH funding grew from US$13.7 billion in 1998 to $28.1 billion in 2004. These ‘doubling boomers’ arguably suffered most as funds subsequently decreased (when adjusted for inflation). In 2004, there were nearly 26,000 individuals under 40 with PhDs working as biomedical scientists. By 2011, there were nearly 36,000. Over this period, the number of faculty jobs did not increase. Indeed, the number of openings expected as a result of academics retiring has declined since 1995, when federal law made it illegal for universities to mandate retirement at age 65 (ref. 3).
The work environment that this cohort faces is unlike anything seen before, despite previous booms and busts4. Today in the United States, four out of five PhD biomedical researchers work outside academia — a record high (see ‘Lab labour’). They earn, on average, almost $30,000 more a year than their academic counterparts, and feel less pressure to produce scientific publications.
There are some obvious points. The doubling was crazy at the time (some of us said that, then), and even more so in hindsight. Universities rushed for the gold, with little wider thought. Second, careers are the ‘long now’ and getting longer. Personal investment relies on a certain degree of continuity and stability, and there will be a hangover that the universities will now have to deal with. Finally, the obsession with growth by universities is dangerous. Haldane’s essay, ‘On Being the Right Size’, comes to mind. Scaling matters, as does thinking about long-term rather than short-term success.
Throughout her career, Gonzalez has done “a bit of everything” at LIGO, she says. For a while, she took on the crucial task of diagnosing the performance of the interferometers to make sure that they achieved unparalleled sensitivity — which is now enough to detect length changes in the 4-kilometre-long arms of the interferometers to within one part in 10²¹, roughly equivalent to the width of DNA compared with the orbit of Saturn.
It ain’t biology, then. Nature
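The quoted analogy is easy to sanity-check with a few lines of arithmetic. A minimal sketch — the rounded figures for DNA width and Saturn's orbital radius are my own assumptions, not from the Nature piece:

```python
# Sanity check of the quoted analogy: a strain of one part in 10^21
# over LIGO's 4 km arms, compared with DNA width vs Saturn's orbit.
strain = 1e-21
arm_length_m = 4_000.0       # 4-kilometre interferometer arm
dna_width_m = 2e-9           # ~2 nm diameter of the DNA double helix (rough)
saturn_orbit_m = 1.43e12     # ~9.5 AU mean orbital radius of Saturn (rough)

arm_displacement = strain * arm_length_m      # ~4e-18 m
analogy_ratio = dna_width_m / saturn_orbit_m  # ~1.4e-21

print(f"arm displacement: {arm_displacement:.1e} m")
print(f"DNA width / Saturn orbit: {analogy_ratio:.1e}")
```

Both ratios sit at about 10⁻²¹, so the analogy holds to within a factor of a few — which is presumably all it was meant to do.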
Interesting story in Nature highlighting instances where instead of doing post-docs, young biologists have raised funding to set up their own companies. Of course, most start-ups fail, but then most really interesting research projects should fail. Y Combinator is getting into this area, which surprised me. As the age of getting your first grant gets higher, and with the increasingly dysfunctional nature of much academic (medical) science, the attractions are obvious. I was sceptical (and still am) that the ‘software’ model would work in this area.
Cartoon characters not infrequently run off the edge of a cliff. Pause. They then realise there is nothing there to support their running. Time lags are awkward to deal with in any analysis, but since most things do not happen overnight, they are ubiquitous. In analysis, we replace them with a fudge factor. Or we ignore them.
I haven’t seen much comment on what I think is the most interesting aspect of the science news over the last few weeks. Here is a line from THE.
Five out of nine laureates in the core prizes for physics, chemistry, medicine and economic sciences were born in the UK. All crossed the pond as rather valuable immigrants to the US.
UK science, and many UK universities, have been in profit-harvesting mode for a long time now. Over the edge of that cliff. Things are going to fall apart. OK, what the hell:
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Knowledge is not a looseleaf notebook of facts. Above all, it is a responsibility for the integrity of what we are, primarily of what we are as ethical creatures. You cannot possibly maintain that informed integrity if you let other people run the world for you while you yourself continue to live out of a ragbag of morals that come from past beliefs. That is really crucial today. You can see it is pointless to advise people to learn differential equations, or to do a course in electronics or in computer programming. And yet, 50 years from now, if an understanding of man’s origins, his evolution, his history, his progress is not the commonplace of the schoolbooks, we shall not exist. The commonplace of the schoolbooks of tomorrow is the adventure of today, and that is what we are engaged in…
It sounds very pessimistic to talk about western civilisation with a sense of retreat. I’ve been so optimistic about the ascent of man; am I going to give up at this moment? Of course not. The ascent of man will go on. But do not assume that it will go on carried by Western civilisation as we know it. We are being weighed in the balance at this moment.
The Ascent of Man, Jacob Bronowski, 1973.
Abraham Verghese, an infectious disease physician, gave a talk here in Edinburgh last week. It was a very mixed audience, but I suspect the many students who were there enjoyed it. I have not read any of his books — nor looked at his TED talk — but his Wikipedia entry gives you a flavour of how interesting he is, and of how varied a career can be — when you have courage.
One issue that came up tangentially was the history of diagnosis, and there were some opinions ventured by the audience in terms of when diagnosis was historically established. I may have missed key points, but I found it hard to accept that the idea of diagnosis was something you could date except in very broad terms, even less that you could associate it with the 1870s or with the idea of stethoscopes being a key marker of when modern ideas of diagnosis were established. For instance — and since the lecturer was an ID physician — my first thoughts turned to scabies. The scabies mite was identified in the 1690s, and it was recognised as the cause of the disease (I am not quoting primary sources so let me know if…). So here we have a clear linking of symptoms, signs, causality, a causal agent, and a broader theory about pathogenesis and epidemiology. So it got me thinking about how I view the topic of diagnosis.
Diagnosis is the mapping of one state onto another, with the two states being linked by a network of attributes. Diagnosis is a suitcase term: it may contain lots of different tools, tools suited to various purposes, and tools for which we may find different purposes over time. Diagnosis represents an attempt to classify the world into particular states, often with the goal of making some predictions about some other state. Most of the time, we think in terms of prediction, about what might happen to that person with or without some intervention. If you see these physical signs (burrows) and the patient describes particular symptoms (itch), then the ‘state’ is scabies. If the diagnosis is correct, you can say something about what causes the state, what might happen, and what effect a particular intervention (permethrin / malathion etc) might have. If you are lucky, you can feel happy with causal arrows linking much of what you say and think. Prediction is important but it is of course not the only quality we want in a theory. We tend to prefer some theories to others, even when they make similar predictions. Think of Copernicus: we tend to prefer one theory over another, irrespective of whether both allow the same quantitative clinical predictions.
Our suitcase of diagnostic concepts has changed over time, however. For instance, even in modern medicine, causality is often lacking. We may use proxy or associated factors to define particular states. We may use simple heuristics as our guide to action, even though we have little idea of where the causal arrows are going. Think much of psychiatry. This does not mean we are powerless, just that we are more ignorant than we would like. We are of course wedded to particular metaphysical systems.
Diagnosis might have been used in the absence of knowledge about particular interventions to attribute blame, as an explanation. If a patient behaved in this way or suffered some state, it was a divine punishment for some behaviour. Now, I may not agree with this world view, but this too is diagnosis. The theory may seem wrong, it may seem primitive, but then my ideas of physics are primitive too if they are applied to the world of the very small.
Galen thought in terms of the mean, and of treatment by opposites (hot treatments for cold diseases; moist treatments for dry ones, and so on). This all sounds slightly crazy to modern ears (although dermatologists among you will point out the latter has definite therapeutic merit within very particular skin states). Or how about the idea of therapeutic ‘signatures’? This is from Ian Hacking:
Syphilis is signed by the market place where it is caught; the planet Mercury has signed the market place; the metal mercury, which bears the same name, is therefore the cure for syphilis.
As Hacking points out, this allowed Paracelsus to kill lots of people simply because he knew that mercury worked. But whatever the metaphysical system linking two states, the idea of diagnosis was firmly established — just as Newton got most things right in his physics, and most of us ignore what came after, except when we use the GPS.
Diagnosis was not limited to medicine. Our ancestors spent their lives making diagnoses about what to eat and what not to eat. Making diagnoses about what particular weather states would do to crops etc. Plumbers make diagnoses, as do any humans trying to make sense of an environment that is not static, and where we value intervention.
What may have been specific to medicine was our hang-ups about whether there was something special about humans, and whether the simple rules, experimentation and demonstrations of efficacy that allowed other types of human technological progress, or indeed much of everyday life, applied in the domain of disease. Successful interventions or demonstrations will have had an effect on metaphysical beliefs in the long term. And of course much of this story is tied up with the growth of that particular branch of formal knowledge we call science. 1870 is just a little late.
Hacking I. The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction and Statistical Inference. Cambridge: Cambridge University Press; 1984.
The goal of the new CFF [Cystic Fibrosis Foundation, a US patient charity] Therapeutics Lab, says Preston W. Campbell III, the foundation’s CEO and president, is to generate and share tools, assays, and lead compounds, boosting its partners’ chances of finding treatments. Frustration with academic technology transfer agreements was a key motivation, he notes. University-based researchers funded by the foundation have to seek approval from their institution’s legal department before sharing assays, cells, or any intellectual property, a hurdle that can take a year to negotiate. “This was killing us,” Campbell says, “but if we created our own laboratory, we could not only focus on the things we wanted to focus on, we could also share them freely.” Science
Well you really could not make this up. From the EFF:
On August 30, 2016, the Patent Office issued U.S. Patent No. 9,430,468, titled “Online peer review and method.” The owner of this patent is none other than Elsevier, the giant academic publisher. When it first applied for the patent, Elsevier sought very broad claims that could have covered a wide range of online peer review. Fortunately, by the time the patent actually issued, its claims had been narrowed significantly. So, as a practical matter, the patent will be difficult to enforce. But we still think the patent is stupid, invalid, and an indictment of the system…
Before discussing the patent, it is worth considering why Elsevier might want a government granted monopoly on methods of peer review. Elsevier owns more than 2000 academic journals. It charges huge fees and sometimes imposes bundling requirements whereby universities that want certain high profile journals must buy a package including other publications. Universities, libraries, and researchers are increasingly questioning whether this model makes sense.
Avoid Elsevier. This is a world that should no longer exist.
Biology is short of theory compared with physics, and medicine more so. More dull trials, and less and less insight. Busyness and project management, directed by chief executives, wielding Excel spreadsheets. Alfred G Knudson has just died and Nature’s obituary tells the story of somebody who could play at natural history and then form a majestic and testable hypothesis. The penultimate sentence reads: “[his] lack of patience for science that merely repeated the work of others kept everyone in his sphere striving for the new”
Two articles both from different areas. The first is from an interview with Paul Greengrass (he of ‘Bloody Sunday’, and the Bourne films).
“Youngsters starting out probably aren’t going to be supported and developed like I was in my early career, they’re much more likely to be chewed up,” he said. “This places a greater weight on universities like Kingston, which is a breeding ground for talent, to educate kids about the importance of point of view – it’s the easiest thing to lose but the most important thing to hold on to.”
The second in Science, about a likely Nobel prize winner, Rainer Weiss.
Then, in his junior year, Weiss flunked out of school entirely. He fell for a woman he met on a ferry from Nantucket to Boston. “She taught me about folk dancing and playing the piano,” he says. Weiss followed her when she moved to Evanston, Illinois, abandoning his classes in midterm. But the affair fizzled. “I fell in love and went crazy,” he says, “and of course she couldn’t stand to be around a crazy man.” Weiss returned to MIT hoping to take his finals only to find he’d flunked out.
Weiss says he was unfazed. “People say, ‘I failed out of college! My life is over!’ Well, it’s not over. It depends on what you do with it.” He took a job as a technician in MIT’s legendary Building 20, a temporary structure erected during the war, working for Jerrold Zacharias, who studied beams of atoms and molecules with light and microwaves and developed the first commercial atomic clock. Under Zacharias’s tutelage, Weiss finished his bachelor’s degree in 1955 and earned his Ph.D. in 1962.
A later quote from the same article:
After a postdoc at Princeton University developing experimental tests of gravity under physicist Robert Dicke, Weiss returned to MIT in 1964. As a junior faculty member, he says, he published little and didn’t worry about advancing his career. MIT’s Shoemaker says Weiss probably got tenure only for his teaching—and wouldn’t get it today. Bernard Burke, an emeritus physicist at MIT, agrees that early on Weiss was a “happy gadgeteer” who “wasn’t likely to get tenure unless he did something that did something.”
The echo of how he has lived some of his life is provided by one of his protégés, David Shoemaker:
Shoemaker adds that Weiss’s foremost quality is empathy. A college dropout, Shoemaker credits Weiss with getting him into graduate school at MIT without an undergraduate degree. “He sought ways to bring out the best in me,” Shoemaker says. “He also took a rather irregular path, and I think because of that and just his nature, he is really interested in helping people.”
Now, none of this is too surprising. Science and any serious intellectual or cultural endeavour is a way of constructively catching dissent. And dissent clusters: it is not uniform across society, but found on the fringes or boundaries of good sense. But we are no longer focussed on diversity or providing a garden for play. Instead, we are obsessed with homogeneity and forcing all to the mean.
Blake got it right:
The Enquiry in England is not whether a Man has Talents & Genius, But whether he is Passive & Polite & a Virtuous Ass & obedient to Noblemen’s Opinions in Art & Science. If he is, he is a Good Man. If not he must be Starved.
A comment in Science
A well-stated hypothesis describes a state of nature. It is either true or not true, not subject to probability. The phrase “probability the hypothesis is true” is meaningless. One can only say, “likelihood that the observed data came from a population characterized by the hypothesis.”
I only post because I seem to spend my life trying to argue that the dismal null hypothesis is a tool for doing one type of statistics, and has a limited role in science. It has little to do with what we mean by a scientific hypothesis. There is not an infinite number of scientific hypotheses: there is not a probability distribution over them in the way we use this term in statistics.
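The distinction in the Science comment can be made concrete with a toy calculation. A minimal sketch of my own (the coin example is an illustration, not from the article): the hypothesis "this coin is fair" is simply true or false; what we can compute is the likelihood of the observed data under it.

```python
import math

def binomial_likelihood(k: int, n: int, p: float) -> float:
    """Probability of observing k heads in n tosses, given the hypothesis
    that the true heads probability is p. The hypothesis is fixed; the
    probability attaches to the data."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Observed data: 60 heads in 100 tosses.
fair = binomial_likelihood(60, 100, 0.5)    # likelihood under H: p = 0.5
biased = binomial_likelihood(60, 100, 0.6)  # likelihood under H: p = 0.6

print(f"L(data | p=0.5) = {fair:.4f}")
print(f"L(data | p=0.6) = {biased:.4f}")
```

Each number is a statement about the data assuming a hypothesis; neither is "the probability that the hypothesis is true", which in this framework is not a defined quantity at all.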
“Classics are written by people, often in their twenties, who take a good look at their field, are deeply dissatisfied with an important aspect of the state of affairs, put in a lot of time and intellectual effort into fixing it, and write their new ideas with self-conscious clarity. I want all Berkeley graduate students to read them.”
The (medical) future is here, just unevenly distributed
The lessons from Glybera, the first gene therapy to be sold in Europe, still loom large. It cures a genetic condition that causes a dangerously high amount of fat to build up in the blood system. Priced at $1m, the product has only been bought once since 2012 and stands out as a commercial disaster. Economist