The use of Benzedrine by American athletes in the 1936 Berlin Olympics prompted the Temmler company on the edge of Berlin to focus on creating a more powerful version. By the autumn of 1937, its chief chemist, Dr. Fritz Hauschild (in postwar years the drug provider for East German athletes), created a synthesized version of methamphetamine. This was patented as Pervitin. It produced intense sensations of energy and self-confidence.
In pill form Pervitin was marketed as a general stimulant, equally useful for factory workers and housewives. It promised to overcome narcolepsy, depression, low energy levels, frigidity in women, and weak circulation. The assurance that it would increase performance attracted the Nazi Party’s approval, and amphetamine use was quietly omitted from any anti-drug propaganda. By 1938, large parts of the population were using Pervitin on an almost regular basis, including students preparing for exams, nurses on night duty, businessmen under pressure, and mothers dealing with the pressures of Kinder, Küche, Kirche (children, kitchen, church—to which the Nazis thought women should be relegated). Ohler quotes from letters written by the future Nobel laureate Heinrich Böll, then serving in the German army, begging his parents to send him more Pervitin. Its consumption came to be seen as entirely normal.
Lots I didn’t know, but any reader of David Healy will not be surprised. A dermatologist doesn’t come out of it too well, either.
Once upon a time the government gave money to universities, and the universities educated people (or they tried). Now things are different. The government buys educational services, and the universities are the contractors.
Not so much Software as a Service, but Education as a Service.
According to the job description for the chair of modern Greek studies posted last month, whoever fills the professorship part-funded by the Greek Laskaridis shipping family will not be paid an “official salary” from the university. Instead, they will receive an unspecified share of €20,000 (£16,730) from the Dutch Society of Modern Greek Studies to carry out numerous academic duties for, on average, one day a week.
The professorship, named after the late shipping heiress Marilena Laskaridis, lasts for five years, during which time the post-holder will be asked to teach, to supervise PhD students and to win research grants. Despite being based in Amsterdam’s Faculty of Humanities, the professor would not be an employee of the university and would not receive any of the usual benefits enjoyed by other staff.
Technology magnifies power in general, but the rates of adoption are different. Criminals, dissidents, the unorganized—all outliers—are more agile. They can make use of new technologies faster, and can magnify their collective power because of it. But when the already-powerful big institutions finally figured out how to use the Internet, they had more raw power to magnify.
This is true for both governments and corporations. We now know that governments all over the world are militarizing the Internet, using it for surveillance, censorship, and propaganda. Large corporations are using it to control what we can do and see, and the rise of winner-take-all distribution systems only exacerbates this.
This is the fundamental tension at the heart of the Internet, and information-based technology in general. The unempowered are more efficient at leveraging new technology, while the powerful have more raw power to leverage. These two trends lead to a battle between the quick and the strong: the quick who can make use of new power faster, and the strong who can make use of that same power more effectively.
Bruce Schneier, on Crooked Timber, talking around Cory Doctorow’s new novel ‘Walkaway’. BS—well worth reading in full, as usual.
Microsoft held back from distributing a free repair for old versions of its software that could have slowed last week’s devastating ransomware attack, instead charging some customers $1,000 a year per device for protection against such threats.
In truth, everyone knows that values are actually marketing exercises, used by organisations as slogans. They have little to do with actual behaviour in organisations. They infantilise people, reduce them to ciphers.
The first law of higher education is that the future of universities is political not technical.
The second unchanging law of higher education is that there is no situation so bad that it cannot get worse.
The third law of higher education, as exemplified by the classic University Challenge episode of The Young Ones, is that the posh kids always win.
The fourth and final law of higher education, which trumps the first three, is that universities outlive ministers.
And if that is not enough food for thought, consider this:
Higher education in England is no longer a supply-led industry. English universities are now in a demand-led environment in which the regulator has the last word. The Rubicon has been crossed, and few in higher education have really begun to understand what the implications of that are for universities. They will have the next five years of Conservative government to contemplate it.
In 2020 — 10 years from now — Moore’s Law predicts that computers will be 100 times more powerful. That’ll change things in ways we can’t know, but we do know that human nature never changes. Cory Doctorow rightly pointed out that all complex ecosystems have parasites. Society’s traditional parasites are criminals, but a broader definition makes more sense here. As we users lose control of those systems and IT providers gain control for their own purposes, the definition of “parasite” will shift. Whether they’re criminals trying to drain your bank account, movie watchers trying to bypass whatever copy protection studios are using to protect their profits, or Facebook users trying to use the service without giving up their privacy or being forced to watch ads, parasites will continue to try to take advantage of IT systems. They’ll exist, just as they always have existed, and — like today — security is going to have a hard time keeping up with them.
Welcome to the future. Companies will use technical security measures, backed up by legal security measures, to protect their business models. And unless you’re a model user, the parasite will be you.
Which just reminds me of my own ecological ignorance. Many years back I was moaning to William Bains about how “surely the system (insert your own bête noire) will collapse under the weight of all these people who do nothing except get in the way and stop real work being done”. He corrected me by reminding me that in many biological systems the biomass of parasites exceeds that of the non-parasites. It is now my strategy when meeting somebody or hearing some new idea to ask the simple polite question: are you a parasite? There are an awful lot of them. I expect to see more and more.
Spectral authors also haunt the scientific canon. One physicist, frustrated at having his paper repeatedly rejected, finally saw it published after changing the title and adding a fictitious co-author, Stronzo Bestiale. It means “total asshole” in Italian.
Seriously, if you had suggested the world we have now of predatory journals and the tyranny of metrics, would any sane scientist in 1960 have thought it possible? Uncle Syd once remarked that people no longer read papers, they just xeroxed them. Now we do not even do that: metadata is all.
I thought I would have read this before, but maybe I put it to one side and foolishly forgot. It is a fitting description of Jacob Bronowski by his wife, Rita. One thing — amongst many — caught my eye.
As a very young man he would travel miles every week to outlying villages in England to give what were called Workers’ Educational Association lectures. Quite literally he would travel through snow and fog to village halls to speak to 8 or 10 people who had equally braved the elements. I sometimes would think it a pity there were not hundreds there to hear him. Little did I imagine that with radio and then television he would in fact finally reach millions.
And I would respond: you have to want to learn, and you have to want to educate.
But I can’t stop here. One bit of the jigsaw I didn’t know:
After receiving his Ph.D. and conducting 3 years of research, it became clear that, being a Jew, Bruno would not be made a Fellow of his college (Jesus College, Cambridge). He decided to ‘drop out’. Like so many young students (hippies, 30 years later), bearded and down-at-heel, he went to Paris to write. There he met, among others, Samuel Beckett, and they jointly edited an anthology called European Caravan.
It ends with his own words
What makes the biological machinery of man so powerful is that it modifies his actions through his imagination. It makes him able to symbolize, to project himself into the consequences of his acts, to conceptualize his plans, to weigh them, one against another, as a system of values… We, as men, are unique. We are the social solitaries … We are the creatures who have to create values in order to elucidate our own conduct, so that we learn from it and can direct it into the future (emphasis, mine)
Excuse my senile vanity, but I remember being taught by an ‘ancient’ GP in my first year of med school, in 1976. His name was Andrew Smith, and most of us thought him amazing in many ways. One of the stories that made a deep impression on me was how, the day after he graduated, he was delivering a baby using forceps in the mother’s own house at 3am. I would have been 18 or so and he in his early sixties, not far from where I am now. So, he would have been a medical student in the late 1930s, and I will probably stop practising medicine in the early 2020s. When I add the two professional lifetimes together at the extremes (med student to final year of practice) I am always amazed how big the number is: a span of 80 years or so. And one of our problems in undergraduate education is that we have to be concerned with these extremes: I am teaching students who will practise for another 40 years, but I have inherited a set of code written as many years in the past.
Now the above reminiscence was set off by some words from Benedict Evans. He is talking about much shorter timeframes and is concerned with the commercial world. But my question for medical students (and others) is: how is medicine really going to look in a few more score years, and how do we imagine all the system-wide interactions that will make the future so different? This is surely more meaningful than memorising biochemical pathways.
“Everything bad that the internet did to media is probably going to happen to retailers. The tipping point might now be approaching, particularly in the US, where the situation is worsened by the fact that there is far more retail square footage per capita than in any other developed market. And when the store closes and you turn to shopping online (or are simply forced to, if enough physical retail goes away), you don’t buy all the same things, any more than you read all the same things when you took your media consumption online. When we went from a corner store to a department store, and then from a department store to big box retail, we didn’t all buy exactly the same things but in different places – we bought different things. If you go from buying soap powder in Wal-Mart based on brand and eye-level placement to telling Alexa ‘I need more soap’, some of your buying will look different….In parallel to this, TV, which so far has not really been touched by the internet, is also starting to look unstable.”
Medicare, America’s public health scheme for the over-65s, has recently started paying doctors for in-depth conversations with terminally ill patients; other national health-care systems, and insurers, should follow.
The quote is from a reasonable article in the Economist (How to have a better death). But what screams at me is that the very incentive systems the Economist espouses are those that have led to the status quo. We already have behavioural code(s) that are misaligned, and now we add more and more buggy patches, layer upon layer. All because nobody talks to those on either side of the front line.
Nice letter in Academic Medicine. Not convinced by the exact details, but the author is on to something important. The first victim of insincerity is language (Orwell, if I remember correctly).
Medical professionalism is espoused as a necessity in health care, setting an important precedent of excellence and respect towards peers and patients. In many medical schools, a portion of the curriculum is dedicated to the intricacies of medical professionalism. Though typically taught through specific tenets and case studies, professionalism is still a general principle, resulting in varied definitions across institutions. This is, in fact, part of the beauty of professionalism—the lack of definition makes it a flexible concept, applicable in a wide variety of situations. However, the downside to this vagary is that it allows for the weaponization of professionalism, leaving space for “professionals” to reject certain approaches to health care.
I always recommend that people read David Healy’s Psychopharmacology 1, 2, and 3, together with Jack Scannell’s articles (here and here), to get a feel for exactly what drug discovery means. What is beyond doubt is that we are not as efficient at it as we once were. There is lots of blame to go around. The following gives a flavour of some of the issues (or at least one take on the core issues).
From a review by Christopher-Paul Milne in ‘Health Affairs’ of A Prescription For Change: The Looming Crisis In Drug Development, by Michael S. Kinch (Chapel Hill, NC: University of North Carolina Press, 2016).
He chronicles these industries’ long, strange trip from being the darling of the investor world and beneficiary of munificent government funding to standing on the brink of extinction, and he details the “slow-motion dismantlement” of their R&D capacity with cold, hard numbers because “the data will lead us to the truth.” There are many smaller truths, too: Overall, National Institutes of Health (NIH) funding has fallen by 25 percent in relative terms since a funding surge ended in 2003; venture capital is no longer willing to invest in product cycles that are eleven or twelve years long; and biotech companies may have to pay licensing fees on as many as forty patents for a decade before they even get to the point of animal testing and human trials….
In an effort to survive in such a costly and competitive environment, pharmaceutical companies have shed their high-maintenance R&D infrastructure, maintaining their pipelines instead by acquiring smaller (mostly biotech) companies, focusing on the less expensive development of me-too drugs, and buying the rights to promising products in late-stage development. As a consequence, biotech companies are disappearing (down from a peak of 140 in 2000 to about 60 in 2017), and the survivors must expend an increasing proportion of their resources on animal and human testing instead of the more innovative (and less costly) identification of promising leads and platform technologies. Similarly, some of academia’s R&D capacity, overbuilt in response to the NIH funding surge, now lies fallow, while seasoned experts and their promising protégés have moved on to other fields.
Higher education is an industry of the future — one in which the UK is a world-class player. Foreign universities are out to eat Britain’s lunch, and Mrs May’s obdurate stand is one of the best things that has ever happened to them.
Indeed, an industry of the future… “but not as you know it Jim”. FT
With many powerful academicians, lobbyists, professional societies, funding agencies, and perhaps even regulators shifting away from trials to observational data, even for licensing purposes, clinical medicine may be marching headlong to a massive suicide of its scientific evidence basis. We may experience a return to the 18th century, before the first controlled trial on scurvy. Yet, there is also a major difference compared with the 18th century: now we have more observational data, which means mostly that we can have many more misleading results.
I think the situation is even worse. Indeed, we can only grasp the nature of reality with action, not with contemplation (pace Ioannidis). But experiments (sic) such as RCTs are also part of the problem: we only understand the world by testing ideas that appear to bring coherence to the natural world. A/B testing is inadequate for this task — although it may well be all we have left.
G4S, the outsourcing company, has sold its US juvenile detention centres business for $57m. It said it had sold the business to BHSB, a US “behavioural health care services company” that provides services to troubled young people. FT.
Interesting interview in the FT with the African guitarist Lionel Loueke. If you like to think about learning and certification, it contains a couple of truths. The first is how technology can help. ‘Slow it down’ has helped many of us. Being able to record yourself, and then listen (a point Eric Clapton talks about), is an interesting example of how you blur the gap between private practice and the external ear provided by a teacher.
He first heard jazz when a friend played him cassettes by Wes Montgomery and George Benson. At first, Loueke didn’t even know that jazz was an improvised music. ‘I approached it like I was playing Afropop, and learnt it by ear,’ he says. ‘I slowed down the cassette by putting in weak batteries, then back to electricity to get the speed. That’s how I started jazz.’
And of course, certification has its limits, and the ‘place to learn’ is not always in the classroom. Papert’s ‘mathland’, revisited.
When guitarist Lionel Loueke was a teenager in Benin, boiling precious guitar strings in vinegar to make them last, he didn’t think that one day he’d be auditioning in Los Angeles for a place at the Thelonious Monk Institute of Jazz Performance. Or that the panel of jazz professors would include Wayne Shorter, Terence Blanchard and Herbie Hancock. And certainly not that Hancock would exclaim, ‘How about we just forget about the school and I take you on the road right now?’
Comment on an FT article. How things have changed. Even I can remember a colleague — a few years my senior — who went for a Wellcome Training Fellowship, only to be interviewed by one person, with the opening question being, ‘Imagine I am an intelligent layperson: tell me what you want to do!’
I was a war baby, a small farmer’s son and in 1960, at 17, I had a chat with my most trusted teacher about what I should do to apply to become a doctor for which I had just acquired a good group of Scottish highers. He advised me that because I should have applied a number of months before, to write a letter to the University enclosing my qualifications. I was asked to come and have a chat with the Bursar and the only thing I remember him saying was that my qualifications were good but did I realise that I might be preventing somebody else from getting in. I am ashamed to say that I replied that I was not really too troubled about that. I was accepted, and was fine.
When you want to find your way around a city, you might memorise key streets or more likely use a simplified map as a guide as you travel. But when you know a city, you navigate by being able to recall how you get from A to B. In fact you may have difficulty drawing a map — certainly to scale — but your memory is made up of lots of instances of what lies around a particular corner. Much of what you learn about diseases is the map in this analogy. By contrast, what the experienced clinician knows are lots of instances of what lies round particular corners. Those instances have a name: they are called patients.
You cannot change or reform undergraduate medical education in a significant way without changing the way doctors work and behave.
A: Science is the last institution where being honest is a quintessential part of what you’re doing. You can do banking and cheat, and you’ll make more money, and that money will still buy you the fast cars and the yachts. If you cheat in science, you’re not making more facts, you’re producing nonfacts, and that is not science. Science still has this chance of giving a lead to democratic societies because scientific values overlap strongly with democratic values.
Interview with Harry Collins about his book Gravity’s Kiss: The Detection of Gravitational Waves, Harry Collins, MIT Press, 2017, 414 pp.
That’s a question I just got at our most recent all-hands meeting. I’ve been reminding people that it’s Day 1 for a couple of decades. I work in an Amazon building named Day 1, and when I moved buildings, I took the name with me. I spend time thinking about this topic.
“Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1.”
Resist Proxies: As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2. A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.”
(Do you know what they know they want?)
Good inventors and designers deeply understand their customer. They spend tremendous energy developing that intuition. They study and understand many anecdotes rather than only the averages you’ll find on surveys. They live with the design. I’m not against beta testing or surveys. But you, the product or service owner, must understand the customer, have a vision, and love the offering. A remarkable customer experience starts with heart, intuition, curiosity, play, guts, taste. You won’t find any of it in a survey.
Jeff Bezos here. I dislike process: in education or research, whatever promise it offers is offset by its tendency to lead to institutional denigration of those who keep their eyes on reality.
I keep coming back to a few central insights that have — in the best sense of the word — disturbed my world view. These are from a wonderful article in a journal I had never heard of, written by Frank Davidoff. (But I do not buy the term ‘revolution’)
Competence, in contrast, is like “dark matter” in astronomy: although it makes up most of the universe of working knowledge, we understand relatively little about it. What does it really consist of? Which of its components are most important? How do people acquire it? What’s the best way to measure it? And how can you tell when they have enough of it?
Most importantly, it is increasingly clear that competence is acquired primarily through experiential learning – a four-element cycle (or spiral) in which learners move from direct personal involvement in experiences, to reflection on those experiences, integration of their observations with sense-making concepts and mental models, and finally back to more experiences. Formal training for all high-performance (applied) professions, for example, music, architecture, theater, and athletics, is grounded in the unique requirements of experiential learning: case-based coaching, rather than lectures by content experts; hands-on, practicum experiences (including simulations, if necessary) in addition to written end-objectives; repeated experiences and outcome evaluations over time rather than initial, one-shot exercises; and, ultimately, acquisition of the advanced skills of “reflection-in-action,” which is required for high-level performance and “reflection-on-action,” which is required for continued self-evaluation and self-instruction (Schon, 1987).
Mens Sana Monographs, 2008, Volume 6, Issue 1, pp. 29–40
Focus on Performance: The 21st Century Revolution in Medical Education
Bruce Alberts talks a lot of sense about science education and education in general. And of course he produced a book that ‘educated’ a whole generation (or more) of people like me. But in this recent Science piece he is taking on some of the big questions, questions that have been asked before but that few have managed to follow through on. As ever, the emphases are mine.
In previous commentaries on this page, I have argued that “less is more” in science education, and that learning how to think like a scientist—with an insistence on using evidence and logic for decision-making—should become the central goal of all science educators. I have also pointed out that, because introductory science courses taught at universities define what is meant by “science education,” college science faculty are the rate-limiting factor for dramatically improving science education at lower levels.
For example, there is a long-standing belief that every introductory college biology course must “cover” a staggering amount of knowledge. There is no time to focus on a much more important goal—insisting that every student understand exactly how scientific knowledge is generated. Science is not a belief system; it is, instead, a very special way of learning about the true nature of the observable world.
His phrase, “college science faculty are the rate-limiting factor for dramatically improving science education at lower levels”, could equally apply to medicine and medical teachers. It is not hyperbole to say these are some of the central problems of our time. And it is not just science education that is the issue.