Quote of the day

by reestheskin on 11/05/2017

Comments are disabled

“Elegance is not a dispensable luxury but a quality that decides between success and failure”

Edsger Dijkstra EWD 1284

Stronzo Bestiale: The triumph of metadata over meaning

by reestheskin on 10/05/2017

Comments are disabled

Spectral authors also haunt the scientific canon. One physicist, frustrated at having his paper repeatedly rejected, finally saw it published after changing the title and adding a fictitious co-author, Stronzo Bestiale. It means “total asshole” in Italian.

Seriously, if you had described the world we have now of predatory journals and the tyranny of metrics, would any sane scientist in 1960 have thought it possible? Uncle Syd once remarked that people no longer read papers, they just xeroxed them. Now we do not even do that: metadata is all.

FT

Unity in variety

by reestheskin on 09/05/2017

Comments are disabled

I thought I would have read this before, but maybe I put it to one side and foolishly forgot. It is a fitting description of Jacob Bronowski by his wife, Rita. One thing — amongst many — caught my eye.

As a very young man he would travel miles every week to outlying villages in England to give what were called Workers’ Educational Association lectures. Quite literally he would travel through snow and fog to village halls to speak to 8 or 10 people who had equally braved the elements. I sometimes would think it a pity there were not hundreds there to hear him. Little did I imagine that with radio and then television he would in fact finally reach millions.

And I would respond: you have to want to learn, and you have to want to educate.

But I can’t stop here. One bit of the jigsaw I didn’t know:

After receiving his Ph.D. and conducting 3 years of research, it became clear that, being a Jew, Bruno would not be made a Fellow of his college (Jesus College, Cambridge). He decided to ‘drop out’. Like so many young students (hippies, 30 years later), bearded and down-at-heel, he went to Paris to write. There he met, among others, Samuel Beckett, and they jointly edited an anthology called European Caravan.

It ends with his own words:

What makes the biological machinery of man so powerful is that it modifies his actions through his imagination. It makes him able to symbolize, to project himself into the consequences of his acts, to conceptualize his plans, to weigh them, one against another, as a system of values… We, as men, are unique. We are the social solitaries … We are the creatures who have to create values in order to elucidate our own conduct, so that we learn from it and can direct it into the future. (emphasis mine)

In LEONARDO, Vol. 18, No. 4, pp. 223–225, 1985

Medical practice 2060

by reestheskin on 08/05/2017

Comments are disabled

Excuse my senility/vanity, but I remember being taught by an ‘ancient’ GP in my first year of med school, in 1976. His name was Andrew Smith, and most of us thought him amazing in many ways. One of the stories that made a deep impression on me was how — the day after he graduated — he was delivering a baby using forceps in the mother’s own house at 3am. I would have been 18 or so and he in his early sixties — not far from where I am now. So, he would have been a medical student in the late 1930s, and I will probably stop practising medicine in the early 2020s. When I add the two professional lifetimes together at the extremes (med student to final year of practice) I am always amazed how big the number is — a span of 80 years or so. And one of our problems in undergraduate education is that we have to be concerned with these extremes: I am teaching students who will practise for another 40 years, but I have inherited a set of code written as many years in the past.

Now the above reminiscence was set off by some words from Benedict Evans. He is talking about much shorter timeframes and is concerned with the commercial world. But my question for medical students (and others) is: how is medicine really going to look in a few more score years, and how do we imagine all the system-wide interactions that will make the future so different? This is surely more meaningful than memorising biochemical pathways.

“Everything bad that the internet did to media is probably going to happen to retailers. The tipping point might now be approaching, particularly in the US, where the situation is worsened by the fact that there is far more retail square footage per capita than in any other developed market. And when the store closes and you turn to shopping online (or are simply forced to, if enough physical retail goes away), you don’t buy all the same things, any more than you read all the same things when you took your media consumption online. When we went from a corner store to a department store, and then from a department store to big box retail, we didn’t all buy exactly the same things but in different places – we bought different things. If you go from buying soap powder in Wal-Mart based on brand and eye-level placement to telling Alexa ‘I need more soap’, some of your buying will look different….In parallel to this, TV, which so far has not really been touched by the internet, is also starting to look unstable.”

The code is sick

by reestheskin on 05/05/2017

Comments are disabled

Medicare, America’s public health scheme for the over-65s, has recently started paying doctors for in-depth conversations with terminally ill patients; other national health-care systems, and insurers, should follow.

The quote is from a reasonable article in the Economist (How to have a better death). But what screams at me is that the very incentive systems the Economist espouses are those that have led to the status quo. We already have behavioural code(s) that are misaligned, and now we add more and more buggy patches, layer upon layer. All because nobody talks to those on either side of the front line.

The Weaponization of Medical Professionalism

by reestheskin on 04/05/2017

Comments are disabled

Nice letter in Academic Medicine. Not convinced by the exact details, but the author is on to something important. The first victim of insincerity is language (Orwell, if I remember correctly).

Medical professionalism is espoused as a necessity in health care, setting an important precedent of excellence and respect towards peers and patients. In many medical schools, a portion of the curriculum is dedicated to the intricacies of medical professionalism. Though typically taught through specific tenets and case studies, professionalism is still a general principle, resulting in varied definitions across institutions. This is, in fact, part of the beauty of professionalism—the lack of definition makes it a flexible concept, applicable in a wide variety of situations. However, the downside to this vagary is that it allows for the weaponization of professionalism, leaving space for “professionals” to reject certain approaches to health care.

It’s hard to get a fix

by reestheskin on 03/05/2017

Comments are disabled

I always recommend that people read David Healy’s Psychopharmacology 1, 2, and 3, together with Jack Scannell’s articles (here and here), to get a feel for exactly what drug discovery means. What is beyond doubt is that we are not as efficient at it as we once were. There is lots of blame to go around. The following gives a flavour of some of the issues (or at least one take on the core issues).

From a review in ‘Health Affairs’ of A Prescription For Change: The Looming Crisis In Drug Development by Michael S. Kinch (Chapel Hill, NC: University of North Carolina Press, 2016), written by Christopher-Paul Milne.

He chronicles these industries’ long, strange trip from being the darling of the investor world and beneficiary of munificent government funding to standing on the brink of extinction, and he details the “slow-motion dismantlement” of their R&D capacity with cold, hard numbers because “the data will lead us to the truth.” There are many smaller truths, too: Overall, National Institutes of Health (NIH) funding has fallen by 25 percent in relative terms since a funding surge ended in 2003; venture capital is no longer willing to invest in product cycles that are eleven or twelve years long; and biotech companies may have to pay licensing fees on as many as forty patents for a decade before they even get to the point of animal testing and human trials….

In an effort to survive in such a costly and competitive environment, pharmaceutical companies have shed their high-maintenance R&D infrastructure, maintaining their pipelines instead by acquiring smaller (mostly biotech) companies, focusing on the less expensive development of me-too drugs, and buying the rights to promising products in late-stage development. As a consequence, biotech companies are disappearing (down from a peak of 140 in 2000 to about 60 in 2017), and the survivors must expend an increasing proportion of their resources on animal and human testing instead of the more innovative (and less costly) identification of promising leads and platform technologies. Similarly, some of academia’s R&D capacity, overbuilt in response to the NIH funding surge, now lies fallow, while seasoned experts and their promising protégés have moved on to other fields.

But not as you know it, Jim

by reestheskin on 02/05/2017

Comments are disabled

Higher education is an industry of the future — one in which the UK is a world-class player. Foreign universities are out to eat Britain’s lunch, and Mrs May’s obdurate stand is one of the best things that has ever happened to them.

Indeed, an industry of the future… “but not as you know it, Jim”. FT

The sweat guy

by reestheskin on 01/05/2017

Comments are disabled

My first publication was on eccrine sweating. I was known (for a while) as the ‘sweat guy’, too. Only at work, I note.

 

Big ideas rather than big data, please

by reestheskin on 28/04/2017

Comments are disabled

With many powerful academicians, lobbyists, professional societies, funding agencies, and perhaps even regulators shifting away from trials to observational data, even for licensing purposes, clinical medicine may be marching headlong to a massive suicide of its scientific evidence basis. We may experience a return to the 18th century, before the first controlled trial on scurvy. Yet, there is also a major difference compared with the 18th century: now we have more observational data, which means mostly that we can have many more misleading results.

John P.A. Ioannidis

I think the situation is even worse. Indeed, we can only grasp the nature of reality with action, not with contemplation (pace Ioannidis). But experiments (sic), as in RCTs, are also part of the problem: we only understand the world by testing ideas that appear to bring coherence to the natural world. A/B testing is inadequate for this task — although it may well be all we have left.

Behavioural health care services company

by reestheskin on 27/04/2017

Comments are disabled

G4S, the outsourcing company, has sold its US juvenile detention centres business for $57m. It said it had sold the business to BHSB, a US “behavioural health care services company” that provides services to troubled young people. FT.

We know where this ends up.

Guitar talk

by reestheskin on 26/04/2017

Comments are disabled

Interesting interview in the FT with the African guitarist Lionel Loueke. If you like to think about learning and certification, it contains a couple of truths. The first is how technology can help. ‘Slow it down’ has helped many of us. Being able to record yourself, and then listen (a point Eric Clapton talks about), is an interesting example of how you blur the gap between private practice and the external ear provided by a teacher.

He first heard jazz when a friend played him cassettes by Wes Montgomery and George Benson. At first, Loueke didn’t even know that jazz was an improvised music. ‘I approached it like I was playing Afropop, and learnt it by ear,’ he says. ‘I slowed down the cassette by putting in weak batteries, then back to electricity to get the speed. That’s how I started jazz.’

And of course, certification has its limits, and the ‘place to learn’ is not always in the classroom. Papert’s ‘mathland’, revisited.

When guitarist Lionel Loueke was a teenager in Benin, boiling precious guitar strings in vinegar to make them last, he didn’t think that one day he’d be auditioning in Los Angeles for a place at the Thelonious Monk Institute of Jazz Performance. Or that the panel of jazz professors would include Wayne Shorter, Terence Blanchard and Herbie Hancock. And certainly not that Hancock would exclaim, ‘How about we just forget about the school and I take you on the road right now?’

The raw eggs bit, I could not manage

by reestheskin on 25/04/2017

Comments are disabled

Emma Morano’s singular achievement in life may have been perseverance. She lived for 117 years, crediting her longevity to raw eggs and her lack of a husband. She died on April 15.

NYT

“Anyone who tries to make a distinction between education and entertainment doesn’t know the first thing about either.” — Marshall McLuhan

Getting into medical school

by reestheskin on 22/04/2017

Comments are disabled

Comment on an FT article. How things have changed. Even I can remember a colleague — a few years my senior — who went for a Wellcome Training Fellowship, only to be interviewed by one person, with the opening question being, ‘Imagine I am an intelligent layperson: tell me what you want to do!’

McRae

I was a war baby, a small farmer’s son and in 1960, at 17, I had a chat with my most trusted teacher about what I should do to apply to become a doctor for which I had just acquired a good group of Scottish highers. He advised me that because I should have applied a number of months before, to write a letter to the University enclosing my qualifications. I was asked to come and have a chat with the Bursar and the only thing I remember him saying was that my qualifications were good but did I realise that I might be preventing somebody else from getting in. I am ashamed to say that I replied that I was not really too troubled about that. I was accepted, and was fine.

Maps, and learning medicine

by reestheskin on 20/04/2017

Comments are disabled

When you want to find your way around a city, you might memorise key streets or more likely use a simplified map as a guide as you travel. But when you know a city, you navigate by being able to recall how you get from A to B. In fact you may have difficulty drawing a map — certainly to scale — but your memory is made up of lots of instances of what lies around a particular corner. Much of what you learn about diseases is the map in this analogy. By contrast, what the experienced clinician knows are lots of instances of what lies round particular corners. Those instances have a name: they are called patients.

You cannot change or reform undergraduate medical education in a significant way without changing the way doctors work and behave.

Why science is important

by reestheskin on 18/04/2017

Comments are disabled

Q: What’s at stake when scientists fib?

A: Science is the last institution where being honest is a quintessential part of what you’re doing. You can do banking and cheat, and you’ll make more money, and that money will still buy you the fast cars and the yachts. If you cheat in science, you’re not making more facts, you’re producing nonfacts, and that is not science. Science still has this chance of giving a lead to democratic societies because scientific values overlap strongly with democratic values.

Interview with Harry Collins about his book Gravity’s Kiss: The Detection of Gravitational Waves (MIT Press, 2017, 414 pp.).

“Jeff, what does Day 2 look like?”

by reestheskin on 17/04/2017

Comments are disabled

That’s a question I just got at our most recent all-hands meeting. I’ve been reminding people that it’s Day 1 for a couple of decades. I work in an Amazon building named Day 1, and when I moved buildings, I took the name with me. I spend time thinking about this topic.

“Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1.”

Resist Proxies: As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2. A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.”

(Do you know what they know they want?)

Good inventors and designers deeply understand their customer. They spend tremendous energy developing that intuition. They study and understand many anecdotes rather than only the averages you’ll find on surveys. They live with the design. I’m not against beta testing or surveys. But you, the product or service owner, must understand the customer, have a vision, and love the offering. A remarkable customer experience starts with heart, intuition, curiosity, play, guts, taste. You won’t find any of it in a survey.

Jeff Bezos here. I dislike process, and in education or research, whatever promise it offers is offset by its tendency to lead to institutional denigration of those who keep their eyes on reality.

Via Benedict’s Newsletter

I keep coming back to

by reestheskin on 14/04/2017

Comments are disabled

I keep coming back to a few central insights that have — in the best sense of the word — disturbed my world view. These are from a wonderful article in a journal I had never heard of, written by Frank Davidoff. (But I do not buy the term ‘revolution’)

Competence, in contrast, is like “dark matter” in astronomy: although it makes up most of the universe of working knowledge, we understand relatively little about it. What does it really consist of? Which of its components are most important? How do people acquire it? What’s the best way to measure it? And how can you tell when they have enough of it?

Most importantly, it is increasingly clear that competence is acquired primarily through experiential learning – a four-element cycle (or spiral) in which learners move from direct personal involvement in experiences, to reflection on those experiences, integration of their observations with sense-making concepts and mental models, and finally back to more experiences. Formal training for all high-performance (applied) professions, for example, music, architecture, theater, and athletics, is grounded in the unique requirements of experiential learning: case-based coaching, rather than lectures by content experts; hands-on, practicum experiences (including simulations, if necessary) in addition to written end-objectives; repeated experiences and outcome evaluations over time rather than initial, one-shot exercises; and, ultimately, acquisition of the advanced skills of “reflection-in-action,” which is required for high-level performance and “reflection-on-action,” which is required for continued self-evaluation and self-instruction (Schon, 1987).

Mens Sana Monographs, 2008, Volume 6, Issue 1, pp. 29–40
Focus on Performance: The 21st Century Revolution in Medical Education

Not a ragbag of facts

by reestheskin on 13/04/2017

Comments are disabled

Bruce Alberts talks a lot of sense about science education and education in general. And of course he produced a book that ‘educated’ a whole generation (or more) of people like me. But in this recent Science piece he is taking on some of the big questions, questions that have been asked before but which few have managed to follow through on. As ever, the emphases are mine.

In previous commentaries on this page, I have argued that “less is more” in science education, and that learning how to think like a scientist—with an insistence on using evidence and logic for decision-making—should become the central goal of all science educators. I have also pointed out that, because introductory science courses taught at universities define what is meant by “science education,” college science faculty are the rate-limiting factor for dramatically improving science education at lower levels.

For example, there is a long-standing belief that every introductory college biology course must “cover” a staggering amount of knowledge. There is no time to focus on a much more important goal—insisting that every student understand exactly how scientific knowledge is generated. Science is not a belief system; it is, instead, a very special way of learning about the true nature of the observable world.

His phrase, “college science faculty are the rate-limiting factor for dramatically improving science education at lower levels”, could equally apply to medicine and medical teachers. It is not hyperbole to say these are some of the central problems of our time. And it is not just science education that is the issue.

Idea factories

by reestheskin on 12/04/2017

Comments are disabled

Universities are idea factories. Current corporatization approaches emphasize the factory rather than the ideas.

Ralf Buckley in Nature. I would say — for the short term at least — that unless somebody finds a way to create new ‘dissenting academies’, things in UK higher ed will get worse.

Core service training

by reestheskin on 11/04/2017

Comments are disabled

“Core surgical training in the UK has been dubbed “core service training” because many trainees believe it does not provide enough surgical experience. At the southern tip of Africa, I felt I was being taught to operate, not to just watch and hold retractors. My commitment and progression were judged on hard work and merit, not on how many courses I had attended.”

Here.

Incentives matter, especially the wrong ones

by reestheskin on 10/04/2017

Comments are disabled

Given your past views on measuring quality in universities, what do you think of the teaching excellence framework, which the government would like to use to measure teaching quality?

The government needs to think more about the evidence we have showing that measuring performance, and in particular ranking performance, creates strong incentives – but all too often the wrong incentives.

What is the biggest threat facing higher education today?

Too much emphasis on comparative achievement, not enough on the pleasure of learning or the importance of doing at least some things really well.

Amen. Onora O’Neill interviewed in the THE

Software is eating the clinic

by reestheskin on 07/04/2017

Comments are disabled

There was an interesting paper published in Nature recently on the topic of automated skin cancer diagnosis. Readers of my online work will know it is a topic close to my heart.

Here is the text of a guest editorial I wrote for Acta about the paper. Acta is a ‘legacy’ journal that made the leap to full OA under Anders Vahlquist’s supervision a few years back — it is therefore my favourite skin journal. This month’s edition is the first without a paper copy, existing just online. The link to the edited paper and references is here. I think this is the first paper in their first online-only edition :-). Software is indeed eating the world.


 

When I was a medical student close to graduation, Sam Shuster, then Professor of Dermatology in Newcastle, drew my attention to a paper that had just been published in Nature. The paper, from the laboratory of Robert Weinberg, described how DNA from human cancers could transform cells in culture (1). I tried reading the paper, but made little headway because the experimental methods were alien to me. Sam did better, because he could distinguish the underlying melody from the supporting orchestration. He told me that whilst there were often good papers in Nature, perhaps only once every ten years or so would you read a paper that would change both a field and the professional careers of many scientists. He was right. The paper by Weinberg was one of perhaps fewer than a dozen that defined an approach to the biology of human cancer that still resonates forty years later.

Revolutionary papers in science have one of two characteristics. They are either conceptual, offering a theory that is generative of future discovery — think DNA, and Watson and Crick. Or they are methodological, allowing what was once impossible to become almost trivial — think DNA sequencing or CRISPR technology. Revolutions in medicine are slightly different, however. Yes, of course, scientific advance changes medical practice, but to fully understand clinical medicine we need to add a third category of revolution. This third category comes from papers that change the everyday lives of what doctors do and how they work. Examples would include fibreoptic instrumentation and modern imaging technology. To date, dermatology has escaped such revolutions, but a paper recently published in Nature suggests that our time may have come (2).

The core clinical skill of the dermatologist is categorising morphological states in a way that informs prognosis with, or without, a therapeutic intervention. Dermatologists are rightly proud of these perceptual skills, although we have little insight as to how this expertise is encoded in the human brain. Nor should we be smug about our abilities as, although the domains are different, the ability to classify objects in the natural world is shared by many animals, and often appears effortless. Formal systems of education may be human-specific, but the cortical machinery that allows such learning is widespread in nature.

There have been two broad approaches to imitating these skills in silico. In the first, particular properties (shape, colour, texture etc.) are explicitly identified and, much as we might add variables in a linear regression equation, the information is used to try to discriminate between lesions in an explicit way. Think of the many papers using rule-based strategies such as the ABCD system (3). This is obviously not the way the human brain works: a moment’s reflection about how fast an expert can diagnose skin cancers, and how limited we are in being able to handle formal mathematics, tells us that human perceptual skills do not work like this.
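To make that contrast concrete, here is a minimal sketch of the explicit-feature approach, in the spirit of (but not reproducing) rule-based schemes such as ABCD: hand-crafted scores for a lesion are fed into a regression-style classifier. The feature names, numbers and labels below are entirely hypothetical, for illustration only.

```python
# Toy illustration of an 'explicit feature' classifier.
# All feature values and labels are invented; this is not the ABCD rule,
# nor any published scoring system.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [asymmetry, border_irregularity, colour_variegation, diameter_mm]
X = np.array([
    [0.9, 0.8, 0.7, 7.5],   # hypothetical lesion, later confirmed malignant
    [0.2, 0.1, 0.2, 3.0],   # hypothetical benign naevus
    [0.8, 0.6, 0.9, 6.0],
    [0.1, 0.2, 0.1, 2.5],
])
y = np.array([1, 0, 1, 0])  # 1 = melanoma, 0 = benign

model = LogisticRegression().fit(X, y)

new_lesion = np.array([[0.7, 0.5, 0.6, 5.5]])
print(model.predict_proba(new_lesion)[0, 1])  # estimated probability of malignancy
```

The point is simply that every variable has to be chosen, measured and weighted explicitly; nothing is learned that was not first written down.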

There is an alternative approach, one that to some extent almost seems like magic. The underlying metaphor is as follows. When a young child learns to distinguish between cats and dogs, we know the language of explicit rules is not used: children cannot handle multidimensional mathematical space or complicated symbolic logic. But feedback on what the child thinks allows the child to build up his or her own model of the two categories (cats versus dogs). With time, and with positive and negative feedback, the accuracy of the perceptual skills increases — but without any formal rules that the child could write down or share. And of course, since it is a human being we are talking about, we know all of this process takes place within and between neurons.

Over four decades ago, computing scientists started to model the way that they believed collections of neurons worked. In particular, it became clear that groups of in silico neurons could order the world based on positive and negative feedback. The magic is that we do not have to explicitly program their behaviour; rather, they just learn, but — since this is not magic after all — we have got much better at building such self-learning machines. (I am skipping any detailed explanation of such ‘deep learning’ strategies here.) What gives this field its current immediacy is a combination of increases in computing power, previously unimaginably large data sets (for training), advances in how to encode such ‘deep learning’, and wide potential applicability — from email spam filtering, terrorist identification, and online recommendation systems, to self-driving cars. And medical imaging along the way.
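As a rough sketch of what such self-learning machinery looks like in code (this is emphatically not the pipeline used in the Nature paper; the library, network, folder layout and class names below are my own illustrative assumptions), a pretrained convolutional network can be fine-tuned on nothing more than pixels plus diagnostic labels:

```python
# Generic sketch: fine-tune a pretrained CNN on labelled skin images.
# Assumes PyTorch/torchvision and a hypothetical folder layout:
#   skin_images/benign/...   skin_images/malignant/...
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

dataset = datasets.ImageFolder("skin_images", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")   # pretrained feature extractor
model.fc = nn.Linear(model.fc.in_features, 2)      # new head: benign vs malignant

criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):                              # a few passes over the data
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)     # feedback on each guess
        loss.backward()                             # adjust the in silico 'neurons'
        optimizer.step()
```

No rules about asymmetry, border or colour appear anywhere; the network builds its own internal representation from the feedback alone.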

In the Nature paper by Thrun and colleagues (2), such ‘deep learning’ approaches were used to train computers on over 100,000 medical images of skin cancer or mimics of skin cancer. The inputs were therefore ‘pixels’ and the diagnostic category (only). If this last sentence does not shock you, you are either an expert in machine learning, or you are not paying attention. The ‘machine’ was then tested on a new sample of images and — since modesty is not a characteristic of a young science — its performance was compared with that of over twenty board-certified dermatologists. Using standard receiver operating characteristic (ROC) curves to assess performance, the machine equalled, if not out-performed, the humans.
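For readers less familiar with ROC analysis, the comparison works roughly like this (every number below is invented, purely to show the mechanics): the algorithm outputs a continuous malignancy score, which traces out a whole curve of sensitivity/specificity trade-offs, whereas each dermatologist contributes a single operating point that can then be checked against that curve.

```python
# Illustrative ROC comparison; all values are invented.
import numpy as np
from sklearn.metrics import roc_curve, auc

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])   # 1 = malignant (ground truth)
y_score = np.array([0.9, 0.2, 0.8, 0.6, 0.3, 0.1, 0.7, 0.4, 0.95, 0.25])

fpr, tpr, _ = roc_curve(y_true, y_score)
print("Algorithm AUC:", auc(fpr, tpr))

# Each clinician is a single (sensitivity, specificity) point; the algorithm
# 'dominates' that clinician if some threshold gives both higher sensitivity
# and higher specificity.
clinicians = [(0.85, 0.70), (0.90, 0.60)]             # invented operating points
for sens, spec in clinicians:
    dominated = any(t >= sens and (1 - f) >= spec for f, t in zip(fpr, tpr))
    print(f"sens={sens}, spec={spec}, algorithm dominates: {dominated}")
```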

There are of course some caveats. The dermatologists were only looking at single photographic images, not the patients (4); the images are possibly not representative of the real world; and some of us would like to know more about the exact comparisons used. However, I would argue that there are also many reasons for imagining that the paper may underestimate the power of this approach: it is striking that the machine was learning from images that were relatively unstandardised and perhaps noisy in many ways. And if 100,000 seems large, it is still only a fraction of the digital images that are acquired daily in clinical practice.

It is no surprise that the authors mention the possibilities of their approach when coupled with the most ubiquitous computing device on this planet — the mobile phone. Thinking about the impact this will have on dermatology and dermatologists would require a different sort of paper from the present one but, as Marc Andreessen once said (4), ‘software is eating the world’. Dermatology will survive, but dermatologists may be on the menu.


 

Full paper with references on Acta is here.

On not dropping your anchor.

by reestheskin on 06/04/2017

Comments are disabled

From the Obit of Derek Walcott.

He would cup a breast as he fondled a white stone from the beach. These propensities, noted when he was teaching in America in the 1980s and 1990s, cost him the chance to be, in 1999, Britain’s poet laureate and, ten years later, professor of poetry at Oxford. He was not concerned, for he did not want to drop his anchor long on any northern shore.

Economist

No RAE/REF. The view from Ireland

by reestheskin on 05/04/2017

Comments are disabled

‘I think we’re seeing the benefits of a good funding environment, and – to be frank – no research excellence framework’

Brexit and the Emerald Isle. Your mileage may vary. Here.

It is odd to live in a country whose very name—the United Kingdom—sounds increasingly sarcastic.

FT. Obvious, but puzzled as I haven’t seen it everywhere.

Writing by candlelight

by reestheskin on 03/04/2017

Comments are disabled

Last month, for example, the University of Copenhagen fired seismologist Hans Thybo, president of the European Geosciences Union. The official explanation for Thybo’s dismissal — his alleged use of private e-mail for work, and telling a postdoc that it is legitimate to openly criticize university management — seems petty in the extreme.

Nature December 2016. Little hygge on show here, then.

Worst thing about being a medical student? Feeling like a spare part

by reestheskin on 31/03/2017

Comments are disabled

A little while back, after a teaching session, I asked — as I often do — a googly: ‘What is the worst thing about being a medical student?’ The response: ‘Feeling like a spare part’.

My quick and facetious response was to argue that at least spare parts were useful, whereas students were (usually) not. The humour was appreciated 🙂

But if you think this through, it is possible to argue that students are indeed less useful than they once were. At one time, final year students were a key component of clinical service. They clerked people, they could insert iv lines, write up drugs, and they were around long enough in one environment for people to make meaningful judgments of their abilities and remember them. And to be able to trust them. They could do paid locums, and even when they were not being paid, they could not be absent, nor did you need to formalise start and finish times. One of my colleagues reported that he used to be able to ‘prescribe anything apart from diamorphine’ as a final year student.

This all raises some interesting questions:

  1. Medical education may be getting worse. This is not to attach blame, merely a speculation based on what I see. There are indeed more educationalists, but making things explicit is easily confused with competence that is implicit. It is even conceivable that those who ran medical schools were once more thoughtful, possibly because they were not juggling so many roles, and because the environment was less cluttered by regulators.
  2. Engagement with clinical service helps learning, but the opportunities for this may be diminishing. Again, this is in part a change in the environment. The inpatient service is not where most medicine occurs: the office is (whether in the OPD or in primary care).
  3. I wonder if we are sequencing medical education incorrectly. The pressure is for more and more training and less and less education, but the facilities for training are diminishing all the time and are largely outwith the control of the universities. In a health service that is falling to bits, medical students are not a priority (and expect things to get worse). The NHS has never taught doctors: doctors have taught doctors. If culture eats strategy — as the saying goes — the toxic nature of much NHS provision will negate all those lectures on ethics, fairness and sense of vocation. And don’t mention resilience.
  4. Never promise ‘engagement’ when you cannot deliver it, otherwise disenchantment grows. We overpromise, and universities increasingly advertise with exaggerated claims. Scholarship and advertising are opposing world belief systems.
  5. Comparisons with some other EU countries make me wonder if we should revisit the boundary between ‘student’ and ‘practice’.
  6. Our ideas of medical education owe much to a time when most people did not go to university. 

Medical education is indeed always advancing, but many of us think it is increasingly out of phase with the world we live in. Magnitudes matter.