That said, his [Chris Bustamente, president of Rio Salado College] discussion underscored some of the stark labor realities driving the proposed solutions for increased access to higher ed. Rio Salado [in the USA] educates 60,000 students with 22 full-time faculty and 1,500 adjuncts. Let me say that again: Rio Salado educates 60,000 students with 22 full-time faculty and 1,500 adjuncts. And while a small percentage of these part-time faculty may do it for the love of teaching, as Bustamente suggested, it’s all but certain the vast majority are teaching on subsistence wages to eke out a living, much like many of the students they serve. Such a mixed message about the power of a college education to set you free, at least financially, hasn’t been lost on me since my first adjuncting gig in 1997.
Every time I hear the term line-manager used about an academic, retirement gets a day closer.
But the great JK Galbraith (senior) had some words of his own on line-management (Galbraith, a famous Harvard Professor of Economics, was ambassador to India for JFK).
Galbraith proved up to the task, in part, as Bruce Riedel writes in “JFK’s Forgotten Crisis”, because he had access to the president and his aides. Most ambassadors report to the State Department, but the blunt Galbraith told the president that going through those channels was “like trying to fornicate through a mattress”.
I had forgotten this piece I wrote a few years back for Reto Caduff’s amazing book on Freckles. Here it is:
Imagine at some future time, two young adults meet on an otherwise deserted planet. They are both heavily freckled. What would this tell us about them, their ancestors and how they have spent their time? First, all of us learn early in life that skin colour and marks like freckles are unequally distributed across the people of this earth. They are most common in people with pale skin, especially so if they have red hair, and we all know that we get our skin colour from our parents. Second, freckles are most common in those who have spent a lot of time in the sun. So freckles betray both something about our ancestors, and how we ourselves have lived our life.
Skin colour varies across the earth, and the chief determinant of this variation has been the interaction between sunshine (more particularly ultraviolet radiation) and our skin over the last 5 to 50 thousand years. Dark skin is adapted so as to protect against excessive sunshine, whereas we think light skin is better adapted to areas where the sun shines less. As some humans migrated out of Africa, say 50,000 years ago, a series of changes or mutations occurred in many genes to make their skin lighter. Their skin became more sensitive to both the good and the harmful effects of ultraviolet radiation. One way this change was accomplished was the development of changes in a gene called the melanocortin 1 receptor (MC1R), a gene we could also call a gene for freckles.
Skin and hair colour arises from a mixture of two types of the pigment melanin: brown, or eumelanin, and red, or pheomelanin. If the MC1R works effectively, eumelanin is favoured; if the MC1R works less well, pheomelanin is favoured. We know that the change to pheomelanin is associated with skin that is more sensitive to sunshine. When people who harbour changes in MC1R are exposed to sun, they are much more likely to develop freckles than those who carry no changes in their MC1R.
And what about the freckles themselves? They are just tiny areas of melanin production. Ironically, to the best of our knowledge the freckles themselves seem to protect against the sun quite effectively. It is the non-freckled areas that are most sensitive to the sun. If in a bid to protect skin against the harmful effects of excessive sun, we were to join all the freckles up, the sensitivity might disappear. Of course, on a planet located far away in time and place, our two young adults might already possess the technology to join all their freckles together. It is just that they chose not to.
A year ago, “TT [tenure track] or bust” was a common but ill-advised attitude toward the job market. That attitude should be unthinkable today. COVID-19 is an accelerant to a fire in academia that has been raging for at least a decade. When that fire is finally extinguished, the landscape of higher education will be unrecognizable at best and decimated at worst.
After he had been dismissed from government, and implicated in the anti-Medici conspiracy, Machiavelli was imprisoned and tortured before returning to the family farm. But his passions ran deep.
…Machiavelli was unable to turn his mind from politics. ‘I could not help but fill your head with castles in the air,’ he wrote to Vettori in 1513, ‘because since Fortune has seen to it that I do not know how to talk about either the silk or wool trade, profits or losses, I have to talk about politics.’ He spent the days chewing the fat with woodcutters on the farm and playing cricca in the tavern. But in the evening, he told Vettori,
I return home and enter my study; on the threshold I take off my workday clothes, covered with mud and dirt, and put on the garments of court and palace. Fitted out appropriately, I step inside the venerable court of the ancients, where, solicitously received by them, I nourish myself on that food that alone is mine and for which I was born; where I am unashamed to converse with them … and they, out of their human kindness, answer me. And for four hours at a time I feel no boredom, I forget all my troubles, I do not dread poverty, and I am not terrified by death. I absorb myself into them completely. And because Dante says that: no one understands anything unless he retains [it], I have jotted down what I have profited from in their conversation and composed a short study, De principatibus. [emphasis added]
I cannot see the future, but like many, I have private models that I use to order the world, and for which I often have very little data. For instance, I think it obvious that the traditional middle-class professions (medicine, law, veterinary medicine, architecture, dentistry, academia) are increasingly unattractive as careers1. I am not complaining about my choices — far from it; I benefited from the tailwinds of the dramatic social change that wars and other calamities bring. But my take on what has happened to school teachers and teaching is the model for what will happen to many others. I say this with no pleasure: there are few jobs more important. But the tragedy of schoolteaching — which is our tragedy — will continue to unfold as successive gangs of politicians of either stripe, armed with nothing more than some borrowed bullet points, play to the gallery. Similarly, in higher education, within a timescale of almost 40 years, I have seen at first hand changes that would make me argue that not only are the days of Donnish Dominion (to use Halsey’s phrase2) well and truly over, but that most UK universities will be unable to recruit the brightest to their cause. I think we see that in clinical academia already — and not just in the UK. Amidst all those shiny new buildings moulded for student experience (and don’t forget the wellness centres…), the ennui of corporate mediocrity beckons. The bottom line is the mission statement.
As for medicine, a few quotes below from an FT article from late last year. I assume that without revolutionary change, we will see more and more medical students, and more and more doctors leaving mid-career. If you keep running to stand still, the motivation goes. And that is without all the non-COVID-19 effects of COVID-19.
One of the major factors for doctors is the electronic record system. It takes a physician 15 clicks to order a flu shot for a patient, says Tait. And instead of addressing this problem, healthcare companies end up offering physicians mindfulness sessions and healthy food options in the cafeteria, which only frustrates them further…[emphasis added]
Over the past few years, efforts have been made to increase the number of medical schools in the US to ensure that there is no shortage of doctors. “When you think about how much we’ve invested to create, roughly, 10 to 12 new medical schools in the last decade, at hundreds of millions of dollars per school, just to increase the pipeline of physicians being trained, we also need to think at the far end of the physicians who are leaving medicine because of burnout,” says Sinsky.
Take the case of a final-year resident doctor in New York, who spends a considerable part of his shift negotiating with insurance companies to justify why his patient needs the medicines he prescribed. “When I signed up to be a doctor, the goal was to treat patients, not negotiate with insurance providers,” he says.
According to Tait, 80 per cent of the challenge faced by doctors is down to the organisation where they work, and only 20 per cent could be attributed to personal resilience.
Re the final quote, 80:20 is being generous to the organisations.
Many years ago I was expressing exasperation at what I took to be the layers and layers of foolishness that meant that others couldn’t see the obvious — as defined by yours truly, of course. Did all those wise people in the year 2000 think that gene therapy for cancer was just around the corner, or that advance in genetics was synonymous with advance in medicine, or that the study of complex genetics would, by the force of some inchoate logic, lead to cures for psoriasis and eczema? How could any society function when so many of its parts were just free-riding on error, I asked? Worse still, these intellectual zombies starved the new young shoots of the necessary light of reason. How indeed!
William Bains, he of what I still think of as one of the most beautiful papers I have ever read1, put me right. William understood the world much better than me — or at least he understood the world I was blindly walking into, much better. He explained to me that it was quite possible to make money (both ‘real’ and in terms of ‘professional wealth’) out of ideas that you believed to be wrong as long as two linked conditions were met. First, do not tell other people you believe them to be wrong. On the contrary, talk about them as the next new thing. Second, find others who are behind the curve, and who are willing to buy from you at a price greater than you paid (technical term: fools). At the time, I did not even understand how pensions worked. Finally, William chided me for my sketchy knowledge of biology: he reminded me that in many ecosystems parasites account for much, if not most, of the biomass. He was right; and although my intellectual tastes have changed, the sermon still echoes.
The reason is that corporate tax burdens vary widely depending on where those profits are officially earned. These variations have been exploited by creative problem-solvers at accountancy firms and within large corporations. People who in previous eras might have written symphonies or designed cathedrals have instead saved companies hundreds of billions of dollars in taxes by shifting trillions of dollars of intangible assets across the world over the past two decades. One consequence is that many companies avoid paying any tax on their foreign sales. Another is that many countries’ trade figures are now unusable. [emphasis added].
Trade Wars Are Class Wars: How Rising Inequality Distorts the Global Economy and Threatens International Peace by Matthew C. Klein and Michael Pettis.
But after completing medical training, Sacks fled the homophobic confines of his nation and family—his mother had called him “an abomination.” Paul Theroux tells Burns that Sacks’s “great luck” was ending up in Los Angeles in 1960, where he found ample “guys, weights, drugs, and hospitals.”
Advance requires those who can imagine new spaces, and medicine is even more hostile today than it was all those years ago. We pretend otherwise, thinking those tick-box courses will suffice, but real diversity of intellect is the touchstone of our future.
I read Malcolm Bradbury’s satire The History Man many decades ago and loved it as a satire on university life (it also demonstrated to me why medical schools and universities were unlikely bedfellows).
The History Man is Malcolm Bradbury’s masterpiece, the definitive campus novel and one of the most influential novels of the 1970s. Funny, disconcerting and provocative, Bradbury brilliantly satirizes a world of academic power struggles as his anti-hero seduces his way around campus. (Amazon’s brief).
I have forgotten much of the detail, but not how fine a novel I thought it was, nor how funny I found it. But for every great thesis, there is an antithesis. Here is one:
Ignorance of history is a badge of honour in Silicon Valley. “The only thing that matters is the future,” self-driving-car engineer Anthony Levandowski told The New Yorker in 2018. “I don’t even know why we study history.”
I dislike agreeing with the corporation that is Google as I am always suspicious of their motives, but in this narrow domain, they are surely correct.
College degrees are out of reach for many Americans, and you shouldn’t need a college diploma to have economic security. We need new, accessible job-training solutions—from enhanced vocational programs to online education—to help America recover and rebuild.
My experience is limited, but everything I know suggests that much IT in healthcare diminishes medical care. It may serve certain administrative functions (who is attending what clinic and when etc), and, of course, there are certain particular use cases — such as repeat prescription control in primary care — but as a tool to support the active process of managing patients and improving medical decision making, healthcare has no Photoshop.
In the US it is said that an ER physician will click their mouse over 4,000 times per shift, with frustration with IT being a major cause of physician burnout. Published data show that the ratio of patient-facing time to admin time has halved since the introduction of electronic medical records (i.e. things are getting less efficient). We suffer slower and worse care: research shows that once you put a computer in the room, eye contact between patient and physician drops by 20-30%. This is to ignore the crazy extremes: like the hospital that created PDFs of the old legacy paper notes, but then — wait for it — ordered them online not as a time-sequential series but randomly, expecting the doc to search each one. A new meaning for the term RAM.
There are many proximate reasons for this mess. There is little competition in the industry and a high degree of lock-in because of a failure to use open standards. Then there is the old AT&T problem of not allowing users to adapt and extend the software (AT&T famously refused to allow users to add answering machines to their handsets). But the ultimate causes are that reducing admin and support staff salaries is viewed as more important than allowing patients meaningful time with their doctor; and that those purchasing IT have no sympathy or insight into how doctors work.
The context is wildly different — it is an exchange on the OLPC project and how to use computers in schools, but here are two quotes from Alan Kay that made me smile.
As far as UI is concerned — I think this is what personal/interactive computing is about, and so I always start with how the synergies between the human and the system would go best. And this includes inventing/designing a programming language or any other kind of facility. i.e. the first word in “Personal Computing” is “Person”. Then I work my way back through everything that is needed, until I get to the power supply. Trying to tack on a UI to “something functional” pretty much doesn’t work well — it shares this with another prime mistake so many computer people make: trying to tack on security after the fact …[emphasis added]
I will say that I lost every large issue on which I had a firm opinion.
That “scientific management” bungled the algorithm for children’s exam results verifies a maxim attributed to J.R. Searle, an American philosopher: if you have to add “scientific” to a field, it probably ain’t.
I have written elsewhere about this in medicine and science. We used to have physiology, but now some say physiological sciences; we used to have pharmacology, but now often see pharmacological sciences1. And as for medicine, neurology and neurosurgery used to be just fine, but then the PR and money grabbing started so we now have ‘clinical neuroscience’ — except it isn’t. As Herb Simon pointed out many years ago, the professions and professional practice always lose out in the academy.
Sadly, my old department in Newcastle became Dermatological Sciences, and my most recent work address is Deanery of Clinical Sciences — which means both nouns are misplaced. ↩
The following is from Scott Galloway at NYU Stern. He shoots from the hip, and sometimes only thinks afterwards. But he is interesting, brave, and more often right than most. I think I would have hated what he said when I was ready (sic) to go to university. But now, I think I wasn’t, and for medicine in particular, allowing 17-year-olds to fall into the clutches of the GMC and their ilk should be a crime against….
Gap years should be the norm, not the exception. An increasingly ugly secret of campus life is that a mix of helicopter parenting and social media has rendered many 18-year-olds unfit for college. Parents drop them off at school, where university administrators have become mental health counselors. The structure of the Corona Corps would give kids (and let’s be honest, they are still kids) a chance to marinate and mature. The data supports this. 90% of kids who defer and take a gap year return to college and are more likely to graduate, with better grades. The Corps should be an option for non-college-bound youth as well.
“We’re going through a Copernican revolution of healthcare, where the patient is going to be at the centre. The gateway to healthcare is not going to be the physician. It’s going to be the smartphone.”…
“Christofer Toumazou, chief scientist at the Institute of Biomedical Engineering at Imperial College London, says there are “megabucks” to be saved by using technology and data to shift the focus of healthcare towards prevention.”
Ahem. I have been reading Seamus O’Mahony’s excellent Can Medicine Be Cured?, in which he does a great job of following up on the crazy hype of big genetics from 20 years ago (and many other areas of sales masquerading as science). The above quotes are from only seven years ago. Still crazy after all these years, sings Paul Simon. Health care excels at adding tech as a new layer of complexity rather than replacing existing actors. And when will people start realising that prevention — which may indeed reduce suffering — will often increase costs? Life is a race against an army of exponential functions.
The task of a university is the creation of the future, so far as rational thought, and civilized modes of appreciation, can affect the issue. The future is big with every possibility of achievement and of tragedy.
Nobody then would have imagined how bad it would get. The final word was prescient.
Alfred North Whitehead, “The Aim of Philosophy”, in Modes of Thought, 1938
Alas, there will be no more new ones of these, as arguably the greatest of modern biology’s experimentalists, Sydney Brenner, passed away last year. One of his earlier quotes — the source I cannot find at hand — was that it is important in science to be out of phase. You can be ahead of the curve of fashion or, possibly better still, be behind it. But stay out of phase. So, no apologies for being behind the curve on these, which I have just come across.
Sydney Brenner remarked in 2008, “We don’t have to look for a model organism anymore. Because we are the model organisms.”
Sydney Brenner has said that systems biology is “low input, high throughput, no output” biology.
She [Sigrid Nunez] was already well into her next novel by the time “The Friend” climbed bestseller lists. “What Are You Going Through”, out now, is not exactly a sequel, she says, but “these books belong together.” Both are “preoccupied with death”. And with ageing: “At a certain age, there is only one subject.”
Two quotes from Fintan O’Toole in the NYRB. The first, quoting Saki (H H Munro).
The people of Crete unfortunately make more history than they can consume locally.
The second, his own.
In this demented solipsism, the entire American past is shrink-fitted so that it hugs Trump’s own ample figure, cleaving both to his greatness and his victimhood as an object of unparalleled persecution.
As miserable in the job as he was smart, autodidactic, and headstrong, he managed to escape a soul-destroying future trapped behind a shop counter by persuading his Latin tutor to hire him as a student teacher, then convincing his mother to pay off the indenture and set him free.
The Future was His, Maya Jasanoff in the NYRB, reviewing Inventing Tomorrow: H.G. Wells and the Twentieth Century by Sarah Cole.
In my ignorance I had always assumed that the ‘Haldane’ of the Haldane Principle1 referred to the great and singular geneticist and physiologist JBS Haldane. Not true. JBS once remarked that God must have been inordinately fond of beetles because there are so many species of beetles; so with the Haldanes: (good) fortune is, it appears, inordinately fond of the Haldane clan. A relative of JBS, Richard Burdon Haldane — who did indeed come up with the Haldane principle — is the subject of a new biography by John Campbell, and a witty and sharp review in the FT by Philip Stephens.
Watching today’s politicians fall over their own mistakes as they fumble with the Covid-19 pandemic, it is easy to forget that securing high office once required more than a few years of dashing off political columns for a national newspaper. So the life and political times of Richard Haldane, the subject of John Campbell’s engaging biography, offers a fitting rebuke to the trivial mendacity and downright incompetence of the nation’s present administration.
Exaggeration, it is not. Haldane…
…an Edinburgh lawyer and philosopher-politician before becoming a minister in Herbert Asquith’s Liberal administrations, was an important champion of universal education and one of the founding fathers of the UK university system. He also found time to create the Territorial Army, and to have a hand in the foundation of the London School of Economics, the Medical Research Council and the Secret Intelligence Service…
As Asquith’s minister for war, he created the expeditionary force that saved Britain from defeat in the opening stages of the first world war. As Lord Chancellor, his judgments did much to set in place the federalist tilt of the Canadian constitution.
And if there is any doubt about his intellectual gravitas, the review is headed by an image of Haldane with Albert Einstein whom he hosted on the latter’s first visit to the UK in the 1920s. Just conjure up BoJo or Patel or Hancock when you read the above, or when you step on something unpleasant and slimy.
It also seems that Haldane might have performed slightly better across the dispatch box than some of the current irregulars. Clark McGinn writes
He [Haldane] is also one of the few men to have beaten Winston Churchill by riposte. Haldane was a portly figure and Churchill remarked on his girth by asking when the baby was due and what it would be called. Haldane retorted: “If it’s a boy it will be George after the King, a girl will be Mary after the Queen. But if it is just wind I shall call it Winston.”
The Haldane Principle is the idea that decisions about what to spend research funds on should be made by researchers instead of politicians. It is named after Richard Burdon Haldane. For a recent take on the Haldane Principle see David Edgerton, The ‘Haldane Principle’ and other invented traditions in science policy, here. ↩
The following is from the Economist and is about schoolteachers.
A bigger question is how many of the new trainees will stay in teaching. Research in America shows that people who enter the profession during recessions tend to make better teachers than those who do not, perhaps because high-skilled workers have fewer other options during a downturn. But they are also a bit more likely to give up. England already has a problem retaining new teachers. About a fifth leave the job within two years of qualifying. About a third go within five.
I don’t have any systematic data on this topic, but the story appears familiar — if different in degree — across other public sector1 jobs such as nursing, higher education and possibly medicine. I am not reassured by the account below, rather, I think we are seeing structural changes that will continue to play out. The change in professional status of teaching and the resulting decline in morale always seemed to me to be the model for what might happen to medicine.
Sam Sims at the UCL Institute of Education says “muscular” policies that were put in place before the pandemic provide reason for optimism. Last year the government said that starting salaries would rise to £30,000 ($39,000) by 2022, a 23% increase. It is offering annual bonuses to teachers of subjects with the biggest shortages. And it is promising more mentoring and training for people who are new to the job. The idea is that new teachers will eventually consider themselves better-paid and better-supported than peers in many other professions. That might make Mr Seadon’s cohort a bit more likely to hang around.
Yes, higher education can be considered a special case of ‘public sector’ to the extent that much of its funding is underwritten by the state and decision making is heavily determined by political factors. ↩
No, not that sort of skin trade, but inking1. This is from an article in the Economist, and sadly, although I cannot show them here, the images are remarkable. But some nice lines too about the acquisition of high-level skills and apprenticeship — vocation, if you will.
In China several prominent tattooists are taking a different approach. They have set up schools. In Wu Shang’s studio four students are hunched over flat pieces of silicone rubber—mimicking skin, just like his model arms—trying to recreate images that they first painted on paper.
That might seem inoffensive, but it goes against a widespread but unwritten code. Masters may take an apprentice or two under their wings, but only if they are truly committed to the craft. The idea that anyone can just show up, pay a tuition fee and after a few months apply ink to skin leaves purists aghast. Even in China some are critical. Mr Shen, the neo-traditionalist, says that he honed his technique over many years by wielding needles by hand. “You need to learn about the relationship between skin and needle. You can’t just get that overnight in school,” he says.
Mr Handy says this gave him the opportunity to learn from his mistakes in private. He argues that “education is an experience understood in tranquillity. You look back and see where you went wrong.”
Looking back over his career, he believes that teaching and writing is all about creating the “Aha!” moment. That occurs when people realise that an idea the teacher or writer has advanced is both useful and something they already knew but had not articulated.
No Excel or TEF here. The plain language belies the depth of the insight.
Beautiful obituary of the wonderful classical guitarist and lutenist Julian Bream. Some of this story I knew already.
Almost as he started his long love affair with the guitar, Julian Bream was aware he was doing something disreputable. When he was caught as a teenager practising Bach in the Royal College of Music, he was warned not to bring that instrument into the building again. It lowered the tone.
Even the Army shared the snobbery:
Signing on to do his National Service in an army band, he was told he could play piano and cello, fine, but the guitar only “occasionally”.
And it is not just rock musicians who sleep in the van before driving back up the M1 (note: an Austin, rather than a Transit)
Audiences clapped long and hard when he performed in the Wigmore Hall at 18, in 1951, but as he toured round Britain in the mid-1950s, sleeping in his Austin van to save on hotels, not many came to hear him.
Those from the home of the guitar were no less enthusiastic about this man from those Isles.
And from Spain, the spiritual and historical home of the guitar, came the loudest scorn of all. An Englishman playing a guitar, said one virtuoso, was a kind of blasphemy.
What I didn’t know was that he was essentially self-taught. This is common in rock, folk, jazz and blues, but I assume rare in classical music. Although Segovia was moderately well known, perhaps the lack of popularity of the guitar in the UK made this necessary. Readers of this rag will know that I am fascinated by autodidacts and what skills you can — and cannot — learn to a high level without formal instruction. My prejudice is this: the energy needed to acquire mastery alone is worth so much more than the competence gained on the transactional shoulders of others. Passion and perspective are worth more than 50 IQ points, as they say.
There are limits, however. In this video he talks about his fingers and technique:
‘Unfortunately the Almighty bequeathed me with a very clumsy pair of hands… and very slow’ (link)
He had form on the lute as well, playing with the nails rather than the fingers, and again faced the disdain of the ‘experts’.
Below, a video on why Bream thought of himself as a meat and potatoes Englishman.
Those qualified need not apply
From a letter in last week’s Economist from Andrew Carroll, commenting on the Economist’s own description of Clement Attlee:
He “lacks the conspicuous attributes of a leader” but “has undeniable ability, judgment and integrity” (“Mr Attlee and Sir A. Sinclair”, November 30th 1935)
A few months back, I was walking past the entrance of the old Edinburgh Medical School, founded in 1726. A not-so-crazy thought came into my head, one that I could not dismiss: we need to move on from the idea that a Medical School must be situated within a University (and of course, it wasn’t always, anyway). The founding set of ideas that we have struggled with ever since Flexner, we should now recast for a very different world. We need to create something new, something that makes sense in terms of a university and something that puts professional training within a professional context. At present, we fail on both of these counts. Rather than integrate we should fracture. We need to search out our own new world.
Specialisation and the division of labour are as old as humanity, and of course they go back much further when we are talking biology. Adam Smith may have formalised why and how it was important economically, but he did not invent it. Most specialisation relies on expertise, or at least it used to, until Crapita and the like started mining the seams of government ignorance.
The quote below is from an article in the Economist in May this year. It is about Public Health England (PHE) and how since they only possessed 290 contact tracers, they needed to call on those wonderful experts in everything, Serco, to help them out. Of course, expertise in such tasks always used to reside with Local Government, not PHE, but Boris and his bunch of Maoists, when they are not having their eyes tested in the fast lane, have decreed that Local Government — along with the opposition, the judges, the education sector and more — are enemies of the people. Given this mindset, we are left with those whose main area of expertise is commercialising ignorance.
Firms such as Serco, a big contractor, are in talks with the government to provide the workforce. It should be possible to train new recruits fairly quickly—the requirements of the job are similar to those of 111 operators, for whom the training time is just four hours. They will work from a script that guides them through the various stages of an interview [emphasis added].
A while back, I ended up corresponding with somebody in the Scottish government about how misleading their self-help pages on skin disease were: they contained factual errors, and would mislead people seeking medical help. The content had clearly not been written by a medical practitioner — defined as somebody with domain clinical expertise and who might have actually dealt with patients by shaking hands with them. Asking for validation studies or some sort of empirical evidence to support the content was unhelpful, as the content was supplied by another agency and was commercially ‘confidential’. I didn’t follow up because the person I corresponded with clearly knew that his own position was both untenable and uncomfortable. It’s just business: you know, ‘new ways of working’, ‘direction of travel’, and all those other vacuous suitcase terms that just mark a space where reason or domain expertise used to reside.
Rather than making clever machines, or allowing humans to do what only humans can do1, it seems we are content to make humans behave as stupidly as Excel spreadsheets. 111 is not for BoJo et al.; 111 is for poor people waiting to be levelled up, even if the best way to do that is to go straight to A&E. 2