Despite this, less than half of developers consider their formal education to be “important” or “very important” to their jobs.
Well, this is tech, but it is also true of many fields of endeavour. But not all. We need to understand when and where the rules of the game change. This is not just about certification.
In the context of her research on the implications of information technology, Shoshana Zuboff stated three laws:
- Everything that can be automated will be automated.
- Everything that can be informated will be informated.
- Every digital application that can be used for surveillance and control will be used for surveillance and control.
Wikipedia. A lot of interesting links to her work. I never knew the origin.
Lots of money is spent on medical education in England – but very little of it goes towards teaching.
Says Philip Chan in THE. Yes, he is not the only one writing on this. I assume the sub-editor was having a poetic day.
“If we are really going to turn over our homes, our cars, our health and more to private tech companies, on a scale never imagined,” he wrote, “we need much, much stronger standards for security and privacy than now exist. Especially in the US, it’s time to stop dancing around the privacy and security issues and pass real, binding laws.
“And, if ambient technology is to become as integrated into our lives as previous technological revolutions like wood joists, steel beams and engine blocks, we need to subject it to the digital equivalent of enforceable building codes and auto safety standards. Nothing less will do. And health? The current medical device standards will have to be even tougher, while still allowing for innovation.”
Nice piece on Walt Mossberg from John Naughton.
Or so says an article in Nature. No we don’t, is my response.
Philanthropists are flying blind because little is known about how to donate money well. Facebook co-founder Mark Zuckerberg’s US$100-million gift to schools in Newark, New Jersey, reportedly achieved nothing. Some grants to academic scientists create so much administration that researchers are better off without them. And some funders’ decisions seem to be no better than if awardees were chosen at random, with the funded work achieving no more than the rejected.
There is no science to philanthropy. You can study it, you can come up with ideas about it, and try to meld systems of rationality about it. But this is just an abuse of the word science, an abuse meant to demarcate this area of activity from things that are non-science and are, by implication, less robust or rigorous. This is one of the ways the science (and STEM) lobby misunderstands the world. But the quoted paragraph does, of course, say something meaningful.
“Certainly, for frontline doctors like us who are used to wrestling with clunky NHS IT systems, the biggest surprise of the malware attack was not that it happened but why it had taken so long. It is an irony lost on no NHS doctor that though we can transplant faces, build bionic limbs, even operate on fetuses still in the womb, a working, functional NHS computer can seem rarer and more precious than gold dust.”
There are two models. Sage on the stage. Or building structures that scale. Individual brilliance and interpretation; or Hollywood. There are not enough sages; but people deny we can build structures that scale.
I see this dialectic everywhere in education. When do we need n=1; and what can we do at scale? It is not just education, however. All over the creative world we can see this battle play out. As Paul Simon put it:
“I’m sittin’ in the railway station, got a ticket for my destination
On a tour of one-night-stands, my suitcase and guitar at hand
And every stop is neatly planned for a poet and a one-man band”
On the other hand look at this. The song maker (Max Martin) few have heard of.
Woodie Flowers, in a devastating critique of MITx, said it well.
I believe the “sweet spot” for expensive universities like MIT is:
1) access to highly-produced training systems accompanied by
2) a rich on-campus opportunity to become educated.
MITx seems aimed at neither.
Medicine gets this confused big time. There is training and education. If we did the former better, we could offer a real education. But to do the training better, we need scale. And that means content. We could do things better and cheaper.
A few years back at an ADA meeting in Napa I got to listen to a dermatologist who understood how to influence government. I had never heard anybody speak live who was so effective, so effortless in his command of his brief, and with such charm. Maybe JFK might have had this effect, too. Then there was Obama.
I do not have these skills, but more worryingly I do not think UK medicine does them that well either. I do not mean the ‘honours’ business, but meaningful attempts to balance the power and corruption of the state. We don’t seem to do activism well, either.
Some advice from last week’s NEJM, worth a read: ‘Effective Legislative Advocacy — Lessons from Successful Medical Trainee Campaigns’. Which, if nothing else, forced me to chase up the quote I (and others) have been misquoting for years from Rudolf Virchow.
Speaking on BBC Radio 4’s Midweek programme on 22 February 2006, Jonathon Kaplan quoted Virchow as saying that, “pathology is politics writ large”. He seems to have been misquoting the usual part‐quotation that, “Medicine is a social science and politics is nothing but medicine writ large”. In fact, what Virchow really said was that, “Medicine is a social science and politics is nothing else but medicine on a large scale. Medicine as a social science, as the science of human beings, has the obligation to point out problems and to attempt their theoretical solution; the politician, the practical anthropologist, must find the means for their actual solution”
Link here to J Epidemiol Community Health. 2006 Aug; 60(8): 671.
Interesting piece in today’s NEJM on data sharing and clinical trials and how meaningfully patients are involved.
Patients also said they wanted trial results to be shared with participants themselves, along with an explanation of what the results mean for them — something that generally doesn’t happen now. In addition, most participants said they had entered a trial not to advance knowledge, but to obtain what their doctor thought was the best treatment option for them. All of which raises some questions about how well patients understand consent forms.
Which reminded me of a powerful paper by the late David Horrobin in the Lancet in which, from the position of being a patient with a terminal illness, he challenged what many say:
The idea that altruism is an important consideration for most patients with cancer is a figment of the ethicist’s and statistician’s imagination. Of course, many people, when confronted with the certainty of their own death, will say that even if they cannot live they would like to contribute to understanding so that others will have a chance of survival. But the idea that this is a really important issue for most patients is nonsense. What they want is to survive, and to do that they want the best treatment.
I have always been suspicious of the ‘equipoise’ argument and terrified when I see participation rates in clinical trials as an NHS performance measure. It is bad enough that doctors might end up acting as agents of the state. But this is worse than shilling for pharma.
The NEJM piece also draws attention to people’s reluctance to share with commercial entities. What this tells you is that many people view some corporations (pharma, in this instance) as pirates. Or worse. This topic is not going away. Nor is the need for (commercial) pharma to finance and develop new drugs.
Institutions with histories matter. It is just that in many instances innovation often comes from the periphery. I think this is often true in many fields: science, music, even medical education. It is not always this way, but often enough to make me suspicious of the ‘centre’. The centre of course gets to write the history books.
An article by Mark Mazower in the NYRB, praising Richard Evans, the historian of the Third Reich, caught my attention. It seems that nobody in the centre was too excited about understanding the event that changed much of the world forever. Mazower writes:
If you wanted to do research on Saint Anselm or Cromwell, there were numerous supervisors to choose from at leading universities; if you wanted to write about Erich Ludendorff or Hitler, there was almost no one. The study of modern Europe was a backwater, dominated by historians with good wartime records and helpful Whitehall connections—old Bletchley Park hands and former intelligence officials, some of whom had broken off university careers to take part in the war and then returned.
Forward-looking, encouraging of the social sciences, open to international scholarship from the moment of its establishment, St. Antony’s is the college famously written off by the snobbish Roddy Martindale in John le Carré’s Tinker, Tailor, Soldier, Spy as “redbrick.” The truth is that it was indeed the redbrick universities, the creations of the 1950s and 1960s, that gave Evans and others their chance and shaped historical consciousness as a result. The Evans generation, if we can call them that, men (and only a very few women) born between 1943 and 1950, came mostly from the English provinces and usually got their first jobs in the provinces, too.
It is interesting how academics who had had career breaks were important. And how you often will need new institutions to change accepted practice. All those boffins whose careers were interrupted by the war led to the flowering of invention we saw after the second world war. You have to continually recreate new types of ivory towers. But I see little of this today. Instead, we live in an age of optimisation, rather than of optimism that things can be different. The future is being captured by the present ever more than it once was. At least in much of the academy.
Some nice memes in this letter from an MD student in Australia. Please discuss. Pharma might be interested.
Considering this, if we thought about the pervasive attitudes that inform our definition of a “good” medical student as a disease, it’s hard to believe that we would not try to treat it.
Edward Tufte’s ‘The Cognitive Style of Powerpoint’ is funny. The problem is that it is not just funny, but deadly serious. Literally. His argument and case studies concern how humans died because people failed to understand how to communicate. And the title says it all. Powerpoint (at least its templates) degrades communication.
Communication is a big thing in medical education, and it is not unusual to have to sit through tedious talks on the subject. They usually start with Powerpoint slides, so at least you know that they are not going to say anything worthwhile and you can get your phone out and play.
Below is a memo from Jeff Bezos of Amazon.
Perhaps the single most important thing we could do to improve university education is to remove all copies of Powerpoint. Words matter. Sentences even more.
This is from a book review on the ‘birth of cool’, by Robert Eaglestone in the THE.
Despite laying out some principles (“cool is…”), the book focuses on honed case studies of “the saints of cool” (as Hannah Arendt argues, we learn more from examples than from principles).
This little gem was new to me — but not the concept, or the principle…
Academia tends to love rules and formal systems, but for some domains of competence they are grossly overrated. Formal logic is often not what is needed, and we may see more with a metaphor. As Alan Kay’s aphorism has it: a different perspective may be as valuable as 80 IQ points.
When I covered Kasparov-Deep Blue match, I thought the drama came from a battle between computer and human. But it was really a story of people, with brutal capitalist impulse, teaming up with AI to destroy the confidence and dignity of the greatest champion the world had seen. That leads me to believe it’s not Skynet that should worry us about AI, but rather the homo sapiens who build, implement, and employ those systems.
Well, this is all about one of the great issues of our age. As I wrote in 2008:
Change in medicine is increasingly driven by the twin forces of specialization, and the desire to codify medical practice, i.e. to produce rules that can be followed by those from a range of educational and professional backgrounds. The battle is over the intellectual heartlands of clinical practice and how the knowledge that underpins clinical practice is acquired, distributed and validated.
The mistake is to believe that this process is not mired in political and financial assumptions about what good care means — and who can make money out of it.
Ultimately, students may feel less ripped off by essay mills than by universities.
I like this take on plagiarism and cheating. As has been said before, if somebody can write your essay, and the change in style not be noticed, those claims in the glossy prospectuses are hollow. Canaries in the coal mine.
I saw this giant problem in my education, and I actually designed a course called, “Physician Heal Thyself, Evidence-based lifestyle.” I brought in all these doctors who are experts in sleep medicine, sleep, fitness, nutrition, food as medicine, functional medicine, integrative medicine, osteopathy and acupuncture. I got them all in a room and said I want you to teach students what we’re missing. We need to make this medical school education and have to implement this into the board certification programs as well as board exams. If it’s not required, it’s not going to be taught.
No, I am not taking this too seriously. A while back, I compiled a list of all the things we needed to inflict on / ask medical students to know: I had to buy a larger hard drive. And as for this ‘new medicine’, George Bernard Shaw described it a long time ago.
I began as a university student after two years of military service. My brain was like an empty sponge. University was fantastic, a world full of knowledge and very interesting people. I studied medicine and went to philosophy lectures as much as I could, and then mathematics until my medical exams came up.
Wolfram Schultz in the THE
During the 7-year period between the introduction of tacrolimus in preclinical studies in 1987 and the FDA approval of tacrolimus in 1994, the transplant program at the University of Pittsburgh produced one peer-reviewed article every 2.7 days, while transplanting an organ every 14.2 hours.
Always thought these surgeons needed to spend more time in theatre.
Science. Obituary of Thomas Starzl.
What he was doing in this was holding a crucial middle ground. He understood better than anyone else that the public realm has to fight for its existence against two equally great dangers. One is the culture of self-enclosed, technocratic expertise, the hiving off of intellectual life into increasingly minute specializations and increasingly impenetrable professional dialects. The other is the insistence—so much in the ascendant now—that there is no expertise at all, that scholarship and rigor and evidence are the mere playthings of elitist eggheads. Bob’s great gift to civic life was the living demonstration in every issue of the Review that these impostors could be treated with equal—and magnificent—contempt. He held open the space for that great republican virtue: common curiosity. He made this fierce effort seem so natural that it is only in his absence that we realize how hard it is to do and how much it counts.
Fintan O’Toole. From a collection of essays on the late Bob Silvers of the NYRB.
O’Toole gets the central problem the academy is failing on. Not that the academy is ever sufficient.
The use of Benzedrine by American athletes in the 1936 Berlin Olympics prompted the Temmler company on the edge of Berlin to focus on creating a more powerful version. By the autumn of 1937, its chief chemist, Dr. Fritz Hauschild (in postwar years the drug provider for East German athletes), created a synthesized version of methamphetamine. This was patented as Pervitin. It produced intense sensations of energy and self-confidence.
In pill form Pervitin was marketed as a general stimulant, equally useful for factory workers and housewives. It promised to overcome narcolepsy, depression, low energy levels, frigidity in women, and weak circulation. The assurance that it would increase performance attracted the Nazi Party’s approval, and amphetamine use was quietly omitted from any anti-drug propaganda. By 1938, large parts of the population were using Pervitin on an almost regular basis, including students preparing for exams, nurses on night duty, businessmen under pressure, and mothers dealing with the pressures of Kinder, Küche, Kirche (children, kitchen, church—to which the Nazis thought women should be relegated). Ohler quotes from letters written by the future Nobel laureate Heinrich Böll, then serving in the German army, begging his parents to send him more Pervitin. Its consumption came to be seen as entirely normal.
Lots I didn’t know, but any reader of David Healy will not be surprised. A dermatologist doesn’t come out of it too well, either.
Antony Beevor in the NYRB
Once upon a time the government gave money to universities, and the universities educated people (or they tried). Now things are different. The government buys educational services, and the universities are the contractors.
Not so much Software as a Service, but Education as a Service.
Technology magnifies power in general, but the rates of adoption are different. Criminals, dissidents, the unorganized—all outliers—are more agile. They can make use of new technologies faster, and can magnify their collective power because of it. But when the already-powerful big institutions finally figured out how to use the Internet, they had more raw power to magnify.
This is true for both governments and corporations. We now know that governments all over the world are militarizing the Internet, using it for surveillance, censorship, and propaganda. Large corporations are using it to control what we can do and see, and the rise of winner-take-all distribution systems only exacerbates this.
This is the fundamental tension at the heart of the Internet, and information-based technology in general. The unempowered are more efficient at leveraging new technology, while the powerful have more raw power to leverage. These two trends lead to a battle between the quick and the strong: the quick who can make use of new power faster, and the strong who can make use of that same power more effectively.
Bruce Schneier, on Crooked Timber, talking around Cory Doctorow’s new novel ‘Walkaway’. BS—well worth reading in full, as usual.
Nice dissection of some of the issues by Ross Anderson, here.
And in today’s FT, we read:
Microsoft held back from distributing a free repair for old versions of its software that could have slowed last week’s devastating ransomware attack, instead charging some customers $1,000 a year per device for protection against such threats.
Gee, a secure version of Windows? That’s extra.
Alan Kay on discovery and invention: the scientists find the problems not the funders. Here. (Via Benedict Evans)
In truth, everyone knows that values are actually marketing exercises, used by organisations as slogans. They have little to do with actual behaviour in organisations. They infantilise people, reduce them to ciphers.
Wonke on the laws that govern us.
- The first law of higher education is that the future of universities is political not technical.
- The second unchanging law of higher education is that there is no situation so bad that it cannot get worse.
- The third law of higher education, as exemplified by the classic University Challenge episode of The Young Ones, is that the posh kids always win.
- The fourth and final law of higher education, that exceeds the first three, is that universities outlive ministers.
And if that is not enough food for thought, consider this:
Higher education in England is no longer a supply-led industry. English universities are now in a demand-led environment in which the regulator has the last word. The Rubicon has been crossed, and few in higher education have really begun to understand what the implications of that are for universities. They will have the next five years of Conservative government to contemplate it.
Great essay by Bruce Schneier.
In 2020 — 10 years from now — Moore’s Law predicts that computers will be 100 times more powerful. That’ll change things in ways we can’t know, but we do know that human nature never changes. Cory Doctorow rightly pointed out that all complex ecosystems have parasites. Society’s traditional parasites are criminals, but a broader definition makes more sense here. As we users lose control of those systems and IT providers gain control for their own purposes, the definition of “parasite” will shift. Whether they’re criminals trying to drain your bank account, movie watchers trying to bypass whatever copy protection studios are using to protect their profits, or Facebook users trying to use the service without giving up their privacy or being forced to watch ads, parasites will continue to try to take advantage of IT systems. They’ll exist, just as they always have existed, and — like today — security is going to have a hard time keeping up with them.
Welcome to the future. Companies will use technical security measures, backed up by legal security measures, to protect their business models. And unless you’re a model user, the parasite will be you.
Which just reminds me of my own ecological ignorance. Many years back I was moaning to William Bains about how “surely the system (insert your own bête noire) will collapse under the weight of all these people who do nothing except get in the way and stop real work being done”. He corrected me by reminding me that in many biological systems the biomass of parasites exceeds that of the non-parasites. It is now my strategy when meeting somebody or hearing some new idea to ask the simple polite question: are you a parasite? There are an awful lot of them. I expect to see more and more.