tech

Smombies everywhere

My youngest daughter lived in South Korea for a while and I visited on a couple of occasions. It was a lot of fun in all sorts of ways. The following rings(!) true:

The government initially tried to fight the “smombie” (a portmanteau of “smartphone” and “zombie”) epidemic by distributing hundreds of stickers around cities imploring people to “be safe” and look up. This seems to have had little effect even though, in Seoul at least, it recently replaced the stickers with sturdier plastic boards.

Instead of appealing to people’s good sense, the authorities have therefore resorted to trying to save them from being run over. Early last year, they began to trial floor-level traffic lights in smombie hotspots in central Seoul. Since then, the experiment has been extended around and beyond the capital. For the moment, the government is retaining old-fashioned eye-level pedestrian lights as well. But in future, the way to look at a South Korean crossroads may be down.

A dangerous creature is haunting South Korean crossroads – Smombie apocalypse

Direct URL for this post.

Software is eating…

Comparison of the accuracy of human readers versus machine-learning algorithms for pigmented skin lesion classification: an open, web-based, international, diagnostic study.

You can dice the results in various ways, but software is indeed eating the world — and the clinic. The (slow) transition to this new world will be interesting and eventful. A good spectator sport for some of us. (Interesting to note that this study in Lancet Oncology received no specific funding. Hmmm).

Direct URL for this post.

The digital skin web

On some Swedish trains, passengers carry their e-tickets in their hands—literally. About 3,000 Swedes have opted to insert grain-of-rice-sized microchips beneath the skin between their thumbs and index fingers. The chips, which cost around $150, can hold personal details, credit-card numbers and medical records. They rely on Radio Frequency ID (RFID), a technology already used in payment cards, tickets and passports.

Why Swedes are inserting microchips into their bodies – Bjorn Cyborg

One of these is going to end up being sectioned at some time… waiting for the first case report. Not often I can get two puns in a three-word title.

Direct URL for this post.

On ratio scales and the spirits of invention

It is said that much of the foundational work of 20th-century physics was done in coffee houses (or, in the case of Richard Feynman, in strip bars), but things were once done differently in the UK:

With neither institutional nor government masters to answer to, the British cyberneticians were free to concentrate on what interested them. In 1949, in an attempt to develop a broader intellectual base, many of them formed an informal dining society called the Ratio Club. Pickering documents that the money spent on alcohol at the first meeting dwarfed that spent on food by nearly six to one — another indication of the cultural differences between the UK and US cyberneticians.

The work of the British pioneers was forgotten until the late 1980s when it was rediscovered by a new generation of researchers… A company that I cofounded has now sold more than five million domestic floor-cleaning robots, whose workings were inspired by Walter’s tortoises. It is a good example of how unsupported research, carried out by unconventional characters in spite of their institutions, can have a huge impact.

A review from 2010 by Rodney Brooks of MIT of “The Cybernetic Brain: Sketches of Another Future” in Nature (For more on Donald Michie and “in spite of their institutions” see here).

Direct URL for this post.

How the Nobel are fallen

As Jeff Hammerbacher, Facebook’s first research scientist, remarked: “the best minds of my generation are thinking about how to make people click ads… And it sucks.”

Quoted in Stand Out of Our Light, James Williams

Direct URL for this post.

Surgeons?

“A lot of patients are still having open surgery when they should be getting minimal access surgery,” said Mr Slack, a surgeon at Addenbrooke’s Hospital in Cambridge. “Robotics will help surgeons who don’t have the hand-eye co-ordination or dexterity to do minimal access surgery.”

Trial of new generation of surgical robots claims success | Financial Times

Direct URL for this post.

Turn-it-around

by reestheskin on 02/04/2019

Comments are disabled

A couple of articles from the two different domains of my professional life made me riff on some old memes. The first was an article in (I think) the Times Higher about the fraud detection software Turnitin. I do not have any firsthand experience with Turnitin (‘turn-it-in’), as most of our exams use either clinical assessments or MCQs. My understanding is that submitted summative work is uploaded to Turnitin and the text compared with the corpus of text already collected. If strong similarities are present, the work might be fraudulent. A numerical score is provided, but some interpretation is necessary, because in many domains there will be a lot of ‘stock phrases’ that are part of domain expertise, rather than evidence of cheating. How was the ‘corpus’ of text collected? Well, of course, from earlier student texts that had been uploaded.
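The basic idea of this kind of overlap scoring can be sketched in a few lines. To be clear, this is my own illustration, not Turnitin’s actual (proprietary) algorithm: split each document into overlapping word n-grams (‘shingles’) and report the Jaccard overlap between the submission and a corpus document. The function names and the choice of n are mine.

```python
def ngrams(text, n=5):
    """Return the set of overlapping word n-grams ('shingles') in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity_score(submission, corpus_doc, n=5):
    """Jaccard similarity of the two texts' n-gram sets:
    |A ∩ B| / |A ∪ B|, a value between 0 (no shared phrases) and 1 (identical)."""
    a, b = ngrams(submission, n), ngrams(corpus_doc, n)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A real system would have to be far cleverer (normalising punctuation, spotting paraphrase, discounting the ‘stock phrases’ mentioned above), but even this toy version shows why the corpus matters: the score is only as informative as the set of documents you compare against.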

Universities need to pay for this service, because in the age of massification, lecturers do not recognise the writing style of the students they teach. (BTW, as Graham Gibbs has pointed out, the move from formal supervised exams to course work has been a key driver of grade inflation in UK universities).

I do not know who owns the rights to the texts students submit, nor whether they are able to assert any property rights. There may be other companies out there apart from Turnitin, but you can easily see that the more data they collect, the more powerful their software becomes. If the substrate is free, then the costs relate to how powerful their algorithms are. It is easy to imagine how this becomes a monopoly. However, if copies of all the submitted texts are kept by universities, then collectively it would make it easier for a challenger to enter the field. But network effects will still operate.

The other example comes from medicine rather than education. The FT ran a story about the use of ‘machine learning’ to diagnose disease from retinal scans. Many groups are working on this, but this report was about Moorfields in London. I think I read that, as the work was being commercialised, the hospital would have access to the commercial software free of charge. There are several issues here.

Although I have no expert knowledge in this particular domain, I know a little about skin cancer diagnosis using automated methods. First, the clinical material, and the annotation of that material, is absolutely rate-limiting. Second, once the system is commercialised, the more subsequent images that can be uploaded, the better you would imagine the system will become. This of course requires further image annotation, but if we are interested in improving diagnosis, we should keep enlarging the database as long as the costs of annotation are acceptable. As in the Turnitin example, the danger is that the monopoly provider becomes ever more powerful. Again, if the image use remains non-exclusive, then there are lower barriers to entry.

Deep problems

by reestheskin on 23/01/2019

Comments are disabled

News Feature: What are the limits of deep learning? | PNAS

In addition to its vulnerability to spoofing, for example, there is its gross inefficiency. “For a child to learn to recognize a cow,” says Hinton, “it’s not like their mother needs to say ‘cow’ 10,000 times”—a number that’s often required for deep-learning systems. Humans generally learn new concepts from just one or two examples.

There is a nice review of Deep Learning in PNAS. The spoofing referred to is an ‘adversarial patch’ — a patch comprising an image of something else. In the example here, a mini-image of a toaster confuses the AI such that a very large banana is seen as a toaster (the paper is here on arXiv — an image is worth more than a thousand of my words).

Hinton, one of the giants of this field, is of course referring to Plato’s problem: how can we know so much given so little (input)? From the dermatology perspective, humans may still be smarter than the current machines in the real world, but pace Hinton, our training sets need not be so large. But they do need to be a lot larger than n=2. The great achievement of the 19th-century clinician masters was to be able to create concepts that gathered together disparate appearances under one ‘concept’. Remember the mantra: there is no one-to-one correspondence between diagnosis and appearance. The second problem with humans is that they need continued (and structured) practice: the natural state of clinical skills is to get worse in the absence of continued reinforcement. Entropy rules.

Will things change? Yes, but radiology will fall first, then ‘lesions’ (tumours), and then rashes — the latter I suspect after entropy has had its way with me.

Annual Review of the ‘business’ that is ed-tech, by Audrey Watters.

Ed-tech is a confidence game. That’s why it’s so full of marketers and grifters and thugs. (The same goes for “tech” at large.)

Audrey Watters

“criticism and optimism are the same thing. When you criticize things, it’s because you think they can be improved. It’s the complacent person or the fanatic who’s the true pessimist, because they feel they already have the answer. It’s the people who think that things are open-ended, that things can still be changed through thought, through creativity—those are the true optimists. So I worry, sure, but it’s optimistic worry.” Jaron Lanier. We Need to Have an Honest Talk About Our Data

Models of our mind and communities

by reestheskin on 18/12/2018

Comments are disabled

Google’s AI Guru Wants Computers to Think More Like Brains | WIRED

This is from an interview with Geoffrey Hinton who — to paraphrase Peter Medawar’s comments about Jim Watson — has something to be clever about. The article is worth reading in full, but here are a few snippets.

Now if you send in a paper that has a radically new idea, there’s no chance in hell it will get accepted, because it’s going to get some junior reviewer who doesn’t understand it. Or it’s going to get a senior reviewer who’s trying to review too many papers and doesn’t understand it first time round and assumes it must be nonsense. Anything that makes the brain hurt is not going to get accepted. And I think that’s really bad…

What we should be going for, particularly in the basic science conferences, is radically new ideas. Because we know a radically new idea in the long run is going to be much more influential than a tiny improvement. That’s I think the main downside of the fact that we’ve got this inversion now, where you’ve got a few senior guys and a gazillion young guys.

I would make a few comments:

  1. First, the history of neural nets is long: even people like me had heard about them in the late 1980s. The history of ideas is often like that.
  2. The academy is being sidetracked into thinking it should innovate or develop ideas that, whilst important, are not revolutionary. Failure should be the norm, rather than the continued treadmill of grant income and papers.
  3. Scale and genuine discovery — for functioning of peer groups — seldom go together.
  4. Whilst most of the really good ideas are still out there, it is possible to create structures that stop people looking for them.
  5. Hinton makes a very important point in the article with broad relevance. He argues that you cannot judge (or restrict the use of) AI on the basis of whether or not it can justify its behaviour in terms of rules or logic — you have to judge it on its ability to work, in general. This is the same standard we apply to humans, or at least we did, until we thought it wise or expedient to create the fiction that much of human decision making is capable of conscious scrutiny. This applies to medicine, to the extent that clinical reasoning is often a fiction that masters like to tell novices about. Just-so stories, to torment the young with. And elsewhere in the academy, for the outlandish claims that are made for changing human behaviour by signing up for online (“human remains”) courses (TIJABP).

All has been said before, I know, but no apology will be forthcoming.

The importance of obsession

by reestheskin on 16/12/2018

Comments are disabled

How a Welsh schoolgirl rewrote the rules of publishing | Financial Times by Gillian Tett

In 2011, Beth Reeks, a 15-year-old Welsh schoolgirl studying for her GCSE exams, decided to write a teenage romantic novel. So she started tapping on her laptop with the kind of obsessive creative focus – and initial secrecy – that has been familiar to writers throughout history. “My parents assumed I was on Facebook or something when I was on my laptop – or I’d call up a document or internet page so it looked like I was doing homework,” she explained at a recent writers’ convention. “I wrote a lot in secret… and at night. I was obsessed.”

But Reeks took a different route: after penning eight chapters of her boy-meets-girl novel, The Kissing Booth, she posted three of them on Wattpad, an online story-sharing platform… As comments poured in, Reeks turned to social media for more ideas. “I started a Tumblr blog and a Twitter account for my writing. I used them to promote the book…[and] respond to anyone who said they liked the story,” she explained in a recent blog post.

… while Reeks was at university studying physics, her work was turned into an ebook, then a paperback (she was offered a three-book deal by the mighty Random House) and, this year, Netflix released it as a film, which has become essential viewing for many teenage girls.

Norman’s Law of eLearning Tool Convergence

by reestheskin on 22/10/2018

Comments are disabled

Maybe more of a theory than a law, but still:

Any eLearning tool, no matter how openly designed, will eventually become indistinguishable from a Learning Management System once a threshold of supported use-cases has been reached.

They start out small and open. Then, as more people adopt them and the tool is extended to meet the additional requirements of the growing community of users, eventually things like access management and digital rights start getting integrated. Boil the frog. Boom. LMS.

Norman’s Law of eLearning Tool Convergence – D’Arcy Norman dot net

Publishers and universities.

by reestheskin on 07/10/2018

Comments are disabled

It is easy to make facile comparisons between universities, publishing, and the internet. But it is useful to explore the differences and similarities, even down to the mundane production of ‘content’.

This is from Frederic Filloux from the ever-wonderful Monday Note:

Dear Publishers, if you want my subscription dollars (or euros), here is what I expect…

The biggest mistake of news publishers is their belief that the presumed uniqueness of their content is sufficient to warrant a lifetime of customer loyalty.

The cost of news production is a justification for the price of the service; in-depth, value-added journalism is hugely expensive. I’m currently reading Bad Blood, John Carreyrou’s book about the Theranos scandal (also see Jean-Louis’s column from last week about it). This investigation cost the Wall Street Journal well over a million dollars. Another example is The New York Times, which spends about $200m a year on its newsroom. The cost structure of news operations is the main reason why tech giants will never invest in this business: the economics of producing quality journalism are incompatible with the quantitative approach used in tech, which relies on Key Performance Indicators or Objectives and Key Results.

In France, marketers from the French paid-TV network Canal+ prided themselves on their subscription management: “Even death isn’t sufficient to cancel a subscription,” as one of them told me once.

Carrot weather gets it right again

by reestheskin on 29/09/2018

Comments are disabled

Facebook accounts hacked? I thought that was the feature not the bug.

Carrot weather — the weather app with attitude.

Nature cannot be fooled — only investors

by reestheskin on 24/09/2018

Comments are disabled

Two quotes from Bad Blood: Secrets and Lies in a Silicon Valley Startup, by John Carreyrou. Only without much silicon.

“Henry, you’re not a team player,” she said in an icy tone. “I think you should leave right now.” There was no mistaking what had just happened. Elizabeth wasn’t merely asking him to get out of her office. She was telling him to leave the company—immediately. Mosley had just been fired.

He also maintained that Holmes was a once-in-a-generation genius, comparing her to Newton, Einstein, Mozart, and Leonardo da Vinci.

The reality distortion field lived on. Medicine is indeed tricky.

It is all about incentives

by reestheskin on 21/09/2018

Comments are disabled

This is a scary story. But the lesson is (yet again) our inability to understand what makes humans tick.

The Untold Story of NotPetya, the Most Devastating Cyberattack in History | WIRED

How Maersk was taken down by Russian malware, and how it recovered. The passage that got the attention is the bit about flying a domain controller backup in from Ghana (the only one that survived). The one that matters is that they were still running Windows 2000 on some servers and hadn’t carried out a proposed security revamp because it wasn’t in the IT managers’ KPIs and so wouldn’t help their bonuses. Link  

Via Ben Evans

Radiologists and platforms

by reestheskin on 20/08/2018

Comments are disabled

It is not only taxi drivers that are being “uberised” but radiologists, lawyers, contractors and accountants. All these services can now be accessed at cut rates via platforms.

FT

The NHS became such a platform, for good and bad. That is the real lesson here. The tech is an amplifier, but the fundamentals were always about power.

MOOCs revisited

by reestheskin on 09/07/2018

Comments are disabled

One selling point of MOOCs (massive online open courses) has been that students can access courses from the world’s most famous universities. The assumption—especially in the marketing messages from major providers like Coursera and edX—is that the winners of traditional higher education will also end up the winners in the world of online courses.

But that isn’t always happening.

In fact, three of the 10 most popular courses on Coursera aren’t produced by a college or university at all, but by a company. That company—called Deeplearning.ai—is a unique provider of higher education. It is essentially built on the reputation of its founder, Andrew Ng, who teaches all five of the courses it offers so far. Link

The MOOC story is like so much of tech — or drug discovery for that matter. Finding a use for a drug invented for another reason often offers the biggest payback. This story has barely begun.

AI winter, revisited

by reestheskin on 25/06/2018

Comments are disabled

Hype is not fading, it is cracking.

I like the turn of phrase. It is from a post on the coming AI winter. Invest wisely.

AI winter – Addendum – Piekniewski’s blog

Pave paradise, and put up a parking lot

by reestheskin on 19/06/2018

Comments are disabled

A Magic Shield That Lets You Be An Assh*le? – NewCo Shift

The Internet of the 1990s was about choosing your own adventure. The Internet of right now over the last 10 years is about somebody else choosing your adventure for you.

link

“They took all the trees, put ’em in a tree museum, and they charged the people a dollar and a half just to see ’em…”

Images aren’t everything — well, sometimes, maybe they…

by reestheskin on 12/06/2018

Comments are disabled

“It’s quite obvious that we should stop training radiologists,” said Geoffrey Hinton, an AI luminary, in 2016. In November Andrew Ng, another superstar researcher, when discussing AI’s ability to diagnose pneumonia from chest X-rays, wondered whether “radiologists should be worried about their jobs”. Given how widely applicable machine learning seems to be, such pronouncements are bound to alarm white-collar workers, from engineers to lawyers.

Economist

The Economist’s view is (rightly) more nuanced than Hinton’s statement on this topic might suggest, but this is real. For my own branch of clinical medicine, too. The interesting thing for those concerned with medical education is whether we will see the equivalent of the Osborne effect (and I don’t mean that Osborne effect).

Power, order and scale

by reestheskin on 31/05/2018

Comments are disabled

This is some text I recognise, but I had forgotten its source: Bruce Schneier.

Technology magnifies power in general, but the rates of adoption are different. Criminals, dissidents, the unorganized—all outliers—are more agile. They can make use of new technologies faster, and can magnify their collective power because of it. But when the already-powerful big institutions finally figured out how to use the Internet, they had more raw power to magnify.

This is true for both governments and corporations. We now know that governments all over the world are militarizing the Internet, using it for surveillance, censorship, and propaganda. Large corporations are using it to control what we can do and see, and the rise of winner-take-all distribution systems only exacerbates this.

This is the fundamental tension at the heart of the Internet, and information-based technology in general. The unempowered are more efficient at leveraging new technology, while the powerful have more raw power to leverage. These two trends lead to a battle between the quick and the strong: the quick who can make use of new power faster, and the strong who can make use of that same power more effectively.

Bruce Schneier

‘The most important thing humanity has ever built.’

by reestheskin on 28/05/2018

Comments are disabled

Well, this was the modest description of a ‘new’ way to test blood. Except it wasn’t. The reality distortion field in hyperspace. If you don’t know the Theranos story — or doubt the importance of real journalism — have a look.

Link

The journalist who broke the story, John Carreyrou, has a book coming out soon. Jean-Louis Gassée, a shrewd observer of Silicon Valley, has a nice piece about it. Note the turtle neck.

Just talking ‘bout

by reestheskin on 26/05/2018

Comments are disabled

In this week’s privacy nightmare, an Oregon couple discovered their Amazon Echo smart speaker recorded their conversation and sent the audio to an acquaintance — without their knowledge.

The claim seemed improbable, until the company confirmed it really happened. Amazon said it was reviewing how its smart speakers work to avoid similar situations.

Link

Facebook, as Big Tobacco for ‘the next billion’

by reestheskin on 11/05/2018

Comments are disabled

Exactly in the same way that Big Tobacco has been free to fill the lungs of Asian and African populations, with little interference from local health administrations, Facebook will have a free hand to lock up these markets. (If you find my comparison with the tobacco industry exaggerated, just ask the Rohingyas or people in the Philippines about the toxicity of Facebook to democracy — or read this Bloomberg Businessweek piece, “What happens when the government uses Facebook as a weapon?”)

Mark Zuckerberg’s long game: the next billion – Monday Note

Focus on the kids: they are the vulnerable ones

by reestheskin on 04/05/2018

Comments are disabled

I used to think this whole topic was overblown. But then again, I once thought those who foresaw the obesity epidemic were selling something. Wrong on both counts.

Former Google Design Ethicist: Relying on Big Tech in Schools Is a ‘Race to the Bottom’ | EdSurge News

“I see this as game over unless we change course,” says Tristan Harris, a former ethicist at Google who founded the Center for Humane Technology. “Supercomputers play chess against your mind to extract the attention out of you. The stock price has to keep going up, so they point it at your kid and start extracting the attention out of them. You don’t want an extraction-based economy powered by AI, playing chess against people’s minds. We cannot win in that world.”

In an interview with EdSurge, Harris noted that the focus of their campaign started with children because they were the most vulnerable population. He says that particularly children in schools had little agency over whether they opted into or out of a technology platform because of pressure from both peers and educators handing out assignments.

Our struggle with Big Tech to protect trust and truth

by reestheskin on 03/05/2018

Comments are disabled

Some nice turns of phrase and perspective from this article in the FT

In 1829, the great Scottish historian and essayist Thomas Carlyle wrote: “Were we required to characterise this age of ours by any single epithet, we should be tempted to call it . . . the Mechanical Age. It is the Age of Machinery . . . the age which, with its whole undivided might, forwards, teaches, and practices the greater art of adapting means to ends.”

He continued with a lament for older ways of doing and being: “On every hand, the artisan is driven from his workshop, to make room for a speedier, inanimate one. The shuttle drops from the fingers of the weaver and falls into iron fingers that ply it faster. The sailor furls his sail, and lays down his oar, and bids a strong unwearied servant . . . bear him through the waters.”

It is a measure of just how much speedier our age is that no one today will take the time to write or read such comparatively languorous prose. What is striking about Carlyle’s writing from today’s vantage point is how early in the industrial revolution he mounted a protest against it. By 1829, the steam engine was entering its ­heyday, but the explosion of iron, steel, coal and oil that we associate with the industrial age was visible only on the horizon.

FT

Fear of the known

by reestheskin on 02/04/2018

Comments are disabled

Universities are certainly putting their courses online. The question is “why?” I talked last week with a University President whom I have known for many years and asked him why he was building online courses. His answer, unsurprisingly, was “fear.”

Roger Schank

This is an old quote, but still redolent.