It is perhaps surprising that it has taken this long for Google to get in on the telemedicine boom. The company has just obtained a CE mark for its first medical device – a web-based application that can help triage skin complaints.
The Derm Assist software uses artificial intelligence to assess images a person takes of their skin concern, such as a mole or a rash. Data suggest the app is reasonably accurate. But with Google intending to offer the software free of charge, it is not clear how the company expects to profit from it, and privacy concerns have been raised over patients’ data.
Google stipulates that Derm Assist is not a diagnostic tool. After the user uploads the images, they answer a few questions about their skin type, how long the issue has persisted and any other symptoms, to help the tool narrow down the possibilities. The app then gives a list of possible matching conditions, along with information and similar images from the web. It is then up to the user whether to follow up with a doctor. The company has made a point of stating that the app works on a wide range of skin tones.
In a sense this is a version of what people often do already – that is, Google their symptoms to try to find out what might be wrong with them. Indeed, Google says there are almost ten billion Google searches related to skin, nail and hair issues each year. But Derm Assist has the stamp of a regulator, being CE marked as a class I medical device in the EU.
Adjust your settings
To Google’s credit, data on the app have been published in peer-reviewed journals. But these studies do not assess the consumer setting in which Derm Assist will be launched; instead, they look at its utility in helping healthcare professionals make diagnoses.
A study published in Nature Medicine a year ago suggested that the app was as good as dermatologists at identifying 26 common skin conditions, including melanoma, basal cell carcinoma and squamous cell carcinoma.
Specifically, on a dataset of 963 validation cases, the software was non-inferior to six dermatologists and superior to six primary care physicians and six nurse practitioners. The top-1 accuracy – the proportion of samples where the top choice of diagnosis matched the true diagnosis – was 0.66 for Derm Assist, 0.63 for the dermatologists, 0.44 for the primary care doctors and 0.40 for the nurses.
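The top-1 metric described above can be made concrete with a short sketch. This is purely illustrative: the function, the toy case labels and the ranked candidate lists below are invented, not drawn from the study.

```python
# Illustrative sketch of the top-1 accuracy metric: the fraction of
# cases where the highest-ranked candidate diagnosis matches the
# reference diagnosis. All labels here are hypothetical examples.

def top1_accuracy(predictions, truths):
    """predictions: ranked candidate diagnoses per case (best first);
    truths: the reference diagnosis for each case."""
    hits = sum(1 for ranked, truth in zip(predictions, truths)
               if ranked and ranked[0] == truth)
    return hits / len(truths)

# Three toy cases: ranked candidates versus reference diagnosis
preds = [["melanoma", "naevus"],
         ["eczema", "psoriasis"],
         ["basal cell carcinoma", "melanoma"]]
refs = ["melanoma", "psoriasis", "basal cell carcinoma"]

print(top1_accuracy(preds, refs))  # 2 of 3 top choices match: ~0.67
```

A score of 0.66, as reported for Derm Assist, means the tool's first-choice diagnosis was correct in about two-thirds of the validation cases.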
Newer research, published in JAMA this month, ploughs a similar furrow, showing that use of the technology by non-specialist healthcare workers was significantly associated with higher agreement with reference diagnoses. For GPs, diagnostic agreement rose by 10 percentage points, from 48% to 58%; for nurse practitioners, it rose by 12 percentage points, from 46% to 58%.
A pilot launch of Derm Assist in the EU is pencilled in for the end of this year. The software will be available free, prompting some to wonder whether Google will attempt to monetise users’ data.
Google has denied this, stating that the information and photos provided are private and encrypted, and will not be used to target advertising. It will save images to further train the Derm Assist algorithm, but only if users give explicit permission.
The uncertainties with this technology go beyond whether, or how, Google will make money from it. Given that the clinical data on Derm Assist do not assess it in its approved setting, the crucial point is that it remains unproven that the programme will actually improve care.