Google Lens can perform a skin analysis

Google announced this week that its product "Google Lens", which is included in Google's camera app on Android (but is also available on iOS), has been expanded with various new functions. One of them allows the analysis of photos of the skin, to detect abnormalities there that may warrant clarification by a doctor.



The announcement was already made on June 14, 2023 in the Google blog post 8 ways Google Lens can help make your life easier. There, Google introduces eight new features that Google Lens will support in the future. These range from explanations of objects you are currently seeing (and photographing), to the translation of traffic signs into over 100 languages, to various image analyses. In short: you should be able to search for whatever the camera sees.

Skin analysis with Google Lens

One of these features is the analysis of photos of the skin for anomalies. Google writes the following about it:

This new Google Lens feature looks for skin conditions that are visually similar to what you see on your skin. You can crop out the affected area, and the results are then displayed in a carousel under "Visual Matches."

The company notes that "search results are for informational purposes only and do not represent a diagnosis. Consult your physician for guidance." In addition to photos of skin, the feature also works for "blisters on your lip, a line on your nails, or hair loss on your head." Google states that it developed this feature because "describing a strange mole or rash on the skin is hard to do with words alone."

The feature is intended to help people get at least a rough analysis of skin lesions using Google Lens. It is made possible by Google's image search, which can recognize and display visually similar images. The same capability underlies other features presented in the post above (such as shopping for an item you've seen).
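Google has not published how Lens implements this "visual matches" search. Conceptually, though, such systems encode each image as a numeric feature vector (in practice produced by a neural image encoder) and rank stored images by similarity to the query vector. A minimal sketch with made-up three-dimensional vectors and cosine similarity, purely for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def visual_matches(query_vec, index, top_k=3):
    """Return the top_k index entries most similar to the query vector."""
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in index.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top_k]

# Toy index of reference images; real systems use vectors with hundreds
# of dimensions computed by an image encoder, not hand-written numbers.
index = {
    "mole_a.jpg":    (0.9, 0.1, 0.0),
    "rash_b.jpg":    (0.1, 0.9, 0.2),
    "blister_c.jpg": (0.2, 0.8, 0.5),
}
query = (0.15, 0.85, 0.3)  # vector for the user's photo
print(visual_matches(query, index))
```

Here the query vector is closest to "rash_b.jpg", so that entry would appear first in the result carousel. The ranking only says "looks similar", which is exactly why such a hit is no diagnosis.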



However, the Google Lens feature has several problem areas. For one, the images are transferred to Google servers, where AI software then analyzes them. Not everyone wants this. The bigger problem, however, is the question of how reliable the whole thing is. If the AI behind Google Lens doesn't report anything for a picture of the skin, that doesn't mean "there's nothing there". And a hit in an image database doesn't have to mean it's skin cancer. Ultimately, only a doctor (dermatologist) can assess a skin change after an examination, so skin cancer screening remains important. On the other hand, there is hope that people who don't go for screening may be alerted to abnormalities by the Google Lens feature and then have them clarified by a doctor.



This entry was posted in Software.
