Show HN: I'm a dermatologist and I vibe coded a skin cancer learning app

332 points by sungam | 212 comments | 9/7/2025, 10:38:29 AM | molecheck.info
Coded using Gemini Pro 2.5 (free version) in about 2-3 hours.

Single file including all html/js/css, Vanilla JS, no backend, scores persisted with localStorage.

Deployed using ubuntu/apache2/python/flask on a £5 Digital Ocean server (but could have been hosted on a static hosting provider as it's just a single page with no backend).

Images / metadata stored in an AWS S3 bucket.
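
For reference, a minimal sketch of what serving a single self-contained page like this from Flask might look like; the filenames are assumptions rather than details from the actual project, and a static host would work just as well since all logic runs in the browser.

    # Minimal sketch (assumed filenames): serve one self-contained HTML file with Flask.
    # All quiz logic lives in the page itself; scores stay in the browser's localStorage
    # and lesion images are fetched directly from the S3 bucket.
    from flask import Flask, send_from_directory

    app = Flask(__name__)

    @app.route("/")
    def index():
        # "static/index.html" is a placeholder for the single file containing all HTML/JS/CSS.
        return send_from_directory("static", "index.html")

    if __name__ == "__main__":
        app.run(port=8000)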

Comments (212)

jmull · 12h ago
I kind of love the DIY aspect of AI coding.

A short while ago, a dermatologist with this idea would have had to find a willing and able partner to do a bunch of work -- meaning that most likely it would just remain an idea.

This isn't just for non-tech people either -- I have a decades-long list of ideas I'd like to work on but simply do not have time for. So now I'm cranking up the ol' AI agents and seeing what I can do about it.

Waterluvian · 5h ago
I feel like the name “vibe code” is really the only issue I have. Enabling everyone to program computers to do useful things is very very good.
sollewitt · 3h ago
It captures not understanding what you’re doing crossed with limited AI understanding which means the whole thing is running on vibes.
AuthAuth · 4h ago
I wish that computers were designed in a way that pushed users to script more. It's such a powerful ability that would benefit almost every worker.
Waterluvian · 4h ago
Apple has always been pretty good at this. AppleScript, Automator, Shortcuts. I did all kinds of cool stuff in OSX 10.4 back before I wrote any traditional code.
mbreese · 1h ago
Before that was HyperCard. It was always amazing to me the types of applications that could be written with HyperCard.

In a similar way, VBA was amazing in MS Office back in the day. If you ever saw someone who was good at Visual Basic in Excel, it’s impressive the amount of work that could get done in Excel by a motivated user who would have been hesitant to call themselves a programmer.

sleepybrett · 2h ago
Applesoft Basic
farai89 · 1h ago
I believe this captures it well. There are many people that would have previously needed to hire dev shops to get their ideas out and now they can just get them done faster. I believe the impact will be larger in non-tech sectors.
utyop22 · 41m ago
Most ideas suck and never deserve to see the light of day.

True productivity is when what is produced is of benefit.

jmkni · 10h ago
Same, I've had ideas rattling around in my brain for years which I've just never executed on, because I'm 'pretty sure' they won't work and it's not been worth the effort

I've been coding professionally for ~20 years now, so it's not that I don't know what to do, it's just a time sink

Now I'm blasting through them with AI and getting them out there just in case

They're a bit crap, but better than not existing at all, you never know

citizenpaul · 8h ago
>They're a bit crap, but better than not existing at all, you never know

I don't agree. I think that because of LLM/vibe coding my random ideas I've actually wasted more time than if I'd done them manually. The vibe code is, as you said, often crap, and often it's only after I've spent a lot of time on it that I realize there are countless subtle errors which mean it's not actually doing what I intended at all. I've learned nothing and made a pointless app that doesn't even do anything but looks like it does.

That's the big allure that has been keeping "AI" hype floating. It always seems so dang close to being a magic wand. Then, after time spent reviewing with a critical eye, you realize it has been tricking you, like a janitor just sweeping dirt under the rug.

At this point I've relegated LLMs to advanced find-and-replace and formatted data structuring ("take this list and make it into JSON"), and that's about it. There are basically already tools that do everything else LLMs do, and do it better.

I can't count at this point how many times "AI" has taken some sort of logic I want, then made a bunch of complex-looking stuff that takes forever to review, and I find out it fudged the logic to simply always be true/false when it's not even a boolean problem.

anthonypasq96 · 1h ago
Brother, no one cares. If LLMs made something exist that did not exist previously, they worked. It doesn't matter if you could have done it faster by hand if doing so would have resulted in the program not existing.
ecocentrik · 8h ago
I'm a big fan of barriers to entry and using effort as a filter for good work. This derma app could be so much better if it actually taught laypeople to identify the difference between carcinomas, melanomas and non-cancerous moles instead of just being a fixed loop quiz.
ptero · 6h ago
IMO it is better to keep the barriers to entry as low as possible for prototyping. Letting domain experts build what they have in mind themselves, on a shoestring, is a powerful ability.

Most such prototypes get tossed because of a flaw in the idea, not because they lacked professional software help. If something clicks the prototype can get rebuilt properly. Raising the barriers to entry means significantly fewer things get tried. My 2c.

bluefirebrand · 5h ago
> IMO it is better to keep the barriers to entry as low as possible for prototyping

Not in an industry where prototypes very often get thrown into production because decision makers don't know anything about the value of good tech, security, etc

goosejuice · 3h ago
That's completely fine for most software.
AlecSchueler · 5h ago
Same here, that's why I only ever code in assembly and recommend everyone else to do the same.
jmkni · 8h ago
Well I mean more low-brow stuff like "Pint?", a social media app to find other people to go for a pint with :)
sungam · 10h ago
Yes I agree - I could probably have worked out how to do it myself but it would have taken weeks and realistically I would never have had the time to finish it.
amelius · 10h ago
Well, image classification tasks don't require coding at all.

You just need one program that can read the training data, train a model, and then do the classification based on input images from the user.

This works for basically any kind of image, whether it's dogs/cats or skin cancer.

chaps · 9h ago
...none of this requires coding?
amelius · 9h ago
No additional coding.

You can take the code from a dog/cat classifier and use it for anything.

You only need to change the training data.
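
As an illustration of that claim, here is a minimal transfer-learning sketch with torchvision where only the training folder changes between a dogs/cats task and a lesion task; the paths and hyperparameters are hypothetical, and the replies below explain why real-world lesion classification is not this simple.

    # Minimal sketch of the "swap the training data" workflow (assumed paths/hyperparameters).
    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
    # Point this at dogs/cats or at benign/malignant lesion folders -- the code is identical.
    train_ds = datasets.ImageFolder("data/train", transform=tfm)
    loader = DataLoader(train_ds, batch_size=32, shuffle=True)

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # new head for the new labels

    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for epoch in range(3):                      # tiny illustrative training loop
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()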

chaps · 9h ago
I've done enough image classification stuff that, nah. If all you care about is high level confirmation with high error rates, sure. But more complex tasks like, "Are these two documents the same?" are much, much harder and the failure modes are subtle.
amelius · 9h ago
I think most experts wouldn't approach this problem as an image classification problem ...

And, more importantly, I don't think you'll see good results either from a vibe-coded solution.

So I don't think your comment makes sense here.

jacquesm · 7h ago
> I think most experts wouldn't approach this problem as an image classification problem ...

Indeed. It is first and foremost a statistics and net patient outcomes problem.

The image classification bit - to the best of the current algorithms' abilities - is essentially a solved problem (even if it isn't quite that simple), and when better models become available you plug those in instead. There is no innovation there.

The hard part is the rest of it. And without a good grounding in medical ethics and statistics that's going to be very difficult to get right.

chaps · 8h ago
It's a problem that has many image classification components to it.

"Vibe coding" does a surprisingly good job at this problem.

Yes it does. :)

amelius · 8h ago
Maybe, but you have broadened the scope from a simple image classification problem to a pipeline of multiple image classification steps.
chaps · 7h ago
Friend, we're talking about classifying skin cancer. The topic is already quite broad.
amelius · 7h ago
I think it is a pointless discussion because at some level we are both right.

I'm not going to argue with the idea that a pre-made classifier can be improved upon by experts.

But pre-made classifiers exist and are useful for a very large variety of tasks. This was the original point.

runako · 9h ago
> No additional coding.

> You can take the code from

https://xkcd.com/2501/

More seriously, for most non-programmers, even typing into a console is "coding."

growingkittens · 9h ago
I am a "noncoder" because of a number of reasons. My best friend is a "coder" and still starts instructions with "It's easy! Just open the terminal...".

Unfortunately, I do advanced knowledge work, and the tools I need technically often exist...if you're a coder.

Coding is not that accessible. The intermediary mental models and path to experience required to understand a coding task are not available to the average person.

yread · 9h ago
Why? I know tons of coding MDs. A pathologist hacking the original Prince and adding mods, also just in assembly. A molecular pathologist organizing their own pipelines and ETLs.

Lots of people like computers but earn a living doing something else

jonahx · 9h ago
He wasn't saying no coding MDs existed. Just that, generally speaking, most MDs would have had to partner with a technical person, which is true. And is now less true than it was before.
jjallen · 12h ago
Very cool. I learned a lot as a non dermatologist but someone with a sister who has had melanoma at a very young age.

I went from 50% to 85% very quickly. And that’s because most of them are skin cancer and that was easy to learn.

So my only advice would be to make closer to 50% actually skin cancer.

Although maybe you want to focus on the bad ones and get people to learn those more.

This was way harder than I thought this detection would be. Makes me want to go to a dermatologist.

alanfranz · 6h ago
> So my only advice would be to make closer to 50% actually skin cancer.

If I were to code this for "real training" of a dermatologist, I'd make it closer to the "real world" rate. For a dermatologist, I'd imagine that probably just 1 out of 100 (or something like that) of the skin lesions people suspect are cancerous actually are.

With the current dataset, there are just too many cancerous images. This makes it kind of easy to just flag something as "cancerous" and still retain a good "score" - but that defeats the point: if, as a dermatologist, you send _too many_ people without cancer for further exams, then you're negating the usefulness of what you're doing.

mewpmewp2 · 2h ago
It needs a specific scoring system where each false positive causes a small score drop but a false negative causes a huge one. At the same time, like you said, positives would be much rarer. It should be easy to ask an LLM to vibe code that so it simulates the real world and its consequences.
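
A minimal sketch of such an asymmetric scoring rule; the labels and point values below are illustrative assumptions, not the app's actual scoring.

    # Hypothetical asymmetric scoring rule (illustrative values, not from the app):
    # missing a cancer (false negative) costs far more than over-calling one.
    PENALTIES = {
        ("cancer", "not_concerned"): -20,   # false negative: large drop
        ("benign", "concerned"): -2,        # false positive: small drop
    }
    REWARDS = {
        ("cancer", "concerned"): +5,
        ("benign", "not_concerned"): +1,
    }

    def score_answer(truth: str, answer: str) -> int:
        """Return the score change for one guess under the asymmetric rule."""
        return REWARDS.get((truth, answer), PENALTIES.get((truth, answer), 0))
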
sungam · 12h ago
Thanks, this is a good point - I think a 50:50 balance of cancer versus harmless lesions would be better and will change this in a future version.

Of course, in reality the vast majority of skin lesions and moles are harmless and the challenge is identifying those that are not. I think that even a short period of focused training like this can help the average person to identify a concerning lesion.

wizzwizz4 · 5h ago
jjallen · 4h ago
Thought about this some more. I think you want to start at 100% or high so people actually learn what needs to be learned: what malignant skin conditions actually look like.

And then once they have learned, it gets progressively harder and harder. Basically, the closer the mix is to 50%, the harder it will be to score higher than chance.
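
One possible way to express that curriculum; the mapping below is a hypothetical example, not something the app implements.

    def cancer_fraction(running_accuracy: float) -> float:
        """Hypothetical difficulty curve: show mostly cancers to a beginner so they
        learn what malignancy looks like, then drift toward a 50:50 mix (hardest to
        beat by chance) as their running accuracy improves."""
        running_accuracy = min(max(running_accuracy, 0.0), 1.0)
        return 1.0 - 0.5 * running_accuracy   # 100% cancers at accuracy 0, 50% at accuracy 1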

loeg · 6h ago
I found the first dozen to be mostly cancer and then the next dozen were mostly non-cancer. (Not sure if it's randomized.) (Also, I'm really bad at identifying cancerous vs non-cancerous skin lesions.)
sungam · 6h ago
It is randomized so probably just bad luck! FWIW I get a high score and another skin cancer doctor who commented also gets a high score so it is possible to make the diagnosis in most cases on the basis of these images.
globalise83 · 4h ago
As someone with literally every single possible variation of skin blemish, mole and God knows what else, this scares the living hell out of me.
abootstrapper · 2h ago
Get a yearly full body skin check from a dermatologist. It’s a common thing. I’ve been doing it for years because of my skin type. They caught early Basal cell carcinoma the last time I went.
mewpmewp2 · 2h ago
Yeah, I only have one concerning spot, but it still made me spend 20 minutes googling the difference between dermatofibroma and basal cell cancer. I think it is a dermatofibroma, but I guess it's a good point anyway to get it checked out.
vindex10 · 11h ago
Hi! That's a really useful tool!

I wish it also explained the decision-making process - how to understand from the picture what the right answer is.

I'm really getting lost between melanoma and seborrheic keratosis / nevus.

I went through ~120 pictures, but couldn't learn to distinguish those.

Also, the guide in the burger menu leads to a page that doesn't exist: https://molecheck.info/how-to-recognise-skin-cancer

sungam · 10h ago
This is very helpful feedback. I will add some more information to help with the diagnosis and add an article in the burger menu with a detailed explanation.

Being honest, I didn't expect anyone apart from a few of my patients to use the app and certainly did not expect the front page of HN!

jgilias · 7h ago
Hey!

Thanks for making this! A bit more polish and this is something I’d make sure everyone in my family has played with.

Imagine a world where every third person is able to recognise worrying skin lesions early on.

jgilias · 7h ago
Also came to the same conclusion. I want a mode where 50% of the set are melanomas, and the other 50% are “brown benign things”.
sungam · 6h ago
Will add this in next version!
jacquesm · 7h ago
Nice job. Now you really need to study up on the statistics behind this and you'll quickly come to the conclusion that this was the easy part. What to do with the output is the hard part. I've seen a start-up that made their bread and butter on such classifications; they did an absolutely great job of it but found the problem of deciding what to do with such an application, without ending up with net negative patient outcomes, to be far, far harder than the classification problem itself. The error rates, no matter how low, are going to be your main challenge; both false positives and false negatives can be extremely expensive, both in terms of finance and in terms of emotion.
sungam · 6h ago
Thanks for your comment - the purpose of this app is patient education rather than diagnosis but I will definitely have a look at the relevant stats in more detail!
jacquesm · 5h ago
The risk I think is that people will not understand that that is your goal, instead they will use it to help them diagnose something they might think is suspicious.

They will go through your images until they get a good score, believe themselves an expert, and proceed to diagnose themselves (and their friends).

By the time you have an image set that is representative and that will actually educate people to the point where they know what to do and what not to do you've created a whole raft of amateur dermatologists. And the result of that will be that a lot of people are going to knock on the doors of real dermatologists who might tell them not to worry about something when they are now primed to argue with them.

I've seen this pattern before with self diagnosis.

thebeardisred · 2h ago
To that end I quickly learned something that AI models would as well (which isn't your intention):

Pictures with purple circles (e.g. faded pen ink on light skin outlining the area of concern) are a strong indicator of cancer. :wink:

lukko · 12h ago
I'm a doctor too and would love to hear more about the rationale and process for creating this.

It's quite interesting to have a binary distinction: 'concerned vs not concerned', which I guess would be more relevant for referring clinicians, rather than getting an actual diagnosis. Whereas a multiple-choice naming task ('BCC vs melanoma') would be more of a learning tool, useful for medical students.

Echoing the other comments, but it would be interesting to match the cards to the actual incidence in the population or in primary care - although it may be a lot more boring with the amount of harmless naevi!

sungam · 9h ago
Thanks for your comment. The main motivation for me in developing the app was that lots of my patients wanted me to guide them to a resource that can help them improve their ability to recognise skin cancer and, in my view, a good way to learn is to be forced to make a decision and then receive feedback on that decision.

For the patient I think the decision actually is binary - either (i) I contact a doctor about this skin lesion now or (ii) I wait for a bit to see what happens, or do nothing. In reality most skin cancers are very obvious even to a non-expert, and the reasons they are missed are that patients are not checking their skin or have no idea what to look for.

I think you are right about the incidence - it would be better to have a more balanced distribution of benign versus malignant, but I don't think it would be good to just show 99% harmless moles and 1% cancers (which is probably the accurate representation of skin lesions in primary care) since it would take too long for patients to learn the appearance of skin cancer.

jazoom · 7h ago
> most skin cancers are very obvious even to a non-expert and the reason they are missed are that patients are not checking their skin or have no idea what to look for

I am a skin cancer doctor in Queensland and all I do is find and remove skin cancers (find between 10 and 30 every day). In my experience the vast majority of cancers I find are not obvious to other doctors (not even seen by them), let alone obvious to the patient. Most of what I find are BCCs, which are usually very subtle when they are small. Even when I point them out to the patient they still can't see them.

Also, almost all melanomas I find were not noticed by the patient and they're usually a little surprised about the one I point to.

In my experience the only skin cancers routinely noticed by patients are SCCs and Merkel cell carcinomas.

With respect, if "most skin cancers are very obvious even to a non-expert" I suggest the experts are missing them and letting them get larger than necessary.

I realise things will be different in other parts of the world and my location allows a lot more practice than most doctors would get.

Update: I like the quiz. Nice work! In case anyone is wondering, I only got 27/30. Distinguishing between naevus and melanoma without a dermatoscope on it is sometimes impossible. Get your skin checked.

sungam · 6h ago
Thanks for your kind words regarding the app, and well done for getting such a high score! I agree that BCC is often subtle. My practice is also largely focused on skin cancer. I would say that the majority of melanomas (and SCCs) that I diagnose would be obvious to a patient who underwent a short period of focused training and checked their skin regularly. A possible explanation for the difference in our experience is that the incidence of skin cancer (and also of atypical but benign moles) is a lot higher in Australia than in the UK.
jazoom · 5h ago
There would be quite the difference in our patient demographics.

I have quite a few patients from the UK who have had several skin cancers. Invariably they went on holidays to Italy or Spain as a child and soaked up the sun.

Keep up the great work.

meindnoch · 9h ago
sungam · 9h ago
According to the metadata supplied with the dataset yes

Could definitely be a misclassification, however a small proportion of moles that look entirely harmless to the naked eye and under the dermatoscope (skin microscope) can be cancerous.

For example, have a look at these images of naevoid melanoma: https://www.google.com/search?tbm=isch&q=naevoid+melanoma

This is why dermatology can be challenging and why AI-based image classification is difficult from a liability/risk perspective

I was previously clinical lead for a melanoma multidisciplinary meeting and 1-2 times per year I would see a patient with a melanoma that presented like this, and looking back at previous photos there were no features that would have worried me.

The key thing that I emphasise to patients is that even if a mole looks harmless it is important to monitor for any signs of change since a skin cancer will almost always change in appearance over a period of several months

jonahx · 8h ago
> however a small proportion of moles that look entirely harmless to the naked eye and under the dermatoscope (skin microscope) can be cancerous.

That is very scary.

So the only way to be sure is to have everything sent to the lab. But I'm guessing cost/benefit of that from a risk perspective make it prohibitive? So if you're an unlucky person with a completely benign-presenting melanoma, you're just shit out of luck? Or will the appearance change before it spreads internally?

sungam · 8h ago
This is why dermatology involves risk management not just image interpretation. Yes the lesion will likely change with time. Realistically yes, if you have a melanoma that looks like a harmless mole then the diagnosis is likely to be delayed. But remember that these are a tiny proportion of all skin cancers and you are much more likely to get some other form of cancer - most of which occur internally and cannot be seen at all.
kmoser · 8h ago
This is a good example of what I find frustrating as a patient. Sure, cancers like that may be a tiny proportion of all skin cancers, but if I have it then it's 100% of my skin cancers. And given how serious skin cancer can be, I'd at least want my doctor to let me know how I could get this lesion tested, even if it's out of my own pocket.
daedrdev · 5h ago
The risk of misdiagnosis, and thus unnecessary treatment, can mean that such testing can actually increase your chance of dying or decrease your life expectancy. It depends on the case, but it's why we don't test generally for cancers unless someone is high risk (such as being old).
sungam · 7h ago
I agree with you - if a patient is concerned by a specific skin lesion and requests removal then I will support this even if it appears harmless, particularly if it is new or changing.
48terry · 5h ago
> According to the metadata supplied with the dataset yes

"idk but that's what it says" somehow this does not inspire confidence in the skin cancer learning app.

jonahx · 9h ago
Yeah that seems likely to be a misclassification...
DrewADesign · 12h ago
This is awesome. Great use of AI to realize an idea. Subject matter experts making educational tools is one of the most hopeful things to come out of AI.

It’s just a bummer that it’s far more frequently used to pump wealth to tech investors from the entire class of people that have been creating things on the internet for the past couple of decades, and that projects like this fuel the “why do you oppose fighting cancer” sort of counter arguments against that.

jacquesm · 7h ago
On the contrary. There is a whole raft of start-ups around this idea and other related ones. And almost all of them have found the technical challenges manageable, and the medical and ethical challenges formidable.
DrewADesign · 6h ago
I’m not exactly sure what in my comment you’re responding to, here: My appreciation that a subject matter expert is now capable of creating a tool to share their knowledge, that tech investors are using AI to siphon money from people that actually make things, or that good projects like this are used to justify that siphoning?
jacquesm · 6h ago
You wrote:

"This is awesome. Great use of AI to realize an idea. Subject matter experts making educational tools is one of the most hopeful things to come out of AI.

It’s just a bummer that it’s far more frequently used to pump wealth to tech investors from the entire class of people that have been creating things on the internet for the past couple of decades, and that projects like this fuel the “why do you oppose fighting cancer” sort of counter arguments against that."

Let's take that bit by bit then if you find it hard to correlate.

> This is awesome.

Agreed, it is a very neat demonstration of what you can do with domain knowledge married to powerful technology.

> Great use of AI to realize an idea.

This idea, while a good one, is not at all novel and does not require vibe coding or LLMs in any way, but it does rely on a lot of progress in image classification in the last decade or so if you want to take it to the next level. Just training people on a limited set of images is not going to do much of anything other than to inject noise into the system.

> Subject matter experts making educational tools is one of the most hopeful things to come out of AI.

Well.. yes and no. It is a hopeful thing but it doesn't really help when releasing it bypasses the whole review system that we have in place for classifying medical devices. And make no mistake: this is a medical diagnostic device and it will be used by people as such even if it wasn't intended as such. There is a fair chance that the program - vibe coded, remember? - has not been reviewed and tested to the degree that a medical device normally would be and that there has been no extensive testing in the field to determine what the effect on patient outcomes of such an education program is. This is a difficult and tricky topic which ultimately boils down to a long - and possibly expensive - path on the road to being able to release such a thing responsibly.

> It’s just a bummer that it’s far more frequently used to pump wealth to tech investors from the entire class of people that have been creating things on the internet for the past couple of decades

As I wrote, I'm familiar with quite a few startups in this domain. Education and image classification + medical domain knowledge is - and was - investable and has been for a long time. But it is not a simple proposition.

> and that projects like this fuel the “why do you oppose fighting cancer” sort of counter arguments against that.

Hardly anybody that I'm aware of - besides the Trump administration - currently opposes fighting cancer, there are veritable armies of scientists in academia and outside of it doing just that. This particular kind of cancer is low hanging fruit because (1) it is externally visible and (2) there is a fair amount of training data available already. But even with those advantages the hard problems, statistics, and ultimately the net balance in patient outcomes if you start using the tool at scale are where the harsh reality sets in: solving this problem for the 80% of easy to classify cases is easy by definition. The remaining 20% are hard, even for experts, more so for a piece of software or a person trained by a piece of software. Even a percentage point or two shift in the confusion matrix can turn a potentially useful tool into a useless one or vice versa.

That's the problem that people are trying to solve, not the image classification basics and/or patient education, no matter how useful these are when used in conjunction with proper medical processes.

But props to the author for building it and releasing it, I'm pretty curious about what the long term effect of this is, I will definitely be following the effort.

Better like that?

DrewADesign · 5h ago
> 684 words

I believe this is a simple educational quiz using a pre-selected set of images from cited medical publications to help people distinguish between certainly benign and potentially cancerous skin anomalies… Is that incorrect?

jacquesm · 4h ago
Yes, that's correct.

But that won't stop people from believing they are now able to self diagnose.

DrewADesign · 4h ago
Is that also a problem with pamphlets that juxtapose these same exact sort of images?
jacquesm · 4h ago
> Is that also a problem with pamphlets that have these same exact sort of images?

Such pamphlets typically contain a lot more guidance on what the context is within which they are provided. They don't come across as a 'quiz' even if they use the same images, and they do not try to give the impression that expertise has been gained. They tend to be created by communications experts who realize full well what the result of getting it wrong can be. Compared to 'research on the internet' there is a lot of guidance in place to ensure that the results will be a net positive.

https://www.kanker.nl/sites/default/files/library_files/563/...

Is a nice example of such a pamphlet. You were complaining about the number of words I used. Check the number of words there compared to the number of words in the linked website.

There is no score, there is no 'swiping' and there is tons of context and raising of awareness, none of which is done by this app. I'm not saying such an app isn't useful, but I am saying that such an app without a lot of context is potentially not useful and may even be a negative.

DrewADesign · 4h ago
Alrighty. I think you’re reading far far far too much into the implications of a slightly interactive version of a poster that was in my high school nurse’s office. I’m all set here. Have a good one.
jacquesm · 3h ago
That 'slightly interactive' bit and the fact that it is now in the home rather than in your high school nurse's office is what makes all the difference here.
pojzon · 5h ago
I hope at some point AI will replace most diagnostics and doctors that are not up to date.

I also hope it will completely kill the US pharma conglomerates.

AI was trained on public domain knowledge. All things we get from it should be free and available everywhere.

I can only hope.

jacquesm · 4h ago
> I hope at some point AI will replace most diagnostics and doctors that are not up to date.

That's a valid hope, but not a very realistic one just yet. The error rates are just too high. Medicine is messy and complex. Yes, doctors get it wrong every now and then. But AI gets it wrong far more frequently, still. It can be used as a tool in the arsenal of the medical professional, but we are very far away from self-service diagnosis for complex stuff.

> I also hope it will completely kill US pharmacy conglomerate.

That is mostly based on molecules and patents, not so much on diagnostics, that's a different group of companies.

> AI was trained on public domain knowledge. All things we get from it should be free and available everywhere.

Not necessarily, but for the cases where it is I agree that the models should be free and open.

> I can only hope.

Yes. I've seen some very noble efforts run aground on a lack of capital, and every time that happens I realize that not everything is as simple as I would like it to be. I've just financed a - small - factory for something that I consider both useful and urgent, but my means are limited and it was clear that I had no profit motive (which actually means my money went a lot further than if I had had a profit motive).

Once you get into medical education or diagnostics the amounts usually run into the millions if you want to really move the needle. No single individual is going to put that out there on their own dime unless they were very wealthy to begin with. I've invested in a couple of companies like that. They all failed, predictably, because raising follow on investments for such stuff is very hard, even if you can get it to work in principle.

The best example of stuff like that that did work is how the artificial pancreas movement is pushed forward hard by people hacking sensors and insulin pumps. They have forced industry to wake up and smell the coffee: if they weren't going to be the ones to offer it then someone else inevitably would. Even so it is a hard problem to solve properly. But it is getting there:

https://rorycellanjones.substack.com/p/wearenotwaiting-the-p...

sungam · 9h ago
Thanks for your comment - I'm pleased that people have found it useful and definitely only possible because of AI coding. I agree that this is likely to be applicable to non-experts in many different areas.
DrewADesign · 9h ago
Absolutely. I hope you’ll encourage your colleagues to follow suit!
jonahx · 9h ago
Cool project, and helpful for learning.

One concern:

I don't believe the rates at which you see "concerning" vs "not-concerning" in the app match the population rates. That is, a random "mole-like spot or thingy" on a random person will have a much lower base rate of being cancerous than the app would suggest.

Of course, this is necessary to make the learning efficient. But unless you pair it with base rate education it will create a bias for over-concern.

sungam · 9h ago
Yes you are right - the representation is biased due to the image dataset that I have used.

I don't think it would be useful to match the population distribution, since the fraction of skin cancers would be tiny (less than 1:1000 of the images) and users would not learn what a skin cancer looks like. However, in the next version I will make it closer to 50:50 and highlight the difference from the population distribution.

jonahx · 8h ago
Yes. As I said matching the population base rate wouldn't be practical, so you'd need to educate on that separately from the identification learning.

Let's say I achieve a 95% on the app though. Most people would have a massively over-inflated sense of their correctness in the wild. If the actual fraction is only 1/1000 and I see a friend with a lesion I identify as concerning, then my actual success rate would be:

    1*0.95 / (0.05*999 + 1*0.95)
So ~1.9%, not 95%. Few people understand Bayesian updating.
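
The same calculation written out as Bayes' rule, treating the 95% quiz accuracy as both sensitivity and specificity; the function and numbers are purely illustrative.

    def positive_predictive_value(sensitivity, specificity, prevalence):
        """P(cancer | flagged as concerning) via Bayes' rule."""
        true_pos = sensitivity * prevalence
        false_pos = (1 - specificity) * (1 - prevalence)
        return true_pos / (true_pos + false_pos)

    # 95% quiz accuracy treated as both sensitivity and specificity,
    # with a 1-in-1000 base rate, as in the comment above:
    print(positive_predictive_value(0.95, 0.95, 1 / 1000))  # ~0.019, i.e. about 1.9%
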
sungam · 7h ago
Thanks for this - I need to look at this more carefully
reilly3000 · 1h ago
There may be an interesting opportunity to gather data on the accuracy of guesses per image. You could use something like Google analytics, but simple server-side logging is more private and keeps the page light.

The question could be: which images are most often mistaken? What characteristics do they share? Knowing the images with the most false negatives would be really valuable for helping people know what not to ignore.
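
A minimal sketch of what such simple server-side logging could look like with the Flask setup already mentioned; the /log route and field names are hypothetical, not part of molecheck.info.

    # Hypothetical privacy-light guess logging: the page would POST one JSON record per guess.
    import csv, datetime
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/log", methods=["POST"])
    def log_guess():
        row = request.get_json(force=True)
        with open("guesses.csv", "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.datetime.utcnow().isoformat(),
                row.get("image_id"),
                row.get("guess"),        # e.g. "concerned" / "not concerned"
                row.get("correct"),      # True / False
            ])
        return ("", 204)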

rfrey · 11h ago
Perfect use of AI assisted coding - a domain expert creating a focused, relatively straightforward (from a programming perspective) app.

@sungam, if your research agenda includes creating AI models for skin cancer, feel free to reach out (email in profile), I make a tool intended to help pure clinical researchers incorporate AI into their research programmes.

sungam · 10h ago
Thanks, I am not currently doing research in this area - my lab-based research is mainly focused on the role of fibroblasts in skin cancer development
andreasgl · 11h ago
I like the project! Congrats on the launch.

As I understand it, size is one of the key indicators of melanoma. But in some of these images, it’s difficult to tell whether the mole is 1 mm or 10 mm. I assume your image set doesn’t include size information. If you can find sources with rulers or some kind of scale, that would be very helpful.

sungam · 9h ago
I will have a look at this and include the size if it is possible
danlamanna · 7h ago
Many of the images do include a size, see https://api.isic-archive.com/images/?query=clin_size_long_di....

FWIW @sungam - I'm one of the maintainers of the ISIC Archive, so feel free to let me know if finding/downloading data could be made easier. It's always interesting to see people using our data in the wild :)

sungam · 7h ago
Thanks for this - and thanks for maintaining this incredibly useful resource. What would be the best way to contact you?
danlamanna · 2h ago
firstname.lastname at kitware dot com.
omer9 · 10h ago
Every image with a pen marking is dangerous/cancer. Check.
sungam · 10h ago
Haha not all of them - but actually this is an important observation because when training convnets for skin cancer diagnosis the presence of the pen marking can be an important confounding factor that needs to be accounted for
lazarus01 · 10h ago
What you created is a version of “am I hot or not” for skin cancer. The idea is constrained to the limitations of your programming capability. Showing a photo and creating 3 buttons with a static response is not very helpful. These are the limits of vibe coding.

I was thinking to train a convnet to accurately classify pictures of moles as normal vs abnormal. The user can take a photo and upload it to a diagnostic website and get a diagnosis.

It doesn't seem like an overly complex model to develop, and there is plenty of data in the form of photos labeled as normal vs abnormal moles.

I wonder why a product hasn’t been developed, where we are using image detection on our phones to actively screen for skin cancer. Seems like a no brainer.

My thinking is there are not enough deaths to motivate the work. Dying from melanoma is nasty.

sungam · 10h ago
The goal of my app is to educate patients so that they recognise that they need to take further action.

Regarding AI-assisted skin cancer diagnosis: This is a huge area that started with the publication of Esteva et al (https://www.nature.com/articles/nature21056) and there have been hundreds of publications since. There are large publicly available datasets that anyone can work with (https://challenge.isic-archive.com/).

My lab has previously trained / evaluated convnets for diagnosis of skin cancer e.g. see this publication: https://pubmed.ncbi.nlm.nih.gov/32931808/

I have no doubt that it will be possible to train an AI model to perform at the same level as a dermatologist and AI models will become increasingly relevant. The main challenge at the moment is navigating uncertainty / liability, since a very small proportion of moles / skin lesions that appear entirely harmless both to the naked eye and with the dermatoscope (skin microscope) are cancerous.

lazarus01 · 10h ago
Thanks for including those information resources. This is something I’m interested in digging deeper into.
g-mork · 10h ago
You're talking down to a technically unskilled dermatologist for successfully producing a useful app without the help of an engineer? Curious behaviour! This is far from the first story like this; in combination they're a potent bellwether for the future of our little corner of the universe, and engaging in denial really doesn't help anyone.
lazarus01 · 10h ago
I wouldn't call it "successful" or "useful". It was a low-effort attempt to make something interesting, and it wasn't. It's a response to the hype of vibe coding. It lowers the bar for what good software really is.

Perhaps you may want to question your bias and ability to process criticism.

Anyone who shares their ideas publicly will receive criticism. Not only is it ok, it’s helpful to expand the discussion beyond your bias.

petralithic · 9h ago
> It was a low effort attempt to make something interesting and it wasn’t.

Maybe to you, but others in this thread found it interesting.

> Lowers the bar for what good software really is.

Software is a means to some end, not the end in itself. I can make the best coded software that does nothing [0], there is no point to that other than to practice one's skills, but again, those skills are to achieve something in the end.

[0] https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...

jgilias · 6h ago
The issue is that your criticism is misguided and not very helpful. In your parent comment you totally miss the forest for the trees. Or, the reason why this app has been made in the first place.

Further, your suggestions are not actionable and, again, miss the point. It's low effort - "Lol, why don't you just…". No, the point is not to find skin cancer. The point is to show a bunch of pictures to people who are interested, and let them see if they can identify worrying skin lesions.

Teknomadix · 9h ago
This vibe coded app totally is helpful.

Improved my score from an abysmal 40% to above 95% accuracy in under 15 units. Also realized that I have a skin lesion that warrants an immediate dermatologist visit.

Your characterizations are unnecessarily salty.

nlawalker · 10h ago
I disagree, I found this very helpful. In a very short amount of time I was granted the insight, in a very clear way, that I am not very good at determining whether moles need treatment based on how they look.
raincole · 10h ago
I really don't think you can publish the app you described in any developed country without an army of lawyers. And this army had better be prepared to lose many battles.
thimkerbell · 10h ago
Could it be built from an island off Costa Rica?
cjbgkagh · 9h ago
"Am I hot or not" is a great paradigm for many things - is it porn or not, etc. Three buttons are perfectly sufficient for getting this information from users in rating systems in general. This is not a rating system, though, as the samples are labeled from actual test results.

AFAIK Netflix got rid of their 5-star rating as the extra signal over 2 stars wasn't worth the mental overhead of users having to decide between a 4 and a 5. Also, star ratings are culturally dependent, so you have to normalize for that effect. In general it's a total hassle.

hombre_fatal · 10h ago
Every dermatologist (and developer with a dermatologist relative) in the world has had that app idea since most of your daily checkups are moles that you categorize in seconds.

The app already exists btw. Did nobody in this thread google it before saying it couldn't work?

hedgehog · 10h ago
Developing a model like that, and evaluating it with practicing doctors, is a good learning project.
i000 · 10h ago
What an utterly disappointing comment. FWIW I spent 15min on the app, and found it very helpful to see examples of the various kinds of skin lesion - it will likely motivate me to see a doctor when I see a similar malignant skin lesion. Educating people is very helpful.
rogerrogerr · 10h ago
We need liability reform - any app in the US would either tell you ~everything is skin cancer, or it would show one false negative and get sued into oblivion.
s3v · 7h ago
I found this app to be very helpful and educational. You, on the other hand, are being a jerk.
jampekka · 12h ago
To my eye most of the basal cell carcinomas looked like everyday rashes, pimples or scratches. My correct rate was below chance. This could be hypochondria-inducing for many?
sungam · 12h ago
Basal cell carcinomas can look very similar to other harmless skin lesions. The key thing is that they will not resolve with time and will slowly grow whereas a rash, pimple or scratch will resolve over a few months.

Fortunately basal cell carcinomas are very slow growing and do not spread elsewhere in the body or cause other health issues, and a delay of a few months in diagnosis does not have a big impact on outcome.

owenversteeg · 9h ago
(spoilers!) here's how to win: everything is cancer, except the common moles and the keratoses.

OP, what are some of the other common options for a spot on the body aside from common moles, cancer, and keratoses? Solar lentigines, freckles, bug bites, eczema? I'm also curious what the actual chance of cancer is given a random mole anywhere on the body, obviously a more involved question.

sungam · 8h ago
Good observations! But hopefully you learned something in coming to those conclusions...

The chance of a random skin lesion being skin cancer is extremely low. Apart from the appearance, key things to look for are a lesion that is not going away, particularly if it is changing in appearance.

Here are some other common skin lesions:
- Dermatofibroma (harmless skin growth)
- Actinic keratosis (sun damage)
- Milium
- Comedone
- Acne pustule / nodule
- Viral wart
- Molluscum contagiosum (harmless viral growth)
- Cherry angioma (harmless blood vessel growth)
- Spider naevus (another type of blood vessel growth)

There are more than 2000 diagnoses in dermatology so not an exhaustive list!

haspok · 9h ago
A few years ago there used to be an ML-based app for Android that could classify photos of lesions you took with your phone and recommend a visit to the dermatologist (or not). Unfortunately it seems to have been removed now; the webpage is still live (somewhat): https://emdee.ai/

It was done by a small team in Hungary, with the support of MDs of course. (I would guess that the majority of the work was coordinating with MDs, getting them to teach the software... and collecting photos of lesions. Must have been fun!)

They probably could not monetize it (or were not interested, or it was just too much work for a side hustle)... the sad reality of living in Eastern Europe.

I do think that the idea is perfect, it is non-invasive, but could warn you of a potentially very dangerous condition in time. You don't have to wait for the doctor, or unnecessarily visit them. I would actually pay for this as a service.

sungam · 8h ago
Making an app like this is (relatively) straightforward. The challenge is managing liability / risk / regulation. For individual doctors we accept that some errors will occur and there is a well defined insurance / liability framework. We do not yet have this for AI but I think it will come eventually.
epolanski · 8h ago
It's Hungary, a country in the EU; I see no reason why they would not be able to monetize it if they wanted to.

Bar the lack of a vibrant VC scene, they have the very same monetization options one in SF would have.

The most probable reason they did not was to avoid assuming legal responsibility for the results.

lelele · 9h ago
There is another one such app: https://www.skinvision.com/
rcruzeiro · 11h ago
I’ve learned that basal cell carcinoma can look scarily unremarkable!

Would be useful to add some explanation on the defining features that would give it away to a dermatologist.

sungam · 9h ago
Yes I need to add this along with a tutorial on skin cancer diagnosis. Honestly wasn't expecting anyone to use the app so just did the basics!
agnishom · 13h ago
This is a good use of vibecoding. The main "algorithm" to be implemented is very straightforward, and for the hard stuff, we have an expert.
sungam · 12h ago
Yes I think so - it's a very simple application but I would never have had the time to do it myself.

If anyone is interested: Coded using Gemini Pro 2.5 (free version) in about 2-3 hours. Single file including all html/js/css, Vanilla JS, no backend, scores persisted with localStorage.

derbOac · 3h ago
Cool but it seems like it would get more difficult with more non-cancerous but medically concerning lesions (eg due to infectious disease).
sungam · 2h ago
This is true - there are more than 2000 different conditions in dermatology but the most important ones to recognise are skin cancers
JeremyJaydan · 2h ago
I absolutely love the reality check on doctors coding and skin cancer, great work!
thimkerbell · 2h ago
Why would a dermatologist want to just remove but not biopsy a suspected facial skin cancer?
sungam · 2h ago
There would be no reason to do this - skin lesions removed are almost always sent for pathological analysis even if expected to be harmless
y-curious · 12h ago
Half of these basal cell carcinomas look like picked pimples. Are there any sort of protocols for self-screening for carcinomas, a la self-testing one's testicles? I've never heard of anything other than the ABCDE for moles.
sungam · 12h ago
Look for any new skin lesion that is not resolving with time, especially if it persists for a number of months. You can take photos of different body sites, repeat every couple of months, and then put the two photos side by side on a computer screen to look for any difference. If unsure about the lesions that are present, it is worth getting a full skin check with a dermatologist as a baseline, so that you then just need to look for new/changing lesions.

Photos of basal cell carcinoma (no affiliation): https://dermnetnz.org/topics/basal-cell-carcinoma

lukko · 12h ago
Classically, BCCs have a pearly surface and 'rolled' edges, which differentiates them from pimples.
cjbgkagh · 10h ago
This is great, I had no idea how off base I was with my assumptions. It’ll be interesting to keep the usage data to find out what kinds of images people have the most trouble with. As in what kind of mole is the most likely to be missed. Though perhaps dermatologist already know that answer well enough.

I would love to see more such classifiers for other medical conditions; googling for images tends not to produce a representative sample.

sungam · 9h ago
Thanks, I'm really pleased that people have found it useful! Wasn't expecting much from the app - just coded it in an evening, as it's something I've been thinking about for years.
johannes_ne · 5h ago
I made a quite similar app 7 years ago. https://melanoma.jenevoldsen.com/

May have been in the training data.

sungam · 5h ago
That's great! I used the publicly available images from the ISIC challenge dataset which are CC licensed.
incone123 · 6h ago
The link to "how to recognise..." is broken.

Nice app. But wouldn't a doctor normally get a history as well? Anyway, I'm not a doctor which is probably why I got most of the answers wrong :)

sungam · 6h ago
Yes, a doctor would get a history, which helps. However, I get a high score on these images and another skin cancer doctor who commented also got a high score, so in most cases the diagnosis can be made accurately from the image alone.
toledocavani · 12h ago
Is there any reputable (reviewed, endorsed) AI model to detect skin cancer? I have a lot of similar moles, and playing with this app makes me concerned about all of them.
sungam · 7h ago
Lots of models out there, but I would not yet trust any for diagnosis without review by a dermatologist. The challenge is unanticipated edge cases and managing risk/liability/regulation. I have no doubt that if a major AI company focused on this problem then these issues could be overcome with current technology, but perhaps the market is not big enough to justify the investment required.
scotty79 · 12h ago
I heard that a good rule of thumb is to be concerned about unique ones. It is much less probable that you develop the exact same-looking cancer in two unrelated spots.
sungam · 6h ago
Yes we call this the "Ugly Duckling" sign
redox99 · 11h ago
Are there an equal amount of cancer and non cancer images? In my case the vast majority (I'd say around 75%) are cancerous.
sungam · 9h ago
You are right - the distribution is not equal, largely because the dataset that I used had fewer pictures of harmless moles, but I will aim to make it 50:50 in the next version.
m-hodges · 9h ago
Good example of shovelware that some say is absent.¹

¹ https://mikelovesrobots.substack.com/p/wheres-the-shovelware...

sungam · 9h ago
Had not heard of that term before, but looking at the article I would agree!
ziptron · 11h ago
Thank you for making this.

My dad passed away from squamous cell carcinoma in 2010. In retrospect, through my casual research into the space and tools like this one, it occurs to me that the entire event was likely preventable and occurred merely because we did not react quickly enough to the cancer’s presence.

sungam · 9h ago
Thanks for your comment - my practice is focused on skin cancer and I see so many patients that bring a photo from many months earlier that shows an obvious skin cancer that could have been treated more easily at an earlier stage. Patient education should enable these to be picked up sooner.
saulpw · 9h ago
The hamburger menu "About" and "How To Recognize Skin Cancer" both go to a 404 page that's a copy of a company website called "Revessa Health". Is this your company?
sungam · 8h ago
Thanks for highlighting - I wasn't expecting anyone to use the app so haven't added these yet but will do so asap

The app is hosted on my Digital Ocean server that hosts a few other projects including my Revessa Health site

krunger · 6h ago
No reason why it couldn't have been done in reverse: have a programmer code it while using AI to understand skin cancer.
sungam · 6h ago
There are lots of apps that do this. It's (relatively) easy to get AI to perform at the same level as a dermatologist, but liability/risk management/regulation is much harder to solve.
ajkjk · 6h ago
Well.. there is somewhat more on the line if it's wrong that way.
leetrout · 13h ago
Why do the images get a weird offset slice effect on safari on mobile after submitting a guess with the buttons?
sungam · 12h ago
No idea, I will look into this
dhruvbird · 10h ago
This is awesome! After about 50 attempts, I have a much better sense of what to look out for when I see something. I wish there were more such focused apps for specific health-related things.
sungam · 10h ago
Thanks, I am pleased you found it useful. This is exactly how a dermatologist learns to recognise skin cancer - by making decisions and then getting feedback. I think anyone can improve dramatically with an hour or so of practice and then this skill is useful lifelong.
dhruvbird · 9h ago
I wonder if most people get the same things wrong. I checked your other comments and noticed that there is no server side component. In case you add one, I would be really interested in knowing which ones are most confused, and in which direction.
sungam · 6h ago
Yes I could add this quite easily - will do in the next version
Uptrenda · 1h ago
The biggest irony of the thread is the OP and the commenters celebrating the tech putting them out of a job while contributing to it. E.g. a dermatologist who looks at skin conditions -- a very visual skill. They use their skills to build an app that people can use to check for skin cancer, rashes, whatever. Now people have less incentive to see a dermatologist and might miss the zebras (and in fact, people are lazy and tend to hate doctors already, so they won't). Then there are the software engineers here who (even if you're a high-level senior engineer) are moved further down the chopping block the better AI gets.

YAY, three cheers for all the soy boys building AI. See you on unemployment soon.

minton · 11h ago
The zoomed in view is great if you’re commonly examining under magnification, but perhaps a slightly less zoomed view (or ability to switch between each) might make this more practical for common folks.
sungam · 9h ago
Thanks, I will try to source alternative images for the next version
orliesaurus · 10h ago
What did you use to build this? Where did you deploy?
sungam · 10h ago
Coded using Gemini Pro 2.5 (free version) in about 2-3 hours.

Single file including all html/js/css, Vanilla JS, no backend, scores persisted with localStorage.

Deployed using ubuntu/apache2/python/flask on a £5 Digital Ocean server (but could have been hosted on a static hosting provider as it's just a single page with no backend).

Images / metadata stored in an AWS S3 bucket.

SilentM68 · 10h ago
Interesting. I wish there existed an app for actually finding a cure for every killer disease.
sungam · 9h ago
Haha, I think it will need more than just an app! But in all seriousness, the potential for applying modern AI techniques to DNA and protein sequence analysis, structural analysis and regulatory network modeling is immense and we have only scratched the surface, so I am sure it will accelerate biological discovery.
geoffbp · 7h ago
“How to recognise skin cancer” link from the menu goes to 404
sungam · 7h ago
Sorry - wasn't expecting anyone to use the app or any traction on HN! Will update
aegypti · 13h ago
Basal Cell Carcinoma is very gross!

I think a set number of questions to start with would be good. Not sure if there's an end point; I drifted off after ~20 or so.

sungam · 12h ago
Good idea, will implement this in a future version.
Kibranoz · 3h ago
Doing this with real dev could have taken similar time.
sungam · 2h ago
For an experienced front end dev yes - for the average dermatologist no chance!
bobmcnamara · 10h ago
I'm around 75%

Idea: distribution of player scores

I'm going to get some models checked out.

sungam · 10h ago
Great idea! Will aim to add to the next version
bix6 · 10h ago
The two links in your menu don’t work but otherwise this is awesome!
sungam · 9h ago
Thanks - I wasn't expecting anyone to use the app apart from perhaps a few patients and definitely didn't expect #2 spot on HN

Will add asap but currently focused on answering questions!

Fortunately my £5 Digital Ocean server is coping fine so far...

pama · 11h ago
Thanks for the reminder to schedule the annual dermatology appointment.
nasir · 12h ago
Learned quite a bit and seems like a basic but necessary thing to know about!
sungam · 12h ago
Thanks, I'm glad you found it useful. My patients were constantly asking for a way to learn what skin cancer looked like beyond the ABCDE rule and I wanted to try and introduce a gamification aspect to it.
nextworddev · 10h ago
Can people sue you for malpractice if something goes wrong?
sungam · 8h ago
No - the app is just asking users to give their own opinion on publicly available images so there is no duty of care
childintime · 11h ago
Would a tool that can take a truly tiny sample out of the lesion be a valuable complement? So we could send it in (with the tool) and get a lab test done?
sungam · 8h ago
Unfortunately not that useful - we could do that now by taking a very small (e.g. 2mm) punch biopsy, and if it shows melanoma obviously that is helpful and the rest of the lesion needs to be removed. The issue is that a negative result doesn't exclude melanoma elsewhere in the lesion.

I have been working with a startup to try and develop a non-invasive molecular test for melanoma so hopefully this will be possible in the future.

ada1981 · 3h ago
How good is ChatGPT or Claude at classifying these? Have you tried?
sungam · 2h ago
Not tried but my guess is ChatGPT will be quite accurate but get a small proportion wrong. The challenge with skin cancer is that we cannot afford to miss even 1:10,000 cases
MistaGobo · 9h ago
WARNING: Not to be viewed while eating!
sungam · 9h ago
Perhaps should add a NSFE tag...
NoiseBert69 · 12h ago
What happens if I make a picture of my cat with it?
sungam · 12h ago
Not sure how you would do this but feel free to try!
mustaphah · 8h ago
cool, very nice. The real test starts when the first dependency gets deprecated.
sungam · 8h ago
No dependencies - single page app with no backend including all html/css/js
retinaros · 8h ago
Ok, everything is cancer. Thanks for nothing, now I won't sleep.

roggenbuck · 5h ago
This is great!
k2xl · 13h ago
Wow this game just proves to me how difficult your job is. I am basically getting 50%.

One or two seemed quite obvious to me as concerning or not but turned out to be the other way

sungam · 12h ago
It can be challenging but the large majority of skin cancers are fairly obvious and the main reason people don't spot them is because they are not checking their skin regularly and don't have any idea what to look for. Hopefully this app will help patients to learn the basic things to look for.
kittikitti · 7h ago
This is actually a really great vibe coded app. It's simple and doesn't require much logic. Will vibe coding catch on to more sophisticated and complex use cases? That's only if the whispers about an upcoming AI Winter are false.
sungam · 6h ago
Thanks - I'm glad you found it useful! I have been meaning to make it for many years but could never justify the time involved until AI made it possible.
lvl155 · 10h ago
Dude, everything is “I am concerned.”
sungam · 9h ago
The representation is skewed towards skin cancers at this time due to the image dataset I have used. In a future version I will make it 50:50
lvl155 · 7h ago
It would be cool if you could add a wider-angle vs close-up view. It's really easy to miss spots on your back, for example.
sungam · 6h ago
Thanks for the suggestion I will do this if I can source the images
lvl155 · 6h ago
You can probably synthesize using AI.
yieldcrv · 8h ago
It’s great that more people can express themselves

For personal fulfillment, humanity's evolutionary fitness, and for commercial purposes

hopelite · 10h ago
Today I learned most things are cancer
asdev · 9h ago
is this legal?
sungam · 9h ago
Why wouldn't it be? The images are publicly available and have a Creative Commons license.
quantummagic · 12h ago
Nice Job. This really highlights that people who obsess in telling us that "AI hallucinates", and "AI isn't intelligent", are missing the point. At the end of the day, it's simply useful, and incredibly empowering.
sungam · 12h ago
Yes, without AI this app definitely would not exist as I would not have had time to make it. I think that this will apply to multiple other areas within the economy.
sky2224 · 5h ago
Another thing to add: developers don't have the time or don't want to spend time on apps like this, but it's not like this is anything complicated.

This app ultimately amounts to something that has been done millions of times, and so I think it's actually quite empowering for individuals to be able to quickly build mockups of apps like this for themselves without needing to spend upwards of $75/hr to hire some freelance dev to do it for them.

FpUser · 9h ago
I like the thing but ..

"Vibe coded" - asked ChatGPT or whatever alternative to do the thing for me. There is no fucking vibe here, just another cheesy term.