For anyone on Android, this is possible with the Simulate Color Space option in the hidden Developer options menu. Amusingly, it works with the camera too, so you can look around the real world with some sense of what it's like.
politelemon · 31d ago
That sounds more useful, cheers. I wonder why they haven't made it a non-hidden option.
OJFord · 31d ago
It's annoying enough (especially as a child at school, but not at all exclusively) to be colourblind and put up with 'so what colour is this' without people waving their phone around exclaiming about how it says you see things, thanks.
cwillu · 30d ago
Sure, but at the same time, it's a useful tool to show people how unusable their UI is, using something they already have.
The problem with “what colour is this? what colour is that?” is not the question, it's that the question comes up with an expectation of an answer, regardless of the context. If I'm _never_ willing to answer the annoying question, that makes me the asshole regardless of how poor my colour vision is.
OJFord · 30d ago
Absolutely, but developers with that legitimate use for it can use the developer tools.
oniony · 29d ago
I play a lot of board games and it would be so useful for game/graphic designers, I would imagine.
bluechair · 30d ago
I’ll highlight a note from the developer:
Sim Daltonism lets you see through the eyes of someone with color blindness. While the colors shown are a good approximation of what a color-blind person would see, you should not expect them to be perfect.
Everyone has their own perception of colors, which differs slightly from other people's, and color blindness is often partial, to varying degrees. More importantly, cameras do not have the same spectral response as the cones in your eyes, so the simulation has to make some assumptions about the frequency composition of the colors.
I’m colorblind and haven’t found a simulator that comes close to what it’s like for me. This app doesn’t do it either.
adamgordonbell · 30d ago
I'm also color blind (red-green), but I'm not sure how you expected to be able to judge it.
You can't see how the app affects colors in the absence of your own color blindness to compare against.
egypturnash · 30d ago
What would “close to what it’s like” entail exactly?
Would it mean that when you look at a simulation of the effects of your colorblindness, you see zero change from the unaltered view?
Or would it mean that it looks absolutely nothing like what you see because it’s transforming the base image by clamping the input colors to what you can see, and stretching that decimated color space out over the entire range of normal sensitivity?
Sometimes I suspect that the range of color qualia the human mind experiences is the same regardless of what actual color receptors one has; the sensation we call “red” is assigned to the lowest end of the input scale, regardless of whether or not the lowest end is at the normal wavelength, and that every filter that just removes color and provides a duller image is doing completely the wrong thing. But it’s a much simpler transformation to implement.
(I think the key to checking this would involve violently clashing colors. Or a way to make someone start growing new cone cells in their eyes.)
Also if you have had entirely too many conversations with the normies about “what does it look like for you” then please just ignore this, my SO is partially colorblind and gets that a lot!
swiftcoder · 30d ago
> Would it mean that when you look at a simulation of the effects of your colorblindness, you see zero change from the unaltered view?
Ideally, yes. Although it's unlikely to match any one person's exact colour vision.
If you look at filtered images side by side, say from this collection on Bored Panda [1], to me the deutan images and the normal image are pretty much indistinguishable, while the protan image is close but slightly too green.
> Or would it mean that it looks absolutely nothing like what you see because it’s transforming the base image by clamping the input colors to what you can see, and stretching that decimated color space out over the entire range of normal sensitivity?
That's how most "colour blind filters" look in practice, yes. I don't think a lot of folks are setting up the transform correctly (or they are just straight-up using a colourblindness preview filter as if it were a colourblindness correction filter).
[1]: https://www.boredpanda.com/different-types-color-blindness-p...
https://www.theverge.com/23650428/colorblindness-design-ui-a...
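For reference, the transform these filters are supposed to implement looks roughly like the sketch below (NumPy; the matrix is the crude deuteranopia approximation that circulates online, not whatever Sim Daltonism actually uses, and serious simulators such as Brettel/Viénot/Machado project through an LMS cone space instead):

  import numpy as np

  # Crude, commonly circulated deuteranopia matrix -- illustrative only.
  DEUTERANOPIA = np.array([
      [0.625, 0.375, 0.000],
      [0.700, 0.300, 0.000],
      [0.000, 0.300, 0.700],
  ])

  def srgb_to_linear(c):
      # The matrix should be applied in linear light, not to gamma-encoded values.
      return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

  def linear_to_srgb(c):
      return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

  def simulate_deuteranopia(rgb):
      """rgb: floats in [0, 1], shape (..., 3); returns the simulated colours."""
      lin = srgb_to_linear(np.asarray(rgb, dtype=float))
      return np.clip(linear_to_srgb(lin @ DEUTERANOPIA.T), 0.0, 1.0)

  print(simulate_deuteranopia([1.0, 0.0, 0.0]))  # pure red comes out yellowish

The two easy ways to get it wrong are applying the matrix to gamma-encoded values and, as noted above, shipping a preview transform like this as if it were a correction.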
Part of it may be the display technology, rather than what the software thinks should look right.
Those RGB pixels are chosen and tuned to trick a certain baseline homo sapiens setup of chemical sensors and neurological weighting of sensor inputs. Light from natural sources is dramatically more varied.
rollcat · 30d ago
Tangential: I've been playing around with the color blindness filters on my iPhone, and the grayscale filter had me thinking for a moment. I've set it to 50%, set up the accessibility shortcut (triple-click on the home button) to toggle it, and found myself using my phone with the filter on basically 99% of the time.
It's been a couple of months, and I've noticed that the oversaturated colors were making me slightly agitated, somehow captivating my attention. I sometimes disable the filter to look at a particular picture, or to figure out a detail in some context where the colors are already desaturated. Now my only wish is that it was less linear, maybe like a compressor in audio - maintain detail until it starts approaching the ceiling.
It may be a good option if you'd consider something like <thelightphone.com> but don't want to switch.
Also, Rob Pike: <https://commandcenter.blogspot.com/2020/09/color-blindness-i...>
Also: <https://duckduckgo.com/?q=plan+9+from+bell+labs+rio&ia=image...>
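If anyone wants to play with the "compressor for saturation" idea before it exists as a system filter, here is a tiny per-pixel sketch (my own toy, nothing iOS actually does): leave saturation alone below a knee and divide everything above it by a ratio, audio-compressor style.

  import colorsys

  def compress_saturation(r, g, b, knee=0.5, ratio=3.0):
      """Leave saturation below `knee` untouched; compress the excess by `ratio`."""
      h, s, v = colorsys.rgb_to_hsv(r, g, b)
      if s > knee:
          s = knee + (s - knee) / ratio
      return colorsys.hsv_to_rgb(h, s, v)

  print(compress_saturation(1.0, 0.0, 0.0))  # fully saturated red -> roughly (1.0, 0.33, 0.33)

A real version would have to live in the display pipeline rather than run per image, but it's enough to find a knee/ratio you like.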
I bet there's a way to accomplish what you're looking for with ICC profiles. They allow arbitrary functions and LUTs in addition to standard matrix math. There's typically a way to set this for your OS, an individual image, and I'm pretty sure an individual app as well (but I'd imagine only your own).
Edit: Actually, this may not be editable on iOS, but it is on macOS, Windows, probably Linux, and it looks like Android too.
https://en.wikipedia.org/wiki/ICC_profile
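For previewing what a custom profile would do before installing it system-wide, Pillow's ImageCms bindings (LittleCMS underneath) can apply one to an image. A minimal sketch; "photo.jpg" and "custom_desaturate.icc" are placeholders, and authoring the profile itself still needs a separate ICC tool:

  from PIL import Image, ImageCms

  # Apply a custom (e.g. desaturating) ICC profile to a photo to preview its effect.
  src = Image.open("photo.jpg").convert("RGB")
  srgb = ImageCms.createProfile("sRGB")                      # built-in sRGB profile
  custom = ImageCms.getOpenProfile("custom_desaturate.icc")  # placeholder path

  transform = ImageCms.buildTransform(srgb, custom, "RGB", "RGB")
  ImageCms.applyTransform(src, transform).save("photo_preview.jpg")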
You can do this on Android as well (manufacturer results may vary...)
This is on Android 14, but I initially turned it on in an earlier version:
1. unlock the Developer options (you can search this; depends on OS version)
2. in Developer options, scroll down to "simulate color space", choose grayscale
3. back to main settings --> Accessibility --> advanced, there is an "accessibility button" option
4. set that to "color correction"
Now I have a small icon (it looks like a person) that I can use to toggle monochrome. Indeed, it was someone I overheard on a train, who had this turned on for his iPhone as described above and for the same reason ("lowers the dopamine response", he explained to the conductor), that intrigued me into looking into it.
Samsung also has a decent automation infrastructure ("modes and routines"), so I set that up to automatically disable the color correction / grayscale in certain apps (camera, photos, maps, etc.)
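If you'd rather script the toggle than tap the accessibility button, the same switch appears to be exposed as secure settings that adb can flip. A sketch only: the key names below are the AOSP ones as I remember them (accessibility_display_daltonizer*), and OEM builds may differ.

  import subprocess

  def set_grayscale(enabled: bool) -> None:
      """Toggle 'simulate color space' = monochromacy over adb (verify keys on your ROM)."""
      def adb(*args: str) -> None:
          subprocess.run(["adb", "shell", "settings", *args], check=True)
      adb("put", "secure", "accessibility_display_daltonizer", "0")  # 0 = monochromacy, as I recall
      adb("put", "secure", "accessibility_display_daltonizer_enabled", "1" if enabled else "0")

  set_grayscale(True)   # grayscale on
  set_grayscale(False)  # back to normal colours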
DawsonBruce · 30d ago
Huge fan of this tool for working on GUI implementations, ensuring the color choices and contrasts make sense for users that see GUIs differently than I do.
egypturnash · 30d ago
This is such a useful tool, I constantly pop it up to check contrast in my art.
AprilArcus · 30d ago
Could this be used in reverse to correct for color vision disorders, e.g. by punching down greens and blues and punching up reds into the outer range of the P3 gamut?
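As far as I know this app only simulates, but what you describe exists under the name "daltonization": simulate the deficiency, take the difference between the original and the simulation, and push that lost information into channels the viewer can still separate. A rough sketch of the idea; both matrices are illustrative placeholders, gamma handling is omitted, and stretching the result out into the P3 gamut would be an extra step on top:

  import numpy as np

  DEUTERANOPIA = np.array([[0.625, 0.375, 0.0],    # crude simulation matrix (illustrative)
                           [0.700, 0.300, 0.0],
                           [0.000, 0.300, 0.7]])
  REDISTRIBUTE = np.array([[0.0, 0.0, 0.0],        # push the lost red difference
                           [0.7, 1.0, 0.0],        # into green and blue
                           [0.7, 0.0, 1.0]])

  def daltonize(rgb):
      """Shift the information a deutan viewer would lose into channels they can still see."""
      rgb = np.asarray(rgb, dtype=float)
      error = rgb - rgb @ DEUTERANOPIA.T          # what the simulated deficiency erases
      return np.clip(rgb + error @ REDISTRIBUTE.T, 0.0, 1.0)

  print(daltonize([1.0, 0.2, 0.2]))  # a strong red picks up blue, pulling it away from the greens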
lastdong · 30d ago
Kudos to the developer for creating such an optimized app! It's only 444KB for iOS.