Why you can't color calibrate deep space photos

212 points · posted by LorenDB · 7/23/2025, 12:16:18 AM · maurycyz.com

Comments (95)

pizzathyme · 4h ago
The question that the general public always wants answered is roughly, "If I was floating in a safe glass bubble in outer space looking at this object with my eyes, what would I see?"

Does anyone know the answer to this? Would it just be black? Or just a bright white star?

hnuser123456 · 3h ago
Well, some of the densest, nearest, most interesting structures are right behind the rest of the Milky Way, which causes up to 30 magnitudes of extinction in visible light, but deeper IR and UV passes through much more easily.

I think a better way to describe the issue is that much of the structure of the cosmos is only visible in non-visible wavelengths, so while calibrated, accurate visuals "like you were in a safe glass bubble" are a good category of astrophotography to continue to refine, they cover only a tiny slice of what's emitting electromagnetic radiation that's worth visualizing. And if cameras can convert invisible colors into visible ones, that's a blessing of a capability.

jacobr1 · 2h ago
Even in the case of the Milky Way blocking the light, we could probably do the math to estimate what an observer would see if they were outside the Milky Way, interpolating from the observed IR/UV and some model of what the visible spectrum would be without the local interference. Or is that infeasible in some way?

Certainly some phenomena just don't have much visible light of interest - so shifting the spectrum is inherently necessary in those cases and is thus inherently subjective.
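
In principle the correction for dust is just de-reddening: if you have an estimate of the extinction A_lambda (in magnitudes) along the line of sight, the intrinsic flux is the observed flux scaled by 10^(0.4 * A_lambda). A toy sketch of the idea; the extinction values below are invented for illustration, not taken from a real dust map:

```python
# Toy de-reddening: recover intrinsic flux from observed flux, given an
# assumed extinction A_lambda (magnitudes) for each band.
def deredden(observed_flux, a_lambda_mag):
    """Intrinsic flux = observed flux * 10^(0.4 * A_lambda)."""
    return observed_flux * 10 ** (0.4 * a_lambda_mag)

# Hypothetical observed fluxes (arbitrary units) and assumed extinctions.
bands = {
    "B (440 nm)": (1.0, 4.1),
    "V (550 nm)": (2.0, 3.1),
    "R (640 nm)": (3.5, 2.3),
}

for band, (flux, a_mag) in bands.items():
    print(f"{band}: observed {flux:.1f} -> intrinsic {deredden(flux, a_mag):.1f}")
```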

yig · 2h ago
Can we infer what that visible light would look like unobstructed and with appropriate intensity to activate our eyes' cones?
indoordin0saur · 3h ago
You can look through a telescope and the actual light from the stars will be hitting your retinas, so you can answer this question for yourself!

For most things like nebulae and galaxies the shapes can be made out easily, but often the colors are absent or muted because our eyes' color cones don't do as well in low light as our monochromatic rods. This is similar to how, if you have ever seen the aurora, the colors are hard to perceive, but if you use a camera they pop out a lot more.

There's a big range though and it depends on the intensity of light from the object. There are certainly nebulas and other objects out there with enough light intensity that our eyes can naturally perceive the color.

The other thing to keep in mind is that people's night vision and ability to perceive color in low light are highly variable, much like how our visual acuity (near/far sightedness) varies greatly by individual and worsens somewhat with age.

cconstantine · 3h ago
For the vast majority of things in the sky, they'd see black. This stuff is incredibly dark, and we need hours of exposure time to get enough signal. Even after hours of exposure, the raw stacked frame is a black field with some pinpoints of light.

The exception to this is stuff in our own solar system.

cruffle_duffle · 2h ago
Another thing not mentioned is that all the stuff further away gets redshifted down into IR thanks to the universe expanding. So while the original output might have been in the visible part of the spectrum, its wavelengths have now been stretched out to something your eye can no longer see.

If you want to see all the cool shit 4 billion light years away, you are gonna have to get those retinal IR implants you keep asking for each Christmas installed.
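
The stretch is just lambda_observed = lambda_emitted * (1 + z). A quick back-of-the-envelope sketch; the 700 nm "red edge of vision" cutoff is a rough assumption:

```python
# Redshifted wavelength: lambda_obs = lambda_emit * (1 + z)
H_ALPHA_NM = 656.28       # rest wavelength of H-alpha
VISIBLE_EDGE_NM = 700.0   # rough red limit of human vision (assumption)

def observed(wavelength_nm, z):
    return wavelength_nm * (1 + z)

# Redshift at which H-alpha slides past the red edge of vision:
z_edge = VISIBLE_EDGE_NM / H_ALPHA_NM - 1
print(f"H-alpha leaves the visible band around z = {z_edge:.2f}")

for z in (0.1, 0.35, 1.0):
    print(f"z = {z}: H-alpha observed at {observed(H_ALPHA_NM, z):.0f} nm")
```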

klysm · 17h ago
Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB bayer filters are vastly over-utilized (mostly due to market share), and they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?
_alternator_ · 3h ago
The vast majority of consumers want their camera to take pictures of people that “look good” to the human eye; the other uses are niche.

But that said, I’m actually surprised that astrophotographers are so interested in calibrating stars to the human eye. The article shows through a number of examples (IR, hydrogen emission line) that the human eye is a very poor instrument for viewing the “true” color of stars. Most astronomical photographs use false colors (check the captions on the NASA archives) to show more than what the eye can see, to great effect.

petsfed · 2h ago
I suspect it's because when conditions are right to actually see color in deep-sky objects, it's confounding that it doesn't look the same as the pictures. Especially if seeing the colors with your own eyes feels like a transcendent experience.

I've only experienced dramatic color from deep sky objects a few times (the blue of the Orion Nebula vastly outshines all the other colors, for instance), and it's always sort of frustrating that the pictures show something so wildly different from what my own eyes see.

Dylan16807 · 1h ago
There's a good chance the real problem there is limited gamut on the screen, and with the right viewing method the RAW photo could look much much better.
indoordin0saur · 2h ago
If you get a big enough telescope it will gather enough light to where you'll see things in proper color. I've seen the Orion nebula with a 10 inch reflector in a good location and the rich pinks, blues and reds were impossible to miss. This is the actual photons emitted from that object hitting your retina so it's about as "true color" as you can get.

I think when astrophotographers are trying to render an image it makes sense that they would want the colors to match what your eyes would see looking through a good scope.

richarme · 5h ago
The technical term for this is multispectral imaging. Lots of applications across science and industry.

[0] https://en.wikipedia.org/wiki/Multispectral_imaging

nothacking_ · 16h ago
That's common in high-end astrophotography, and it's used almost exclusively at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and sharp falloff, very unlike human color vision.
rachofsunshine · 15h ago
Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?

Human S-cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity at that band),

and similarly for the M and L cone channels, which approaches the integral representing true color in the limit of narrow bands.

Are the bands too wide for this to work?
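
As a numerical sanity check, here's a rough sketch of that band-sum versus a finer integration. The Gaussian "cone sensitivities" are made-up stand-ins for the real curves, just to show how the sum converges to the integral as the bands narrow:

```python
import numpy as np

# Stand-in cone sensitivities (Gaussians, NOT the real CIE curves).
def cone(wl_nm, peak, width):
    return np.exp(-0.5 * ((wl_nm - peak) / width) ** 2)

def spectrum(wl_nm):
    # An arbitrary smooth test spectrum.
    return 1.0 + 0.5 * np.sin(wl_nm / 40.0)

def cone_responses(band_width_nm):
    # Sum over bands of (band intensity) * (sensitivity at band center).
    centers = np.arange(400, 700, band_width_nm) + band_width_nm / 2
    return {
        name: float(np.sum(spectrum(centers) * cone(centers, peak, width)) * band_width_nm)
        for name, peak, width in [("S", 445, 25), ("M", 540, 35), ("L", 565, 40)]
    }

print("50 nm bands:", cone_responses(50))
print(" 1 nm bands:", cone_responses(1))   # close to the continuous integral
```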

nothacking_ · 15h ago
> Are the bands too wide for this to work?

For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.

For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.

queuebert · 4h ago
One big reason for filters in astronomy and astrophotography is to block certain frequency ranges, such as city lights.
cconstantine · 3h ago
> why not put a whole bunch of filters in front of a mono camera and get much more frequency information?

Just rgb filters aren't really going to get you anything better than a bayer matrix for the same exposure time, and most subjects on earth are moving too much to do separate exposures for 3 filters.

The benefit of a mono camera and RGB filters is that you can take advantage of another quirk of our perception: we are more sensitive to intensity than to color. Because of this, it's possible to get a limited amount of exposure time with the RGB filters, and use a 4th "luminance" filter for the majority of the time. During processing you can combine your RGB images, convert that to HSI and replace the I channel with your luminance image. Because the L filter doesn't block much light it's faster at getting signal, but it's only really a benefit for really dark stuff where getting enough signal is an issue.
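
For anyone curious, one simple version of that LRGB combine (many variants exist; this one just rescales the RGB composite so its luminance matches the L frame, rather than doing a literal HSI conversion) looks roughly like:

```python
import numpy as np

def lrgb_combine(r, g, b, lum, eps=1e-6):
    """Rescale an RGB composite so its luminance matches the L frame.

    r, g, b, lum: aligned, stretched 2D float arrays in [0, 1].
    This is one simple scheme; dedicated tools offer fancier
    luminance/chrominance transfers.
    """
    rgb = np.stack([r, g, b], axis=-1)
    # Approximate luminance of the color composite (Rec. 709 weights).
    y = rgb @ np.array([0.2126, 0.7152, 0.0722])
    scale = lum / (y + eps)
    return np.clip(rgb * scale[..., None], 0.0, 1.0)
```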

larrik · 5h ago
Yeah, I was surprised to learn that camera technology was calibrated primarily towards making white people look normal on film. Everything else was secondary. This is why cameras often have a hard time with darker skin tones: a century of the technology ignoring them.

Then I felt surprised that I was surprised by that.

chaboud · 14h ago
I think you want a push broom setup:

https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...

Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.

But grab a cold mirror (a visible-light-cutting IR filter) and a night-vision camera for a real party on the cheap.

jofer · 17h ago
In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).
adornKey · 11h ago
And don't forget about polarization! There's more information out there than just frequency.
klysm · 3h ago
I guess that’s yet another dimension. Perhaps spin a polarizing filter in front of the camera to grab that?
rtkwe · 4h ago
We do quite a bit; multispectral imaging is a well-worn field used a lot in astronomy, scientific research, and the study of art and other historical artifacts. Some photographers use it too, it just gets harder because the scene is more likely to change slightly, blurring the image when you go to layer the different spectra, and generally photographers are trying to capture more human-adjacent representations of the scene.

https://en.wikipedia.org/wiki/Multispectral_imaging

https://colourlex.com/project/multispectral-imaging/

queuebert · 4h ago
That would trade time and frequency information for spatial information, which is what you want in astronomy, but maybe not for candid family photos.
kabouseng · 7h ago
Yes of course, but with the obvious disadvantage that you lose resolution for every filter you add. Then you say let's just increase the pixel count, which means a smaller pixel pitch. But then you lose low-light sensitivity, have to decrease your lens f/#, so more expensive lenses etc... Which is why it isn't done for commercial / mass-market sensors.
jonhohle · 6h ago
I read that as: take a bunch of pictures of a static scene, each with a different filter capturing specific frequency bands individually. Merge afterwards with whatever weights or algorithms you want.

cyb_ · 14h ago
Having dabbled a bit in astrophotography, I would suggest that color is best used to bring out the structure (and beauty) of the object. Trying to faithfully match the human eye would, unfortunately, cause a lot of that data to be harder to see/understand. This is especially true in narrowband.
malfist · 4h ago
The Hubble palette was specifically chosen to make contrast better. So you can actually see different parts of the nebula.

Otherwise it's all just slightly different shades of red and IR.

stavros · 7h ago
Plus, what's the point? It's not like anything will change if the object looks a bit more green rather than blue, it makes no difference to the wonder of the universe.
Retr0id · 17h ago
The next space mission should be to leave a colour calibration chart on the moon.
embedded_hiker · 16h ago
They brought a gnomon, with a color chart, on the Apollo missions. They would set it up for many of the pictures of samples.

https://airandspace.si.edu/collection-objects/gnomon-lunar-a...

pgreenwood · 15h ago
Here's a shot of a color chart on the moon from Apollo 17 (AS17-137-20900):

https://tothemoon.im-ldi.com/data_a70/AS17/extra/AS17-137-20...

gowld · 4h ago
Is there a document that explains that color chart?

This version shows different shades of colors: https://eol.jsc.nasa.gov/SearchPhotos/photo.pl?mission=AS17&...

rtkwe · 4h ago
The first one has been adjusted to bring out the contrast and saturation while the one you link is an original scan. They brought a number of different swatches like that and took a picture of a calibration array at the end to account for the effects of space exposure on the film.
pgreenwood · 3h ago
The one I posted is from an original scan.
pgreenwood · 3h ago
I think we are looking at different scans. See: https://tothemoon.im-ldi.com/about
jofer · 16h ago
The moon itself already is one. Moonshots are widely used in calibration, at least for earth observation satellites. The brightness of the full moon at each wavelength at each day of the year is predictable and well-known, so it makes a good target to check your payload against.
shagie · 16h ago
They also put color calibration charts on Mars rovers. For example https://www.lucideon.com/news/colour-standards-on-mars
JNRowe · 11h ago
There is even a Damien Hirst¹ haphazardly spread about the surface for that purpose.

One of the great gifts Pillinger² had was being able to shake up public interest via pop culture; there was also a call sign by Blur for Beagle 2.

¹ https://www.researchgate.net/figure/Spot-Painting-Beagle-2-C...

² https://en.wikipedia.org/wiki/Colin_Pillinger

strogonoff · 10h ago
It is not just in space that nothing is lit by a uniform light source or with uniform brightness. The same is true of many casual photos you would take on this planet.

Outside of a set of scenarios like "daylight" or "cloudy", and especially if you shoot with a mix of disparate existing artificial light sources at night, you have a very similar problem. Shooting raw somewhat moves this problem to the development stage, but it remains a challenge: balance for one light source and you make the others look weird. Yet (and this is a paradox not present in deep space photography), astoundingly, the same scene can look beautiful to the human eye!

In the end, it is always a subjective creative job that concerns your interpretation of light and what you want people to see.

HPsquared · 10h ago
I suppose the human visual system is already adapted to deal with the same problem.
jofer · 16h ago
These same things apply to satellite images of the Earth as well. Even when you have optical bands that roughly correspond to human eye sensitivity, they have a quite different response pattern. You're also often not working with those wavelength bands in the visualizations you make.

Scientific sensors want as "square" a spectral response as possible. That's quite different than human eye response. Getting a realistic RGB visualization from a sensor is very much an artform.

mystraline · 15h ago
The proper color of an image would be a multispectral radiograph, similar to a waterfall plot, for each point. Each FFT bin would be 100 GHz in size, and the range would be over 1000 THz. And in a way, that's what a color sensor is doing at the CCD level too - collapsing and averaging the radio energy it's susceptible to into a specific color.
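
For scale, that's a lot of data per pixel. Rough arithmetic, assuming a 4096x4096 sensor and 2 bytes per bin (both made-up numbers):

```python
# Rough size of a per-pixel spectrum with 100 GHz bins spanning 1000 THz.
bin_hz = 100e9
span_hz = 1000e12
bins_per_pixel = span_hz / bin_hz           # 10,000 bins
pixels = 4096 * 4096                        # assumed sensor size
bytes_per_bin = 2                           # assumed 16-bit samples
total_gb = bins_per_pixel * pixels * bytes_per_bin / 1e9
print(f"{bins_per_pixel:.0f} bins/pixel, ~{total_gb:.0f} GB per frame")
```
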
ekunazanu · 10h ago
> Because there’s a lot of overlap between the red and green cones, our brain subtracts some green from red, yielding this spectral response:

No, cones do not produce a negative response. The graph shows the intensity of the primaries required to recreate the spectral colour at each wavelength. A negative value means that primary had to be added to the spectral colour being matched, rather than to the mix of the other primaries.

https://en.wikipedia.org/wiki/CIE_1931_color_space#Color_mat...

rf15 · 9h ago
> No, cones do not produce a negative response.

not what was claimed at all...

ekunazanu · 1h ago
>> yielding this spectral response: [graph with negative values]

That's what I gathered from spectral response. Usually spectral response in this context refers to the responsivity of the cones. Even when accounting for 'brain subtracting green from red' (which I assume comes from the opponent process theory) the following graph has nothing to do with it. The captions too read 'Yes, this results in red having negative sensitivity @500 nm', implying the red (L) cones have a negative sensitivity to cyans — which, again, is not really the case.

echoangle · 6h ago
That's not an inherent property of deep space photos though, just of the sensors that are commonly used. There's no physical reason you couldn't build a telescope with a human response curve. It just maybe doesn't make a lot of sense from a science standpoint.
hliyan · 15h ago
I still haven't forgiven whoever made Voyager's first images of Jupiter's moon Io bright red and yellow, and the Saturnian moon Enceladus green.
ianburrell · 15h ago
Neptune was shown as deep blue for a long time, but it is really a similar color as Uranus, a pale greenish-blue.
jpizagno · 10h ago
As a former astronomer, this was a great post. (The website can use some post-90s styling however :> )
indy · 8h ago
That aesthetic is how you know you're on a good astronomy site
bhouston · 17h ago
> Many other cameras, particularly those with aggressive UV-IR cut filters, underespond to H-a, resulting in dim and blueish nebula. Often people rip out those filters (astro-modification), but this usually results in the camera overresponding instead.

Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:

https://www.zwoastro.com/product-category/cameras/dso_cooled...

They also generally do not use sensors that have Bayer filters. This also screws things up.

Instead they use monochromatic sensors with narrowband filters (either one band or multiple) over them keyed to specific celestial emissions. The reason for this is that it gets rid of light pollution that is extensive and bumps up the signal to noise for the celestial items, especially the small faint details. Stuff like this:

https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...

https://telescopescanada.ca/products/zwo-duo-band-filter

Often these are combined with a true color capture (or individual RGBL narrowband) just to get the stars coloured properly.

Almost everything you see in high-end astrophotography is false color, because they map these individual narrowband captures on the monochrome sensors to interesting colours, often spending a lot of time manipulating the individual channels.

This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/

The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

Hubble pictures were famously coloured in a particular way that has a formal name:

https://www.astronomymark.com/hubble_palette.htm

(My shots: https://app.astrobin.com/u/bhouston#gallery)

recipe19 · 16h ago
What you're describing is the domain of a very, very small number of hobbyists with very deep pockets (plus various govt-funded entities).

The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.

bhouston · 16h ago
> What you're describing is the domain of a very, very small number of hobbyists with very deep pockets

Sort of. The telescope used for the Dumbbell nebula captures featured in the article is worth around $1000 and his mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, another $500.

There are quite a few people in the world doing this, upwards of 100K:

https://app.astrobin.com/search

Various PixInsight videos have +100K views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...

Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...

looofooo0 · 13h ago
Some even scratch off the Bayer pattern of old cameras.
tecleandor · 14h ago
You don't need very big pockets for that.

Today you can find very affordable monochromatic astrophotography cameras, and you can also modify cheap DSLR cameras or even compact cameras to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter after that (like an IR or UV band-pass).

I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.

Some cameras are very easy, some very difficult, so better check some tutorials first before buying a camera. And there are companies doing the conversion for you for a few hundred dollars (probably 300 or 400).

looofooo0 · 13h ago
You can even do the conversion diy.
tecleandor · 9h ago
Yep. I did both myself, as I was using old cameras that I had hanging around and if I sent them for conversion it would be more expensive than the cost of the camera.

Conversions done in places like Kolari or Spencer run about $300-500 depending on the camera model.

If I were to buy a brand new A7 IV or something like that, I would of course ask one of those shops to do it for me.

tomrod · 15h ago
And the entire earth observation industry, which doesn't look the same way but uses the same base tech stack.
verandaguy · 17h ago

> astrophotographers do not use cameras with UV-IR cut filters at all

I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do.

Then there are things like the Nikon D810A, which remove the UV-IR filter from the factory (but IIRC retain the Bayer filter).

bhouston · 16h ago
My recommendation, as someone who started with a DSLR and then modded it to remove the UV-IR filter: I would have been better off just skipping to a beginner cooled mono astrophotography camera, like the ASI533MM Pro. It is night and day difference in terms of quality at roughly the same cost, and it automates much better.

A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.

schoen · 15h ago
> It is night and day difference

Particularly high praise in astronomy!

verandaguy · 16h ago
How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?
bhouston · 51m ago
> How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?

Essentially yes. To get faint details in astrophotography, you actually capture a series of images through each filter, with long exposure times (say 3 minutes per capture) and a total capture time per filter measured in hours. You then star-align everything, integrate the captures for each filter into a single frame to reduce noise and boost signal, then comp them together.
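
A minimal sketch of that integrate-then-comp step, assuming the subframes are already calibrated and star-aligned and loaded as arrays (real stacking adds darks/flats, weighting, and better rejection):

```python
import numpy as np

def integrate(frames, sigma=3.0):
    """Sigma-clipped mean of aligned subframes (frames: N x H x W array)."""
    frames = np.asarray(frames, dtype=np.float64)
    med = np.median(frames, axis=0)
    std = np.std(frames, axis=0) + 1e-9
    keep = np.abs(frames - med) < sigma * std   # reject outliers (satellites, hot pixels)
    return np.sum(frames * keep, axis=0) / np.maximum(keep.sum(axis=0), 1)

# Hypothetical per-filter subframe stacks:
# master_r = integrate(r_subframes)
# master_g = integrate(g_subframes)
# master_b = integrate(b_subframes)
# rgb = np.stack([master_r, master_g, master_b], axis=-1)  # then stretch/comp
```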

gibybo · 15h ago
Yes, and you would almost certainly want to automate it with a filter wheel that changes the filters for you on a schedule. However, a key advantage of a mono camera is that you don't have to limit yourself to RGB filters. You can use some other set of filters better suited for the object you are capturing and map them back to RGB in software. This is most commonly done with narrowband filters for Hydrogen, Sulfur and Oxygen which allow you to see more detail in many deep space objects and cut out most of the light pollution that would otherwise get in your way.
dheera · 16h ago
It's worth noting that many NASA images use the "HSO" palette which is false color imagery. In particular the sulfur (S) and hydrogen (H) lines are both red to the human eye, so NASA assigns them to different colors (hydrogen->red, sulfur->green, oxygen->blue) for interpretability.
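
The mapping itself is just channel assignment plus a stretch. A sketch of the idea, with arbitrary stretch parameters and the palette described above (hydrogen->red, sulfur->green, oxygen->blue):

```python
import numpy as np

def narrowband_to_rgb(h_alpha, s_ii, o_iii):
    """Map aligned, normalized (0..1) narrowband master frames to RGB.

    Uses an asinh stretch, which is just one common choice; the palette
    assignment here is H-alpha -> red, SII -> green, OIII -> blue.
    """
    stretch = lambda x: np.arcsinh(10 * x) / np.arcsinh(10)
    return np.stack([stretch(h_alpha), stretch(s_ii), stretch(o_iii)], axis=-1)
```
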
Aaargh20318 · 6h ago
Amateur astrophotographers, like myself, also often use the HSO palette.

The nice thing about deep space objects is that they don't change at a rate that makes any visible difference over timespans relevant to humans. This means that you can do very, very long exposures. It also means that you don't need to capture all colours at once.

The linked article talks about using a color camera with a bayer matrix that uses dyed glass to get color data and how the filters don't filter out light in the 800-1000nm range.

Lots of amateur astrophotographers don't use color cameras but monochrome cameras in combination with separate filters. These filters are of much higher quality than the dye-based filters used on camera sensors. Instead of dyes they use interference filters[1]. These filters do not have the same issue as described in the article. For example, the LRGB filter set I use only lets through a 100nm band for each color. Next to that you can use filters that only let through emissions from specific elements, like the HSO filters mentioned (Hydrogen-alpha, Sulphur-II, Oxygen-III). In my case I have filters with a bandwidth of 3nm around the emission line of each of these elements.

The bigger issue with making these photos look like what they 'really look like' is that they really look like nothing at all. The light from these objects is so faint that even with a highly sensitive cooled camera you need many hours of exposure to grab enough data to make a nice picture. What they 'really look like' to the human eye is black.

The issue is worse when you add in the narrowband data, because that overlaps with the visible spectrum, but in that case you usually emphasise certain bands to stand out more in the final picture. It's not uncommon to combine the LRGB data with e.g. H-alpha data.

In the end the colours you choose are up to the photographer and their goals. Are you just going for a pretty picture or are you trying to visualize scientific data?

[1]: https://en.wikipedia.org/wiki/Interference_filter

dheera · 1h ago
I do a lot of amateur astrophotography too (https://www.instagram.com/dheeranet/), mainly with the objective of trying to show what things "would look like" if these deep space objects were much brighter; I maintain compositional and visual accuracy; all of these things could have theoretically been shot as-is as a single image if the objects were brighter.

> It's not uncommon to combine the LRGB data with e.g. H-alpha data.

Yeah I do this too. Typically I do HOO to stay as close to spectral accuracy as possible. I don't have an S filter.

It is, by the way, possible to shoot DSOs as a single image, but it's tricky. See the 3rd image in this series:

https://www.instagram.com/p/CKXdL1YndQi/?img_index=3

I want to try to fine tune a diffusion model to make it possible to actually shoot these single shot with good fidelity.

Aaargh20318 · 50m ago
Is that a single exposure? How did you prevent the foreground from blowing out on that, were you in a super dark location?

Unfortunately I'm in a Bortle 5 to 6 area so I'd need quite a bit of integration time to get a decent SnR. There aren't really any proper dark sites in western Europe.

vFunct · 17h ago
You can if you use hyper spectral imaging...
nothacking_ · 15h ago
The problem with hyperspectral imaging is that it ends up throwing away 99.9% of all the light that hits your camera. It's been done for the sun and some very bright nebulae, but really isn't practical for most of the stuff in space.
choonway · 16h ago
Probably will come out within the next 5 iPhone generations.

POC already out...

https://pmc.ncbi.nlm.nih.gov/articles/PMC8404918/

kragen · 16h ago
People have been making production hyperspectral sensors for decades, including hobbyists in garages; we're well beyond the proof-of-concept stage.
kurthr · 17h ago
What's the white point? Is it D65? Not when the sun isn't out.
klysm · 17h ago
I've always been confused by what the white point actually _means_. Since we are dealing with strictly emissive sources here, and not reflected sunlight, does the whitepoint even mean anything?
procflora · 1h ago
Don't forget my fav, the humble reflection nebula. Of course there are always emissive stars in the background of any image of one so the point is moot. But still, there are a few hundred! :D
esafak · 16h ago
In a scene lit overwhelmingly by one approximately Planckian light source, the white point is the color of the closest Planckian light source.

If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
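
To make "Planckian" concrete: the candidate white points are blackbody spectra given by Planck's law. A quick sketch comparing the relative spectral radiance of a warm source versus a daylight-ish one (pure physics, no CIE machinery):

```python
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law)."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1)

for temp in (3000, 6500):
    blue = planck(450e-9, temp)
    red = planck(650e-9, temp)
    # ~0.24 at 3000 K (reddish) vs ~1.35 at 6500 K (bluish-white)
    print(f"T = {temp} K: blue/red radiance ratio = {blue / red:.2f}")
```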

queuebert · 4h ago
As an aside, I think you are referring to black-body sources, which are described by Planck's law. Stars dgaf about color spaces of computer monitors. They are relatively close to black bodies as thermal emitters, though, at least on the main sequence and +/- some spectral lines.

We physicists never use the term Planckian for thermal black bodies. That adjective would be used in quantum mechanics, though, for very small things.

esafak · 2h ago
Yes, I am. Planckian radiation is the term of art in color science, prescribed by its standards body, the CIE. https://files.cie.co.at/CIE_TN_013_2022.pdf

To understand what I mean by "closest Planckian light source" see https://en.wikipedia.org/wiki/Planckian_locus

klysm · 14h ago
So in this case there is no sensible white point since there is no illuminant right?
esafak · 14h ago
I'm not sure what case we're talking about, but if it emits visible light it is an illuminant.
klysm · 3h ago
Deep space photos - I don’t think there is an illuminant at all
bhickey · 17h ago
The tiniest of corrections: Ha is 656.28nm not 565.
execat · 16h ago
At risk of going off-topic, when I see comments like these, I wonder how the comment author comes up with these corrections (cross-checked, the comment is in fact true)

Did you have the number memorized or did you do a fact check on each of the numbers?

kragen · 16h ago
I didn't know the number was wrong, but something about the statement seemed very wrong, because the 565nm number is only 10nm away from 555nm, conventionally considered the absolute maximum wavelength of human visual sensitivity (683lm/W). And you can see that in the photopic sensitivity curves in the rest of the article: both red and green cones respond strongly to light all around that wavelength. So it seemed implausible that 565nm would be nearly invisible.

But I didn't know whether Ha was actually highly visible or just had a different wavelength. I didn't know 683lm/W either, and I wasn't exactly sure that 555nm was the peak, but I knew it was somewhere in the mid-500s. If I'd been less of a lazy bitch I would have fact-checked that statement to see where the error was.

kragen · 11h ago
I see that there's a [dead] reply by the kind of person who thinks "tryhard" is an insult and has applied it to me.

When I compare people I know about who tried hard to the people I know about who didn't try hard, literally every single person I would want to be like is one of the people who tried hard. I'm unable to imagine what it would be like to want to be like the other group.

I mean, I don't want to be like Michael Jordan, but I can imagine wanting to be like him, and in part this is because specifically what he's famous for is succeeding at something very difficult that he had to try unbelievably hard at.

So I'm delighted to declare myself a tryhard, or at least an aspiring tryhard.

Completely by coincidence, when I saw the tryhard comment, I happened to be reading https://www.scattered-thoughts.net/writing/things-unlearned/:

> People don't really say this [that intelligence trumps expertise] explicitly, but it's conveyed by all the folk tales of the young college dropout prodigies revolutionizing everything they touch. They have some magic juice that makes them good at everything.

> If I think that's how the world works, then it's easy to completely fail to learn. Whatever the mainstream is doing is ancient history, whatever they're working on I could do it in a weekend, and there's no point listening to anyone with more than 3 years experience because they're out of touch and lost in the past.

> Similarly for programmers who go into other fields expecting to revolutionize everything with the application of software, without needing to spend any time learning about the actual problem or listening to the needs of the people who have been pushing the boulder up the hill for the last half century.

> This error dovetails neatly with many of the previous errors above eg [sic] no point learning how existing query planners work if I'm smart enough to arrive at a better answer from a standing start, no point learning to use a debugger if I'm smart enough to find the bug in my head.

> But a decade of mistakes later I find that I arrived at more or the less the point that I could have started at if I was willing to believe that the accumulated wisdom of tens of thousands of programmers over half a century was worth paying attention to.

> And the older I get, the more I notice that the people who actually make progress are the ones who are keenly aware of the bounds of their own knowledge, are intensely curious about the gaps and are willing to learn from others and from the past. One exemplar of this is Julia Evans, whose blog archives are a clear demonstration of how curiosity and lack of ego is a fast path to expertise.

bhickey · 15h ago
In this case I coincidentally spent a few hundred hours of hobby time over the last year designing hydrogen alpha telescopes.
nothacking_ · 15h ago
Fixed.
system2 · 17h ago
Isn't this why they always use the term "artist's impression" when they are colored?
nwallin · 16h ago
No.

When you see "artist's impression" in a news article about space, what you're looking at is a painting or drawing created from whole cloth by an artist.

This article is about how sensors turn signals into images. When you take pictures with a 'normal' camera, we've designed them so that if you take certain steps, the image on your screen looks the same as what it would look like in real life with no camera or monitor. This article is stating that with the cameras and filters used for telescopes, that same process doesn't really work. We use special filters to measure specific spectral properties of an astronomical object. This gives good scientific information; however, it means that in many cases it's impossible to reconstruct what an astronomical object would really look like if our eyes were more sensitive and we looked at it.

okanat · 17h ago
There are different reasons for that. Things like black holes are really hard to observe even in other parts of the spectrum. Same for other objects like planets. So the drawings are based on hypothetical expectations from simulations rather than direct observations.

Many observations come from scientific cameras rather than the actual visible-spectrum cameras discussed in TFA. They are not artist's impressions like the first case. They will have a completely different view of the object, so any visible-light prediction will involve some guessing, and the final picture will not be 100% what you would see.

recipe19 · 17h ago
I think that term is reserved mostly for actual artwork (renderings, paintings, etc).

Some deep-space astronomy pictures are in completely made-up color, often because they're taken at wavelengths different than visible light and then color-mapped to look pretty.

But the point here is even if you're taking images with a regular camera pointed at the sky, it's pretty much impossible to match "reality".

monkeyelite · 15h ago
Disappointing that most space photos are made by mapping an analog input onto a gradient and that this isn’t stated more directly.